Pi Wars ‘Somewhere over the rainbow’ part one


Somewhere Over the Rainbow is one of the Pi Wars autonomous challenges and, in my opinion, the hardest if you are attempting it using method 1. The most points are available for attempting the challenge using method 1. For teams attempting this challenge using method 2, here is a large hint: look up the maze-solving “left hand” rule. ;)

http://piwars.org/2018-competition/challenges/somewhere-over-the-rainbow/
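For the curious, the left-hand rule is easy to sketch. Below is a minimal illustration on a grid maze; the maze layout, function name and coordinate convention are my own invention for this example, not anything from the Pi Wars code.

```python
def solve_left_hand(maze, start, goal, heading=(0, 1), max_steps=1000):
    """Follow the left-hand wall of a grid maze ('#' = wall, ' ' = open).

    maze is a list of strings indexed as maze[row][col]; heading is a
    (row, col) step. Returns the path as a list of cells, or None.
    """
    x, y = start
    dx, dy = heading
    path = [(x, y)]
    for _ in range(max_steps):
        if (x, y) == goal:
            return path
        # Candidate headings in preferred order: left, straight, right, back.
        left = (-dy, dx)
        right = (dy, -dx)
        back = (-dx, -dy)
        for ndx, ndy in (left, (dx, dy), right, back):
            nx, ny = x + ndx, y + ndy
            if maze[nx][ny] != '#':
                x, y, dx, dy = nx, ny, ndx, ndy
                path.append((x, y))
                break
    return None
```

Because the robot always keeps one hand on the wall, this reaches the exit of any simply connected maze, which is what makes it attractive for method 2.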

After the fun and games of getting OpenCV to install, I looked at modifying the PiBorg script to work with different colours. After modifying and playing with the code, I concluded that this was not going to work for me. My requirements for my ‘Somewhere Over The Rainbow’ challenge are as follows:

1) It had to be easy and quick to learn new colours
2) It had to be easy for me to program in Python
3) It had to be fast, as in real time

I looked around for answers to the above requirements and remembered that, a few years ago, I had backed a Kickstarter for a colour camera sensor. I went searching through the “one day, I will use this and justify buying it” box and found it. The Kickstarter was for the Pixy camera, a colour sensor :D



https://www.kickstarter.com/projects/254449872/pixy-cmucam5-a-fast-easy-to-use-vision-sensor

Did this wise investment pay off? Let’s have a look at each requirement in turn:

1) You can teach the Pixy by pressing a button and showing it the object you wish to sense. This was OK, but a little fiddly. The next option was to install the Pixy camera software on my laptop. This was more like it: a real-time image of what the camera was seeing, and you could assign colours to one of the seven colour signatures. The software also lets you change the configuration of the Pixy. Apart from needing a PC/Mac to run the software, this was a win.
2) With a quick look at the Pixy’s wiki, I found there was a Python library with examples, and after a quick look at an example, I understood it!
3) In all my tests with the Pixy software, it was able to keep up with everything without missing a beat.

The Pixy meets my requirements for the challenge. :D

Time to mount it on a robot. I have not yet started building my Pi Wars robot, as I have not finished the design. However, I do have access to one of our robot kits, so I decided to use an ‘East Coast Customs’ robot in Atomic Green. I designed a plate to mount the camera to the front of ‘Rover’ (he is called Rover because he chases balls), which I cut out on my next visit to Makespace, Cambridge’s hack/make space.

 Ball !!!!!

The next step was to get Rover to move when the Pixy saw a learnt colour. This was very easy, and was completed in a couple of lines and a few minutes.
Over the next few days the code evolved until Rover was able to turn to centre the ball and move forward until the ball was a set size.
The Pixy camera made this easy for me: each learnt colour it finds is output as a block with the following information.

The colour signature number, its location and its size.

With this information, the code can turn the robot towards the colour it is hunting, centre it, and move forward until the ball is in the centre and the correct distance away from Rover. I have tested the code with a ball just over the maximum distance between balls, just over 175 cm. The ball at that distance is only a few pixels wide!
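As a side note, the block width also gives a rough distance estimate. The sketch below assumes a simple pinhole camera model, the Pixy's roughly 75 degree horizontal field of view over its 320-pixel-wide image, and a hypothetical 10 cm ball; none of these constants come from the code above, so treat them as placeholders to tune.

```python
import math

FOV_DEG = 75.0           # approximate Pixy horizontal field of view
IMAGE_WIDTH_PX = 320     # Pixy block x coordinates run 0..319
BALL_DIAMETER_CM = 10.0  # hypothetical ball size -- measure yours

# Focal length in pixels, derived from the field of view.
FOCAL_PX = (IMAGE_WIDTH_PX / 2) / math.tan(math.radians(FOV_DEG / 2))

def estimate_distance_cm(block_width_px):
    """Distance at which a BALL_DIAMETER_CM ball appears block_width_px wide."""
    return BALL_DIAMETER_CM * FOCAL_PX / block_width_px
```

In practice the code below sidesteps the maths entirely and just stops when the block is wider than a threshold, which amounts to the same thing with less arithmetic.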

long test


if blocks[index].signature == 4 and (blocks[index].width > 5 and blocks[index].height > 5):
    ball = True

The above line checks that the found signature is the required signature and meets the size requirements, and sets the ball flag to True if so.
The following line sets the position variable to the x coordinate of the colour object:

position = blocks[index].x

The next block of code sets the motor speeds, depending on the position of the colour signature:

if ball:
    print('Ball!')
    position = blocks[index].x
    if position > 165:
        print("**** right ****")
        power_right = -75
        power_left = 100
    elif position < 125:
        print("**** left ****")
        power_right = 100
        power_left = -75
    else:
        print("**** forward ****")
        power_right = 100
        power_left = 100
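The fixed left/right/forward bands above work, but an alternative worth trying is to scale the turn with the ball's offset from the image centre, i.e. a simple proportional controller. The centre value, base speed and gain below are illustrative numbers of my own, not tuned values from this robot.

```python
IMAGE_CENTRE_X = 145   # midpoint of the 125..165 dead band used above
BASE_SPEED = 100
KP = 1.2               # turn gain, to be tuned by experiment

def motor_powers(position):
    """Return (power_left, power_right), each clamped to -100..100."""
    error = position - IMAGE_CENTRE_X   # positive: ball is to the right
    turn = KP * error
    power_left = max(-100, min(100, BASE_SPEED + turn))
    power_right = max(-100, min(100, BASE_SPEED - turn))
    return power_left, power_right
```

The pay-off is smoother tracking: small offsets produce gentle corrections instead of a full-power pivot.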

Conclusion

As you can see, the use of the Pixy has made the hard task of identifying a ball and its position like using any other simple sensor, and has improved my odds of finishing this challenge. The Pixy is available to buy from around £60. This may seem a lot of money, but for the amount of headache it has saved, and will save, it is worth every penny. My current code is below; feel free to use it.

Good Luck

Brian

from pixy import *
from ctypes import *
from explorerhat import motor

# Pixy Python SWIG get blocks example #

print("Pixy Python SWIG Example -- Get Blocks")

# Initialize Pixy Interpreter thread #
pixy_init()
print("START")

class Blocks(Structure):
    _fields_ = [("type", c_uint),
                ("signature", c_uint),
                ("x", c_uint),
                ("y", c_uint),
                ("width", c_uint),
                ("height", c_uint),
                ("angle", c_uint)]

blocks = BlockArray(100)
frame = 0

# Wait for blocks #
while True:

    ball = False
    ballwidth = 0
    ballheight = 0
    count = pixy_get_blocks(10, blocks)
    if count > 0:
        # Blocks found #
        print('frame %3d:' % frame)
        frame = frame + 1
        for index in range(0, count):
            print('[BLOCK_TYPE=%d SIG=%d X=%3d Y=%3d WIDTH=%3d HEIGHT=%3d]' %
                  (blocks[index].type, blocks[index].signature,
                   blocks[index].x, blocks[index].y,
                   blocks[index].width, blocks[index].height))
            # Signature 4 is the learnt ball colour; ignore tiny blocks (noise)
            if blocks[index].signature == 4 and (blocks[index].width > 5 and blocks[index].height > 5):
                ball = True
                ballwidth = blocks[index].width
                ballheight = blocks[index].height

    if ball:

        print('Ball!')
        position = blocks[index].x
        if position > 165:
            print("**** right ****")
            power_right = -75
            power_left = 100
        elif position < 125:
            print("**** left ****")
            power_right = 100
            power_left = -75
        else:
            print("**** forward ****")
            power_right = 100
            power_left = 100

        print(position)

        # Ball fills the frame -- close enough, stop #
        if ballwidth > 100:
            print("**** STOP ****")
            power_left = 0
            power_right = 0

        print("left: " + str(power_left) + " right: " + str(power_right))

        motor.one.speed(-power_left)
        motor.two.speed(power_right)
    else:
        motor.one.speed(0)
        motor.two.speed(0)
