Tuesday 10 March 2015

AgoBo The Hackable Raspberry Pi Robot - now with emotions





My wife gave me a 4Tronix AgoBo robot kit for Christmas (at my request). I built it a few weeks ago but didn't really have time to do anything with it.

The AgoBo is a Raspberry Pi A+ based robot kit. I also ordered the PlusPlate, which adds a Neopixel and lots of prototyping space on a board that mounts above the RPi. It is a really good, affordable kit that can be customised very easily, especially with the PlusPlate. It is this customisation that really attracted me to the AgoBo in the first place.

When the robot arrived I thought that the ultrasonic sensor looked like a pair of eyes, but AgoBo was lacking a mouth. One evening I was rooting through a box of electronic bits I had bought for RPi projects and found an 8x8 LED matrix.


I had seen plenty of robots that use these as eyes and thought that this could work. However, with the robot being so small, the matrix was far too large. I had another dig in the box and found a more suitably sized replacement.


The 5011AS seven-segment display fitted just below the ultrasonic sensor, with the pins above and below the main board. Aligned horizontally, the segments could be used to make a smile or a sad face by lighting the correct ones.

This idea was put on the back burner for a couple of weeks whilst normal life got in the way, but I kept thinking about how to mount the module effectively under the board. When I was able to experiment with the robot again (I finally loaded the example software and tried out the supplied Python scripts) I couldn't resist having a go at the mouth idea.

I haven't found time to solder the header onto the PlusPlate yet, but I wanted to get the mouth working, so I grabbed a breadboard and some cables to try it out before sorting everything out properly.



I had a ten-way female-to-female ribbon cable, so I divided it into two (five cables in each) to connect the ten pins of the display. With the ends of the cables connected there was very little room between the pins, but with a little Blu-Tack the display mounted nicely, with two pins each side below the board and three above. To keep things tidy I separated the first part of the cable for a short length and then wrapped it up over the RPi and under the PlusPlate (with a little Blu-Tack, of course).







I then grabbed a few current-limiting resistors, connected the cables to the breadboard, and connected the other side to the header I had fitted to the main board (in preparation for connecting the PlusPlate).


This is where I ran into my first problem. Limited time and a failure to read the instructions led to an error in the connections. Instead of looking at the instructions, I looked at the numbers on the top of the PlusPlate and, reading down from the top, started using the first available pins. Unfortunately, these pins are already in use by AgoBo, so there was a bit of a conflict when I tried to use them to run the mouth.

So, looking back at the instructions, I made a list of the pins that were in use, looked again at the PlusPlate for available pins, and moved the connections to pins that AgoBo wasn't already using.

Once I had the connections all set up (correctly this time) I needed to write the code to run the mouth and control the facial expressions. I decided I wanted a smile (obviously, what's cuter than a smiling robot?), a sad face, a confused face and an open mouth. This time consulting the instructions first (the data sheet from Jameco), I drew a little diagram of which pin controlled which segment of the display and worked out a little table of which segments should be lit for each facial expression.
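Sketched in Python, the table boiled down to something like this. The segment letters are the standard a-g labels from the data sheet; exactly which segments make which face depends on how the display is mounted, so treat the choices below as illustrative rather than my exact mapping:

    # Expression table: standard seven-segment labels a-g. Which
    # segments form which face depends on the mounting orientation,
    # so these choices are illustrative rather than definitive.
    EXPRESSIONS = {
        'smile':    ('c', 'd', 'e'),                 # lower curve
        'sad':      ('a', 'b', 'f'),                 # upper curve
        'oh':       ('a', 'b', 'c', 'd', 'e', 'f'),  # full ring, open mouth
        'confused': ('b', 'e', 'g'),                 # lopsided zig-zag
    }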



With this organised I set up a Python library (mouth.py) to handle the facial expressions and then a quick script to test them. The test script (mouthtest.py) shows each expression I have set up so far. I am really pleased with the smile, the sad face and the 'oh'; I am not so sure about the confused face, so I may not use it very much. These scripts will be available from my AgoBo Github fork here.
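As a rough sketch of what the library looks like, assuming RPi.GPIO and placeholder pin numbers (use whichever free pins you identified on the PlusPlate):

    # mouth.py -- a minimal sketch of the expression library.
    import RPi.GPIO as GPIO

    # Hypothetical wiring: segment letter -> GPIO (BCM) number.
    SEGMENTS = {'a': 21, 'b': 26, 'c': 20, 'd': 19, 'e': 16, 'f': 13, 'g': 6}

    # Expression table as worked out above (segments given as strings).
    EXPRESSIONS = {'smile': 'cde', 'sad': 'abf',
                   'oh': 'abcdef', 'confused': 'beg'}

    def setup():
        GPIO.setmode(GPIO.BCM)   # see the numbering gotcha further down
        for pin in SEGMENTS.values():
            GPIO.setup(pin, GPIO.OUT, initial=GPIO.LOW)

    def show(expression):
        # Light only the segments listed for the requested expression.
        lit = EXPRESSIONS[expression]
        for segment, pin in SEGMENTS.items():
            GPIO.output(pin, GPIO.HIGH if segment in lit else GPIO.LOW)

    def clear():
        for pin in SEGMENTS.values():
            GPIO.output(pin, GPIO.LOW)

The test script then just cycles through the faces:

    # mouthtest.py -- show each expression for a couple of seconds.
    import time
    import mouth

    mouth.setup()
    for face in ('smile', 'sad', 'oh', 'confused'):
        mouth.show(face)
        time.sleep(2)
    mouth.clear()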





With the mouth working I wanted to work the expressions into the normal running program for AgoBo. I had previously written a quick script for him to avoid objects using the ultrasonic sensor, so I used this as a starting point.

I ran into a small issue here, as I had set up the mouth library using GPIO (BCM) numbers while the AgoBo library uses board (physical pin) numbers. After a little head scratching (I am still unsure why, in an error state, the face always seems to display 'oh') I spotted the error and changed the mouth library to match the AgoBo library, and now AgoBo will avoid objects whilst displaying his emotions.
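For anyone hitting the same thing: RPi.GPIO only allows one numbering scheme per program, so the mouth pins had to be rewritten as physical pin numbers to match. A sketch of the fix, translating the hypothetical BCM pins from earlier:

    # RPi.GPIO allows one numbering scheme per program; the AgoBo
    # library uses physical (BOARD) numbers, so mouth.py must too.
    import RPi.GPIO as GPIO

    GPIO.setmode(GPIO.BOARD)   # match the AgoBo library

    # The hypothetical BCM pins from earlier as physical pins:
    # BCM 21->40, 26->37, 20->38, 19->35, 16->36, 13->33, 6->31.
    SEGMENTS = {'a': 40, 'b': 37, 'c': 38, 'd': 35, 'e': 36, 'f': 33, 'g': 31}
    for pin in SEGMENTS.values():
        GPIO.setup(pin, GPIO.OUT, initial=GPIO.LOW)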

Currently he is happy moving forward until he finds an object. This makes him sad and he turns 90 degrees. If he can move forward he is happy again. If instead there is another object in his path he is shocked/cross ('oh') and turns 180 degrees. Again, if the way is clear he is happy again and proceeds. However, if there is another object he becomes confused (or his face does), then turns 90 degrees (away from the initial object) and proceeds on his way, happy again.
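That logic boils down to something like the sketch below. The agobo function names (init, forward, spinRight, spinLeft, stop, getDistance) follow the usual 4Tronix library style but are assumptions from memory, and the distance threshold and spin timings are guesses that would need calibrating:

    # Sketch of the avoidance behaviour with expressions.
    import time
    import agobo   # 4Tronix AgoBo library -- call names assumed
    import mouth   # the expression library sketched earlier

    CLEAR_CM = 10  # hypothetical "path is clear" distance threshold

    def blocked():
        return agobo.getDistance() < CLEAR_CM

    def turn(degrees, left=False):
        (agobo.spinLeft if left else agobo.spinRight)(60)
        time.sleep(degrees / 180.0)   # crude timed spin
        agobo.stop()

    agobo.init()
    mouth.setup()
    try:
        while True:
            mouth.show('smile')       # happy while the way is clear
            agobo.forward(60)
            if blocked():
                agobo.stop()
                mouth.show('sad')     # first obstacle: sad, turn 90
                turn(90)
                if blocked():
                    mouth.show('oh')  # still blocked: shocked, turn 180
                    turn(180)
                    if blocked():
                        # boxed in: confused, turn 90 away from the
                        # initial object and carry on
                        mouth.show('confused')
                        turn(90, left=True)
            time.sleep(0.1)
    finally:
        agobo.stop()
        mouth.clear()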



