Inspiring, Machine learning, Raspberry Pi, Robotics

Raspberry Pi Spotter



What is it?

Raspberry Pi Spotter recognizes the features in a given image. It analyses images taken from a webcam connected to the Raspberry Pi and tells us what is in them. Please see the video of Jetpac in my previous post.

Why is it important?

The importance of this software is that it does all of this image analysis offline. One might wonder how the Raspberry Pi, with its little CPU power, can do this. As I explained in the previous post, the Pi does it through its powerful GPU.

I thought it would be really cool if I could implement it on a simple robotic platform: a robot that goes around and keeps informing us of what it sees in its surroundings, just like in the video shown by Jetpac.

What did I do?


So I constructed a basic robot from an RC car and connected a webcam to the Pi (the construction details and better pics will follow in the next post). Then I installed WebIOPi to get the video stream and implemented the browser interface shown in the pic above. Now the robot can be controlled from any device with a browser. I will upload the demonstration video soon.

How does it work?

The arrow buttons on the interface control the robot's movement. The video is streamed just like in the cambot developed by Eric / trouch. The yellow button above the video stream invokes the DeepBelief software.

When we click on this spotter button, WebIOPi looks for the latest image captured by the webcam and DeepBelief identifies the features in that image. The results are then displayed in the text box above the spotter button.
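The spotter step can be sketched roughly as below. This is only an illustration, not my exact code: the image folder, the `*.jpg` pattern, and the assumption that the DeepBelief example binary takes an image path as its argument are all assumptions you may need to adapt. In WebIOPi, a function like `spot` would be exposed as a macro and wired to the yellow button.

```python
import glob
import os
import subprocess


def latest_image(folder):
    """Return the most recently modified .jpg in folder, or None."""
    images = glob.glob(os.path.join(folder, "*.jpg"))
    if not images:
        return None
    return max(images, key=os.path.getmtime)


def spot(folder="/home/pi/deepbelief", binary="./deepbelief"):
    """Classify the newest webcam frame with the DeepBelief binary.

    Assumes (unverified) that the example binary accepts an image
    path and prints label/probability lines on stdout.
    """
    image = latest_image(folder)
    if image is None:
        return "no image captured yet"
    result = subprocess.run([binary, image],
                            capture_output=True, text=True)
    return result.stdout
```

The browser callback then just drops the returned text into the box above the button.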

How to install it?

In my previous post, I explained how to install the DeepBelief network on the Raspberry Pi. You can find the instructions to install WebIOPi here.

I have uploaded my code to GitHub. Note that the code is really rough at the moment. It will work even if one does not have a robot; in that case the arrow buttons simply do nothing, but the video feed will still be analysed as in the original Jetpac video.

First, download the script and index.html files into the folder where the DeepBelief software was installed previously. We also need to configure the settings of motion so that it streams the images directly into that folder. I will write detailed instructions with videos when I get more time.
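For reference, pointing motion at that folder is essentially a one-line change in `motion.conf`. The path below is just an example and should match wherever DeepBelief is installed; the resolution and framerate values are likewise only suggestions for a low-powered Pi:

```
# motion.conf fragment (example values)
# write captured frames into the DeepBelief folder
target_dir /home/pi/deepbelief

# modest resolution and framerate keep the Pi responsive
width 320
height 240
framerate 2
```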


The results are not that great. One reason might be that I am not looking at real-life objects. Anyway, I am not concerned about the quality of the results yet. Also, my coding skills are very limited; I hope people with better knowledge of Python, in particular of WebIOPi and DeepBelief, can make it work even better. One concern is that my Pi has only 256 MB of RAM, so it usually keeps freezing under load. Maybe a 512 MB Pi will give a smoother experience.


I would like to thank Pete Warden (CTO of Jetpac) and Toshi Bass (in the WebIOPi Google group) for helping me throughout this project. I am also thankful to the folks at Stack Overflow for their help with my Python questions.


