ARDUINO, Raspberry Pi, Robotics

My experiments with a stepper motor

It's a different game

At the start of the project, I thought it would be fairly straightforward to hook a NEMA 17 stepper motor to an Arduino or Raspberry Pi. After all, I had already connected a 28BYJ-48 stepper motor to both a Raspberry Pi and an Arduino without any fuss. The beauty of the 28BYJ-48 is that it comes with its own driver board, all for just $3 (thanks to China). That gave me confidence: even if I burned it, I wouldn't mind.


But..

now I had a NEMA 17 motor, the kind mostly used in 3D printers and CNC machines. It costs around $40 (when bought from a well-known vendor, not from eBay). The problem is choosing the correct driver board, as this motor does not come with its own driver. Moreover, an Arduino or Raspberry Pi cannot directly supply enough juice to run these motors.

Choosing the correct driver board

After stumbling upon a few options, I found that Pololu's A4988 driver was the most appropriate (thanks to Google). I know people run NEMA 17 motors with L293D chips (I used them a lot in my robotics projects to run DC motors), but I was really worried about my shiny (and costly) new motor, and I definitely didn't want to burn my NEMA 17. So I finally ordered an A4988 driver board.

And I fried it..

I followed an excellent connection diagram (and the Arduino code given there) from this post on the Arduino forums; the same diagram is posted on almost every hacker blog and forum. I powered my board with a 12V wall adapter, thinking I was giving the motor enough power. But the motor did not rotate at all. I checked my connections, rechecked… and rechecked.

a4988 diagram

 

First problem:

My 12V wall adapter does not output enough current. I don't know how I failed to check this detail. Anyhow, I replaced the power source with an 8 AA battery pack, and the motor rotated. I rejoiced and wanted permanent connections, instead of messing everything up on a breadboard with loose connections.

Second problem:

When I soldered everything onto a prototype board, for some odd reason I still can't explain, the motor did not rotate at all. I was frustrated; first sign of failure. I went back to the breadboard, abandoning the prototype board, and reconnected the motor pins and swapped connections quite a number of times. I hate my hacker mind for tempting me to do this.

Because..

There is a small warning that says we should never plug or unplug a stepper motor while the driver board is powered. I learnt this after carefully reading a few forum posts. What happens if you ignore the warning? Your driver board gets fried. And that's exactly what happened to me.

How I solved it:

I simply ordered a new board, which arrived in just two days, thanks to an eBay UK supplier. With the new driver, I took utmost care and connected everything according to the above circuit diagram. My motor is finally running.

But..

there was still a small problem: the motor vibrated while rotating, nowhere near smooth running. I Googled again, patiently. The suggested solution was to interchange the connections going to the motor. I checked the motor's spec sheet and the colour code of the NEMA 17 wires, and read quite a few forums. The result: interchanging the motor pin connections simply won't work. Back to square one. I suspected the Arduino code might be the problem.

Yes it is..

I switched to an excellent stepper motor library, AccelStepper (the fork maintained by the good folks at Adafruit). I installed it and it worked like a charm. Finally, my stepper motor runs smoothly. It took me almost four late nights to figure this out. I am ashamed to have spent so much time on such a simple task, but that is how a newbie learns.

Lessons learned:

  1. Never plug or unplug motor connections while the driver board is powered.
  2. Don't blindly use a high-voltage power source without checking its current supply capacity.
  3. An 8 AA battery pack works better than a fancy 9V battery.
  4. If you have already burned a driver board, that's alright. Don't waste time figuring out what went wrong. Just order a new board; it will save you lots of time, your sanity, and a few hairs on your head.
  5. Once you have confirmed that your motor works, use the AccelStepper library rather than raw Arduino code. It will make your motor (and your life) run smoothly.

Codes:

Simple Arduino code to test that the stepper motor runs:

//simple A4988 connection
//jumper RESET and SLEEP together
//connect VDD to Arduino 3.3V or 5V
//connect GND to Arduino GND (the GND next to VDD)
//connect 1A and 1B to stepper coil 1
//connect 2A and 2B to stepper coil 2
//connect VMOT to the motor power source (+ terminal)
//connect GND to the motor power source (- terminal)


int stp = 13;  //connect pin 13 to step
int dir = 12;  // connect pin 12 to dir
int a = 0;     //  gen counter

void setup() 
{                
  pinMode(stp, OUTPUT);
  pinMode(dir, OUTPUT);       
}


void loop() 
{
  if (a <  200)  //sweep 200 step in dir 1
   {
    a++;
    digitalWrite(stp, HIGH);   
    delay(10);               
    digitalWrite(stp, LOW);  
    delay(10);              
   }
  else 
   {
    digitalWrite(dir, HIGH);
    a++;
    digitalWrite(stp, HIGH);  
    delay(10);               
    digitalWrite(stp, LOW);  
    delay(10);
    
    if (a>400)    //sweep 200 in dir 2
     {
      a = 0;
      digitalWrite(dir, LOW);
     }
    }
}

 

Now, to get smooth rotation of the motor, I used the following code, which makes use of the AccelStepper library.


#include <AccelStepper.h>

AccelStepper Stepper1(1,13,12); //mode 1 = "external driver" (A4988); pin 13 = step, pin 12 = dir
int dir = 1; //used to switch direction

void setup() {
  Stepper1.setMaxSpeed(3000); //set max speed the motor will turn (steps/second)
  Stepper1.setAcceleration(13000); //set acceleration (steps/second^2)
}

void loop() {
  if(Stepper1.distanceToGo()==0){ //check if the motor has finished its last move
    Stepper1.move(1600*dir); //set next movement to 1600 steps (if dir is -1 it will move -1600 -> opposite direction)
    dir = dir*(-1); //negate dir to make the next movement go in opposite direction
    delay(1000); //wait 1 second
  }
  
  Stepper1.run(); //run the stepper; this must be called repeatedly to keep the stepper moving
}


Credits

http://forum.arduino.cc/index.php?PHPSESSID=c4am9ddu9ol3f8i14vu6o3aul7&topic=133894.msg1449404#msg1449404

http://electronics.stackexchange.com/questions/95238/stepper-motor-rotating-but-vibrating

http://www.linengineering.com/resources/wiring_connections.aspx

https://cdn-reichelt.de/documents/datenblatt/C300/QSH4218_manual.pdf

http://www.robotdigg.com/news/26/Reprap-Stepper-Motor

http://semiengineering.weebly.com/blog/stepper-driven-by-arduino-a4988

https://github.com/adafruit/AccelStepper

Uncategorized

Raspberry Pi Robot Construction

Today, I am going to share the details of my Raspberry Pi Robot!

I had the following objectives in mind at the beginning of the project:

1) Construction should be simple and inexpensive, with room for further development

2) It should have video streaming capabilities

3) I should be able to control the robot from any smart device, such as a mobile, desktop or tablet

4) I should not have to install any new software on the controlling device

I am pleased that, at the end of the project, I was able to achieve all these objectives.

In the following, I will give as many details as possible for the construction of my Raspberry Pi Robot!

Parts:

1) Raspberry Pi

2) SD-card

3) Rechargeable Battery for Pi

4) Rechargeable batteries to drive the Motors

5) RC-car

6) L293D IC

7) Bread board and some jumper wires

8) Wifi dongle

9) Webcam

 

Before going into details, I would like to show my Robot in action.

Construction:

I disassembled an old RC car and removed the original control circuit inside it. I just needed the bottom part of the car, with the motors and battery holder: this bottom part is our robot chassis. RC cars mostly have simple toy DC motors. The front motor is used for left-right steering, while the rear motor drives forward-backward movement.

I placed a breadboard on the chassis using foam glue tape. Then I placed the Raspberry Pi on this breadboard at the front end of the chassis, and an L293D chip on the board at the rear end.

In the gap between the breadboard and the front of the chassis, I placed the rechargeable battery and the video camera. That's it.

Circuit Connections:

Connections between RC-car and L293D

1) I followed the video below for the connections from the L293D to the motors.

2) Basically, each half of the chip controls one motor. The first DC motor of the RC car is connected to Pin 3 and Pin 6 (Output 1 and Output 2) of the L293D. The second motor is connected to Pin 11 and Pin 14 (Output 3 and Output 4).

3) Pin 4, Pin 5, Pin 12 and Pin 13 of the L293D (the ground pins) are connected to the ground line of the breadboard.

4) The positive wire from the battery holder of the RC car is connected to Pin 8 (VCC2, the motor supply) of the L293D, while the ground wire is connected to the ground line of the breadboard.

Connections between Raspberry Pi and l293D

Please note that I am referring to the GPIO (BCM) numbers on the Pi, not the physical pin numbers on the board.

L293D PIN2 = GPIO 9
L293D PIN7 = GPIO 10
L293D PIN1 = GPIO 11

L293D PIN10 = GPIO 23
L293D PIN15 = GPIO 24
L293D PIN9 = GPIO 25

L293D PIN16 = 5V pin on the Pi (VCC1, the logic supply)

Finally, connect  ground pin on Raspberry Pi to the ground line of the breadboard.

No soldering is needed; jumper wires are enough.
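To sanity-check this wiring from Python, a minimal sketch along the following lines can help. The pin numbers are the GPIO numbers listed above; the helper names and the RPi.GPIO-style `gpio` argument are my own inventions, so treat this as a sketch rather than the exact code running on this robot:

```python
# Hypothetical helper for the wiring above. Pin numbers are the BCM GPIO
# numbers from this post; RPi.GPIO is assumed to be available on the Pi.

# Rear motor (forward/backward) on one half of the L293D,
# front motor (left/right steering) on the other half.
REAR = {"enable": 11, "in1": 9, "in2": 10}    # L293D pins 1, 2, 7
FRONT = {"enable": 25, "in1": 23, "in2": 24}  # L293D pins 9, 10, 15

def motor_levels(direction):
    """Map a direction to (in1, in2, enable) logic levels for one motor."""
    levels = {"forward": (1, 0, 1), "backward": (0, 1, 1), "stop": (0, 0, 0)}
    return levels[direction]

def drive(gpio, motor, direction):
    """Apply the levels using any RPi.GPIO-compatible module passed as gpio."""
    in1, in2, en = motor_levels(direction)
    gpio.output(motor["in1"], in1)
    gpio.output(motor["in2"], in2)
    gpio.output(motor["enable"], en)
```

On the Pi itself you would pass the real module: `import RPi.GPIO as GPIO`, call `GPIO.setmode(GPIO.BCM)`, set each pin as an output with `GPIO.setup(pin, GPIO.OUT)`, and then `drive(GPIO, REAR, "forward")`.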

Software:

I used WebIOPi, since it provides a simple interface for both video streaming and controlling the motors from just a browser.

I followed the instructions from this blog for installing it; you will find PDFs of The MagPi issues 9 and 10 there. I adapted the code to my RC-car chassis. Also, for some strange reason, I was not able to get my webcam working with MJPG-streamer.

So I installed motion, which is very easy to use; I followed the instructions from this tutorial for installing it. I have attached my files for controlling the RC car here.

I also set-up the wifi on my Raspberry Pi using this guide.

Working:

After installing WebIOPi, go to the folder on the Raspberry Pi where you downloaded the attached files and run the following command

sudo python cambot.py

Then open a browser on any device on your network and go to

http://<ip address of raspberry pi>:8001

replacing "<ip address of raspberry pi>" with the actual IP address of your Pi. You should see the video stream and four buttons to control the robot.

Some tips:

1) Please keep your Raspberry Pi software updated. My old installation of Raspbian drove me crazy; I had to reinstall the latest version to make my webcam work.

2) Increase the frame rate in the motion or MJPG-streamer software to get better streaming. Refer to this guide.

3) Make sure all the connections are made according to the GPIO (BCM) numbers, not the physical pin numbers on the board.

4) If you are building the robot from scratch, divide the project into parts. Making all the software work together on the Pi was the most difficult step for me.

5) Save all your Pi's settings by cloning your SD card, using this guide.

Further improvements:

1) Make it autonomous using ultrasonic sensors

2) Add Bluetooth control (both from mobile devices and a PS3 controller)

3) Use a servo motor to rotate the webcam, so that we can see a wider area

4) Object tracking using OpenCV

5) Use capacitors for safer connections

References:

http://wolfpaulus.com/jounal/embedded/raspberrypi_webcam/

http://pingbin.com/2012/12/raspberry-pi-web-cam-server-motion/

http://rbnrpi.wordpress.com/project-list/setting-up-wireless-motion-detect-cam/

http://trouch.com/2013/03/04/webiopi-in-the-magpi-cambot-tutorial/

http://pdwhomeautomation.blogspot.co.uk/2012/11/raspberry-pi-powered-lego-car.html

http://www.instructables.com/id/Create-an-internet-controlled-robot-using-Livebots/step5/Get-the-webcam-streamer-for-Raspberry-Pi/


http://sirlagz.net/2013/02/12/quickie-getting-motion-working-on-the-raspberry-pi/

http://www.robotplatform.com/howto/L293/motor_driver_1.html

http://luckylarry.co.uk/arduino-projects/control-a-dc-motor-with-arduino-and-l293d-chip/

Inspiring, Machine learning, Raspberry Pi, Robotics

Raspberry Pi Spotter

 


What is it?

Raspberry Pi Spotter recognizes features in a given image. It analyses images taken from a webcam connected to the Raspberry Pi and tells us what is in them. Please see the Jetpac video in my previous post.

Why is it important?

The importance of this software is that it does all the image analysis offline. One might wonder how the Raspberry Pi, with so little CPU power, can do this. As I explained in the previous post, the Pi does it through its powerful GPU.

I thought it would be really cool to implement this on a simple robotic platform: the robot goes around and keeps informing us of what it sees in its surroundings, just like in the video shown by Jetpac.

What did I do ?

SP_A0030

So I constructed a basic robot from an RC car and connected a webcam to the Pi (construction details and better pics will follow in the next post). Then I installed WebIOPi to get the video stream and implemented the browser interface shown in the pic above. Now the robot can be controlled from any device with a browser. I will upload a demonstration video soon.

How does it work?

The arrow buttons on the interface control the robot's movement. The video is streamed just like in the cambot developed by Eric (trouch). The yellow button above the video stream triggers the Deep Belief software.

When we click the spotter button, WebIOPi looks up the latest image captured by the webcam, and Deep Belief finds the features in that image. The results are then displayed in the text box above the spotter button.
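The spotter flow can be sketched as a WebIOPi macro. Everything concrete here is an assumption on my part: the image directory, the path to the deepbelief binary and the macro name are placeholders, and the output parser simply follows the "score label" lines shown in the previous post.

```python
import glob
import os
import subprocess

try:
    import webiopi
    macro = webiopi.macro          # register the function as a WebIOPi macro
except ImportError:                # lets the parser be tested off the Pi
    def macro(func):
        return func

IMAGE_DIR = "/home/pi/motion"      # where motion writes captures (assumed)
DEEPBELIEF = "./deepbelief"        # DeepBeliefSDK example binary (assumed)

def top_labels(output, n=3):
    """Parse 'score label' lines from deepbelief and return the n best labels."""
    scored = []
    for line in output.splitlines():
        parts = line.split()
        if len(parts) >= 2:
            scored.append((float(parts[0]), " ".join(parts[1:])))
    return [label for _, label in sorted(scored, reverse=True)[:n]]

@macro
def spot():
    """Run deepbelief on the newest webcam frame and return the top labels."""
    images = glob.glob(os.path.join(IMAGE_DIR, "*.jpg"))
    if not images:
        return "no image yet"
    latest = max(images, key=os.path.getmtime)
    out = subprocess.check_output([DEEPBELIEF, latest]).decode()
    return ", ".join(top_labels(out))
```

The browser side would then invoke the macro through WebIOPi's JavaScript helper, something like `webiopi().callMacro("spot", [], callback)`, and write the returned string into the text box.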

How to install it?

In my previous post, I explained how to install the Deep Belief network on the Raspberry Pi. You can find instructions for installing WebIOPi here.

I have uploaded my code to GitHub. Note that the code is very rough at the moment. It will work even if you do not have a robot; in that case the arrow buttons do nothing, but the video feed will still be analyzed, as in the original Jetpac video.

First, download cambot.py and index.html into the folder where the Deep Belief software was installed earlier. We also need to configure motion so that it streams images directly into that folder. I will write detailed instructions, with videos, when I get more time.

Comments:

The results are not that great. One reason might be that I am not looking at real-life objects; anyway, I am not concerned about the quality of the results yet. Also, my coding skills are very limited. I hope people with better knowledge of Python, in particular of WebIOPi and Deep Belief, can make it work even better. One concern is that my Pi has only 256 MB of RAM, so it keeps freezing under load. A 512 MB Pi would probably give a smoother experience.

Credits:

I would like to thank Pete Warden (CTO of Jetpac) and Toshi Bass (on the WebIOPi Google group) for helping me throughout this project. I am also thankful to the folks at Stack Overflow for their help with my Python questions.

 


Machine learning, Raspberry Pi, Robotics

Machine Learning with Raspberry Pi

Although machine learning sounds like a high-tech term, we come across it every day without knowing it. For example, tasks such as filtering spam mail and automatically tagging Facebook photos are accomplished by machine learning algorithms. In recent years a new area of machine learning, known as deep learning, has been getting a lot of attention as a promising route to achieving artificial intelligence.

Until recently, deep learning was confined to big data centers. This is because the technology requires huge data sets, which only big data-mining firms such as Google, Facebook and Microsoft have access to. To put this technology in everyone's hands, a startup called Jetpac has opened up their deep learning technology to anyone with a computing device (check out their app). This is exciting because many people have mobile phones, which pack considerable computing power. The above video shows what such democratization of technology can do.

Coming to the Raspberry Pi: it has roughly 20 GFLOPS of computing power (almost the same as the Parallella board offers), thanks to its GPU. After Broadcom released the documentation for the GPU, Pete Warden did a great job porting his Deep Belief image recognition SDK to the Raspberry Pi. Today, after seeing this post on the Raspberry Pi blog, I followed his instructions and successfully ran the first example on my Pi.

Instructions to install Deep Belief on Raspberry Pi

The algorithm requires at least 128 MB of RAM dedicated to the GPU. To allocate that, we need to edit /boot/config.txt, using the following command

sudo nano /boot/config.txt

Then add the following line at the end of  /boot/config.txt

gpu_mem=128

Save and exit the editor. Then reboot the Raspberry Pi to get a 'clean' area of memory.

sudo reboot

Install git by using the following command

sudo apt-get install git

We are now ready to install Deep Belief. Just follow the instructions below

git clone https://github.com/jetpacapp/DeepBeliefSDK.git
cd DeepBeliefSDK/RaspberryPiLibrary
sudo ./install.sh

That's it; you have installed one of the best machine learning algorithms on the Pi. To test whether everything is working, run the following commands.

cd ../examples/SimpleLinux/
make
sudo ./deepbelief 

If everything goes well, you should see the following output

0.016994    wool
0.016418    cardigan
0.010924    kimono
0.010713    miniskirt
0.014307    crayfish
0.015663    brassiere
0.014216    harp
0.017052    sandal
0.024082    holster
0.013580    velvet
0.057286    bonnet
0.018848    stole
0.028298    maillot
0.010915    gown
0.073035    wig
0.012413    hand blower
0.031052    stage
0.027875    umbrella
0.012592    sarong

Pete Warden has explained on GitHub how to use this algorithm in various applications. I would like to use it in my robotics project, so that my robot can recognize objects around it, just like the Romo in the above video. I plan to do this with OpenCV.

Note:

If you don't allocate sufficient RAM to the GPU, you may get the following error.

Unable to allocate 7778899 bytes of GPU memory
mmap error -1

References:

https://github.com/jetpacapp/DeepBeliefSDK/tree/master#getting-started-on-a-raspberry-pi

http://petewarden.com/2014/06/09/deep-learning-on-the-raspberry-pi/

http://www.raspberrypi.org/gpgpu-hacking-on-the-pi/

http://rpiplayground.wordpress.com/

Raspberry Pi, Robotics

Raspberry Pi Notes

Hi,

Today I am going to share some of the commands I use most frequently with the Raspberry Pi.

Change password:


For safety purposes, it is important to change the default password.

Just type the following command; it will bring up the change-password option.

sudo raspi-config

You will be asked to type the new password twice.

 

Update and upgrade Raspberry pi:

sudo apt-get update

sudo apt-get upgrade

sudo apt-get install rpi-update && sudo rpi-update

The first two commands update the software packages.

The last command installs and runs rpi-update, which updates the firmware.

Look into the below link for more details.

http://www.raspberrypi.org/documentation/raspbian/updating.md

 

Save some disk space:

To see the disk usage, run

df -h

To clean out archived package files and free up some space, use the following command

sudo apt-get autoclean

 

Safely shutdown Raspberry pi:

The following command will safely shut down your Pi

sudo shutdown -h now

 

Clone your sdcard:


Once you have downloaded and configured your preferred settings, it is best to save them permanently. For this purpose, we use Win32 Disk Imager.

Insert the SD card into your computer and start Win32 Disk Imager. In the image file field, give the path on your computer where you want to save the settings as a .img file, and select your SD card's drive.

Now just press the "Read" button. That's it; you have cloned your SD card.

Next time your SD card gets corrupted, or you want to restore your previous settings, just insert the SD card and choose the cloned .img file in Win32 Disk Imager.

Set up a VNC server:

I run my Raspberry Pi headless, so it is very handy to see the Pi's desktop through a VNC viewer. The following command installs the TightVNC server on the Pi.

sudo apt-get install tightvncserver

The first time, you need to set up a password, so run the following command

/usr/bin/tightvncserver

On a Windows computer, I installed the TightVNC viewer and Xming to see the Pi's desktop.

For more detailed steps, see the excellent tutorial in the following guide

http://www.penguintutor.com/linux/tightvnc


Robotics

Raspberry Pi Robot in action

In this post I would like to share a few pics and a video of the robot built from a modified RC car. For the time being I am controlling only the forward and backward motion of the robot; although I connected the left/right steering motor to the Pi, I did not shoot video of that control. We can give the robot intelligence by connecting sensors such as ultrasonic or IR. I will do that later, if time permits. I also hope to shoot a better video in the future!

SP_A0024

SP_A0025

SP_A0030
