DIY, Electronics, Inspiring, LEDs, Lux, Robotics

DIY Vein Viewing Device

The problem with veins:

I am afraid of getting injections, as they can be a painful experience. Unfortunately, for some people it is even worse: their veins are not clearly visible, so nurses have to insert the needle by guessing the vein's position. Sometimes, after three or four trials, they have to change the spot and start probing again for the invisible veins. It is a particular problem for newborn babies. Just witnessing those needle punctures is painful.

Simple Solution:

One of the simplest solutions to this problem is to illuminate the veins with powerful LEDs. It relies on the fact that blood changes colour depending on whether it is carrying oxygen or not, and this change becomes easily noticeable when the veins are illuminated with red LEDs. Exploiting this fact, a company called Veinlite made a device that consists of just LEDs (red and orange) and a battery to power them. The device has been proven to work, but it is patented and costs $200 to $300. There are clones of this device, but even they cost around $100, so neither is particularly affordable for most hospitals.

Open source version:

However, a nice guy called Alex made an open-source version of the Veinlite and kindly shared the design files along with instructions. Recently, a friend of mine was suffering from this problem of not-so-visible veins; after repeated needle attempts, the hand was swollen and really painful. So I attempted to build this vein viewer with the help of Uday. We made just one small change to Alex's original design: a small potentiometer to adjust the brightness of the LEDs. We also didn't have the exact switch used by Alex, so we adjusted the hole for the switch in the design accordingly. Below are a few pictures showing the build process and the initial tests; I hope the device will be useful to my friend. In the future, we will try the rechargeable-battery version. Please see the videos below to see how it works.
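A side note on the potentiometer: it simply varies the series resistance in the LED circuit, and hence the LED current and brightness. As a purely illustrative calculation (the 9 V supply and the ~2 V red-LED forward drop below are assumed values, not taken from Alex's design):

$$I = \frac{V_\text{supply} - V_f}{R} = \frac{9\ \text{V} - 2\ \text{V}}{350\ \Omega} \approx 20\ \text{mA},$$

so sweeping the potentiometer over a few hundred ohms covers most of a typical red LED's usable brightness range.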


Top View of our 3D printed VeinViewer


Bottom View of our 3D printed Vein Viewer


Vein Viewer during the soldering phase


First version, printed in the wrong colour of material.

References:

http://www.instructables.com/id/3d-Printed-Medical-Vein-Finder/

http://www.instructables.com/id/How-to-make-an-affordable-Vein-Finder-for-use-d/

https://3dprint.com/11056/3d-printed-vein-finders/

ARDUINO, DIY, Electronics, Inspiring, LEDs, photography, Raspberry Pi, Robotics

Experiments with a Light Meter

Why am I interested in light meters?

When I was working in Scotland, I came across photodynamic therapy (PDT), which uses light-sensitive drugs to kill cancer cells. In the entire UK there are only two PDT centres (as far as I know), one of which is in Dundee. Visiting the PDT centre in Dundee, I learnt that after applying the PDT drugs, doctors ask patients to wait in sunlight for two hours. There is no particular reason for exactly two hours of exposure, so it is not possible to know how much light dose a patient has actually received. To address this problem, the PDT centre at Dundee measured sunlight across the UK and Ireland and suggested that cheap lux meters could be used to measure the required light dose. I met one of the PIs and discussed this in detail.

The problem with cheap light meters:

However, most commercially available cheap lux meters only give an instantaneous measure of light. They were originally developed for photographers, to judge the lighting of a shot, and for building managers, to check the lighting in a room. The PDT application, on the other hand, requires the lux values to be logged and aggregated to know whether the required light dose has been reached. I think the only way to achieve that is to connect the lux meter to a microcontroller and stream the values to a smartphone. For that I am going to use a cheap lux meter, which I can confidently modify after reading this blog post.

What I did:

I ordered a lux meter with the brand name "Ceto" from the same vendor suggested in the blog post mentioned above. I identified the pins to tap to get the lux values out; these are the pins on the amplifier. I soldered wires to them to read the voltage. Effectively, the lux values are converted to voltage values inside this lux meter: for example, 290 lux appears as 0.288 V. I connected these wires to a multimeter to see the voltage values.


Guts of the LUX meter


Zooming inside


Red wire is signal


Black wire is ground


Hot glue to keep the wires in place


Made a hole in the case to let the soldered wires come out, so that I can feed them into a multimeter


More hot glue to fix the wires to the case


Connected the wires to the multimeter; the light values appear on the multimeter as voltage values.

In the next step, I will connect the lux meter to an Arduino Uno and a Bluetooth module, so that it is possible to record the aggregated lux values over time and determine the light dose for PDT treatment. I will write up those details in another post.
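In the meantime, here is a rough sketch of what the Arduino side could look like. The analog pin, the one-second sampling interval and the ~1 lux-per-millivolt scaling (read off the single 290 lux ≈ 0.288 V observation above) are all assumptions rather than calibrated values:

// Rough sketch: read the tapped lux-meter output on A0 and accumulate a light dose.
// The ~1 lux per millivolt scaling comes from the single 290 lux ~ 0.288 V observation
// above; it is an assumption, not a calibration.

const int LUX_PIN = A0;                 // signal wire from the lux meter amplifier
const unsigned long INTERVAL_MS = 1000; // sample once per second

float doseLuxSeconds = 0.0;             // accumulated light dose in lux-seconds

void setup()
{
  Serial.begin(9600);
}

void loop()
{
  // 10-bit ADC with a 5 V reference: counts -> volts
  float volts = analogRead(LUX_PIN) * (5.0 / 1023.0);
  float lux = volts * 1000.0;           // assumed ~1 lux per millivolt

  doseLuxSeconds += lux * (INTERVAL_MS / 1000.0);

  Serial.print("lux: ");
  Serial.print(lux);
  Serial.print("   dose (lux*s): ");
  Serial.println(doseLuxSeconds);

  delay(INTERVAL_MS);
}

Streaming the same readings over a serial Bluetooth module is then just a matter of printing them to a second serial port.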

P.S.: This is just one of my hobby projects, not related to my research.

DIY, Electronics, Inspiring, Robotics

DIY Toy Centrifuge

Why do I like centrifuges?

Whenever I see a motor, I wonder why it shouldn't be turned into a centrifuge. I like the centrifuge as a scientific instrument, especially after seeing lab-on-DVD systems for diagnosing diseases. Recently I came across Manu Prakash's Paperfuge, where a whirligig/buzzer toy was modified into a high-speed centrifuge that needs no electricity. Although I like the idea, it still takes more than 15 minutes to separate blood for any useful analysis, such as malaria detection. Maybe there are better ways to improve the existing technology and get a centrifuge that is cost-effective, functional, and maybe a little bit funny. The latest open-source models use the brushless motors found in drones, and I would like to try that idea too. However, one has to spend at least $30 to build such an open-source centrifuge. I would like to make a low-cost, fun, toy-type centrifuge, so that we can teach kids about centrifuges without spending so much.

How I made One:

I took a brushless DC motor from a CPU cooling fan and attached a conical piece of plastic that I cut from a water bottle. It looks good, and I get decent speeds from a power bank or a computer USB port. Have a look at the videos, where I tried to separate milk, which turned out not to be possible with this toy centrifuge. I am sure we can separate some suspensions, which I will try soon. So far the plus points of my design are that it requires no soldering and no 3D printing. I am planning to enclose it in a cardboard box for safety reasons, although the current version doesn't spin fast enough to do any damage.


Top part of a water bottle


Attach the cap to a PC fan


Glue to attach the cap


Finally, PC centrifuge is here

Future Plans

I am trying to make a centrifuge that can reach 16,000 rpm, using a system similar to the one above. I have already designed a 3D-printed holder for the tubes, and I will post an update about it soon. Until then, enjoy the toy centrifuge video.
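Incidentally, to get a sense of what 16,000 rpm would mean in terms of g-force, the relative centrifugal force follows the standard formula (the 5 cm rotor radius below is just an assumed number for illustration, not a measurement of my design):

$$\mathrm{RCF} = 1.118 \times 10^{-5}\, r_{\mathrm{cm}}\, N^{2} \approx 1.118 \times 10^{-5} \times 5 \times 16{,}000^{2} \approx 14{,}000 \times g,$$

where $r_{\mathrm{cm}}$ is the radius in centimetres and $N$ is the speed in rpm — well above what typical blood-separation protocols call for.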

ARDUINO, Inspiring, Robotics

Can we 3D print Human beings?

Introduction:

Unless you have been hiding in a cave, you have probably heard about 3D printers. If not: 3D printing is just one step beyond 2D printing. In normal 2D printing, information is printed on a surface (usually a piece of paper). In 3D printing, we construct three-dimensional objects by printing one layer on top of another. Using this technology, we can print plastic toys, concrete houses, custom-shaped chocolates (ah, chocolates :p), cars and even rocket engines. Most of the time, we see hobbyists using 3D printers to make cool plastic models.

What can bio-scientists do with 3D printers?

They can use 3D printers to print artificial organs, using single (live) cells as building blocks. Recent examples of 3D-printed brain cells, heart embryos, prosthetic skulls and legs are evidence of the exponential growth of 3D printing technology.

That immediately brings up the question: can we put together 3D-printed tissues of heart, brain, skull, skeleton and nerves to make, or rather 3D print, a complete animal, or even a human being?

Does God permit us to do so?

Most religious scriptures tell us that we are all constructed from basic elements such as soil, water and air. The Bible mentions three elements, the Bhagavad Gita five, and the Quran just two. So the point is that it is somehow possible to copy Nature, if not God.

So can we do it?

Let's do the math first. A human body consists of about 10^14, or 100 trillion, single cells, and roughly 50 million cells die and are replaced every second.

Little philosophical pause

That means parts of us die and are reborn every day. Can you appreciate the beauty of God's creation for a moment? What complex giants we are!

Let’s come back to science

Using the current 3D printing technology developed by Prof. Boris Chichkov in Germany, we can print about 10^8 single cells per second. At that rate, a complete human body would take about 10^6 seconds, roughly eleven and a half days, to print. Hold your breath: Prof. Chichkov claims it is possible to improve the technology to print 10^10 single cells per second, which would bring the time to 3D print a man or woman down to about 2 hours and 47 minutes. I don't know about you, but my heart skipped a beat when I looked at those numbers for the first time. So many philosophical questions were blowing my mind, and I don't have answers for any of them.
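For reference, the arithmetic behind these estimates is simple division:

$$\frac{10^{14}\ \text{cells}}{10^{8}\ \text{cells/s}} = 10^{6}\ \text{s} \approx 11.6\ \text{days}, \qquad \frac{10^{14}\ \text{cells}}{10^{10}\ \text{cells/s}} = 10^{4}\ \text{s} \approx 2\ \text{h}\ 47\ \text{min}.$$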

Why do you care?

Because printing individual organs, if not the entire body, has some immediate benefits:

  • We can print organs such as hearts and livers to save many lives.
  • For example, in India 200,000 people need a new kidney every year and 100,000 need a new liver. Printing organs could stop people dying due to the lack of organ donors.
  • In many places, especially developing countries, organ replacement is a costly affair and involves all kinds of bad things, such as human trafficking. We could avoid that mess by simply printing the required organ, as easily as buying an injection in a shop.
  • 3D-printed bodies could be used as pedagogic tools to teach biology and anatomy to students. Imagine saving the lives of the many frogs and rats used in labs for learning biology.
  • We could rapidly test drugs on artificial organs before going to trials on rats and human beings, saving serious money and time in developing new drugs.

 

Why do I care?

Because, apart from the reasons above, I have recently been building a syringe pump. Using it, I am going to make a bioprinter, not to print organs but to print biostructures such as… I don't know yet. But why? I am just curious. How? I will tell you soon. Is it my main research? Hell no, it's just one of my hobby projects. So it will take some time before you see pictures and videos of me doing bioprinting.

Take away message:

Yes, at least in principle, it is possible to 3D print human beings, or animals in general. Like any technology, it is more useful than harmful when used in the right spirit.

ARDUINO, Raspberry Pi, Robotics

My experiments with a stepper motor

It's a different game

At the start of the project, I thought it would be fairly straightforward to hook a NEMA 17 stepper motor to an Arduino or a Raspberry Pi. After all, I had experience connecting a 28BYJ-48 stepper motor to both a Raspberry Pi and an Arduino without any fuss. The beauty of the 28BYJ-48 is that it comes with its own driver board, all included for just $3 (thanks to China). That gave me confidence: even if I burnt it, I wouldn't mind.


But..

now I have a NEMA 17 motor, which is mostly used in 3D printers and CNC machines. It costs around $40 (when bought from a well-known vendor, not from eBay). The problem is choosing a correct driver board, as this motor does not come with its own driver. Moreover, an Arduino or Raspberry Pi cannot directly supply enough juice to run these motors.

Choosing a correct driver board

After stumbling upon a few options, I found that Pololu's A4988 driver is the most appropriate (thanks to Google). I know people run NEMA 17 motors with L293D chips (I used them a lot in my robotics projects to drive DC motors), but I was really worried about the shiny (and costly) new motor, and I definitely didn't want to burn my NEMA 17. So I finally ordered an A4988 driver board.

And I fried it..

I followed an excellent connection diagram (and the Arduino code given with it) from this post on the Arduino forums; the same diagram is posted in almost every hacker blog and forum. I powered the board with a 12 V wall adaptor, thinking that I was giving the motor enough power. But the motor did not rotate at all. I checked my connections, rechecked… and rechecked.

A4988 wiring diagram

 

First problem:

My 12 V wall adaptor did not supply enough current. I don't know how I failed to check this minor detail. Anyhow, I replaced the power source with an 8× AA battery pack, and the motor rotated. I rejoiced and wanted permanent connections, instead of messing up everything on a breadboard with loose connections.

Second problem:

When I soldered everything onto a prototype board, for some odd reason that I still cannot explain, the motor did not rotate at all. I was frustrated: the first sign of failure. I went back to the breadboard, abandoning the prototype board, and reconnected the motor pins, swapping the connections quite a number of times. I hate my hacker mind for tempting me to do this.

Because..

There is a small warning that says we should not plug or unplug a stepper motor while the driver board is powered. I learnt this only after carefully reading a few forum posts. What happens if you ignore the warning? Simply put, your driver board will be fried. And that is exactly what happened to me.

How I solved it:

I simply ordered a new board, which arrived in just two days thanks to an eBay UK supplier. With the new driver I took the utmost care and connected everything according to the circuit diagram above. My motor was finally running.

But..

still there was a small problem: the motor was vibrating while rotating, nowhere near smooth running. I googled again, patiently. The suggested solution was to interchange the connections going to the motor. I checked the spec sheet of the motor and the colour code of NEMA 17 wires, and read quite a few forums; the result is that interchanging the motor pin connections simply won't work. Back to square one. I suspected the Arduino code might be the problem.

Yes it is..

I then switched to an excellent stepper motor library, AccelStepper, from the good guys at Adafruit. I installed the library and it worked like a charm: my stepper motor now runs smoothly. After spending almost four late nights, I figured it out. I am ashamed to have wasted so much time on such a simple task, but that is how a newbie scientist learns.

Lessons learned:

  1. Never plug or unplug the motor connections while the driver board is powered.
  2. Don't blindly use a high-voltage power source without checking its current supply capacity.
  3. It is better to use an 8× AA battery pack than a fancy 9 V battery.
  4. If you have already burnt a driver board, that's alright. Don't waste time trying to figure out what went wrong with the connections; just order a new board. That will save a lot of time, your sanity and a few hairs on your head.
  5. Once you have figured out that your motor works, use the AccelStepper library rather than the raw Arduino code. It will make your motor, and your life, easy and smooth.

Code:

Simple Arduino code to test the running of the stepper motor

//simple A4988 connection
//jumper reset and sleep together
//connect  VDD to Arduino 3.3v or 5v
//connect  GND to Arduino GND (GND near VDD)
//connect  1A and 1B to stepper coil 1
//connect 2A and 2B to stepper coil 2
//connect VMOT to motor power source (battery pack + term)
//connect GND to motor power source (battery pack - term)


int stp = 13;  //connect pin 13 to step
int dir = 12;  // connect pin 12 to dir
int a = 0;     //  gen counter

void setup() 
{                
  pinMode(stp, OUTPUT);
  pinMode(dir, OUTPUT);       
}


void loop() 
{
  if (a <  200)  //sweep 200 step in dir 1
   {
    a++;
    digitalWrite(stp, HIGH);   
    delay(10);               
    digitalWrite(stp, LOW);  
    delay(10);              
   }
  else 
   {
    digitalWrite(dir, HIGH);
    a++;
    digitalWrite(stp, HIGH);  
    delay(10);               
    digitalWrite(stp, LOW);  
    delay(10);
    
    if (a>400)    //sweep 200 in dir 2
     {
      a = 0;
      digitalWrite(dir, LOW);
     }
    }
}

 

Now, to get smooth rotation of the motor, I used the following code, which makes use of the AccelStepper library.


#include <AccelStepper.h>

AccelStepper Stepper1(1,13,12); //use pin 12 and 13 for dir and step, 1 is the "external driver" mode (A4988)
int dir = 1; //used to switch direction

void setup() {
  Stepper1.setMaxSpeed(3000); //set max speed the motor will turn (steps/second)
  Stepper1.setAcceleration(13000); //set acceleration (steps/second^2)
}

void loop() {
  if(Stepper1.distanceToGo()==0){ //check if the motor has already finished its last move
    Stepper1.move(1600*dir); //set next movement to 1600 steps (if dir is -1 it will move -1600 -> opposite direction)
    dir = dir*(-1); //negate dir to make the next movement go in opposite direction
    delay(1000); //wait 1 second
  }
  
  Stepper1.run(); //run the stepper. this has to be called over and over again to continuously move the stepper
}


Credits

http://forum.arduino.cc/index.php?PHPSESSID=c4am9ddu9ol3f8i14vu6o3aul7&topic=133894.msg1449404#msg1449404

http://electronics.stackexchange.com/questions/95238/stepper-motor-rotating-but-vibrating

http://www.linengineering.com/resources/wiring_connections.aspx

https://cdn-reichelt.de/documents/datenblatt/C300/QSH4218_manual.pdf

http://www.robotdigg.com/news/26/Reprap-Stepper-Motor

http://semiengineering.weebly.com/blog/stepper-driven-by-arduino-a4988

https://github.com/adafruit/AccelStepper

Inspiring, Machine learning, Raspberry Pi, Robotics

Raspberry Pi Spotter

 

Browser interface of the Raspberry Pi Spotter

What is it?

Raspberry Pi Spotter basically recognizes features in a given image. It analyses images taken from a webcam connected to the Raspberry Pi and tells us what is in the image. Please see the video of Jetpac in my previous post.

Why is it important?

The importance of this software is that it does all the image analysis offline. One might wonder how the Raspberry Pi, with its little CPU power, can do this. As I explained in the previous post, the Pi does it through its powerful GPU.

I thought it would be really cool to implement it on a simple robotic platform: a robot that goes around and keeps informing us of what it sees in its surroundings, just like in the video shown by Jetpac.

What did I do?


So I constructed a basic robot from an RC car and connected a webcam to the Pi (the construction details and better pictures will follow in the next post). I then installed WebIOPi to get the video stream and implemented the browser interface shown in the picture above. Now the robot can be controlled from any device with a browser. I will upload a demonstration video soon.

How does it work?

The arrow buttons on the interface control the robot's movement. The video is streamed just like in the cambot developed by Eric / trouch. The yellow button above the video stream invokes the Deep Belief software.

When we click on this spotter button, WebIOPi looks for the latest image captured by the webcam and Deep Belief finds the features in that image. The results are then displayed in the text box above the spotter button.

How to install it?

In my previous post, I explained how to install the Deep Belief network on the Raspberry Pi. You can find the instructions to install WebIOPi here.

I have uploaded my code to GitHub. Note that the code is really rough at the moment. It will work even if you do not have a robot; in that case the arrow buttons do nothing, but the video feed will still be analysed, as in the original Jetpac video.

First, download the files cambot.py and index.html into the folder where the Deep Belief software was installed previously. We also need to configure the settings of motion so that it streams the images directly into that folder. I will write detailed instructions, with videos, when I get more time.

Comments:

The results are not that great. One reason might be that I am not pointing the camera at real-life objects. Anyway, I am not concerned about the quality of the results yet. Also, my coding skills are very limited; I hope people with better knowledge of Python, in particular of WebIOPi and Deep Belief, can make it work even better. One concern is that my Pi has only 256 MB of RAM, so it keeps freezing under load. Maybe a 512 MB Pi would give a smoother experience.

Credits:

I would like to thank Pete Warden (CTO of Jetpac) and Toshi Bass (on the WebIOPi Google group) for helping me throughout this project. I am also thankful to the folks at Stack Overflow for their help with my Python doubts.

 


Machine learning, Raspberry Pi, Robotics

Machine Learning with Raspberry Pi

Although machine learning sounds like a high-tech term, we come across it every day without knowing it; tasks such as filtering spam mail and automatically tagging Facebook photos are accomplished by machine learning algorithms. In recent years a new area of machine learning, known as deep learning, has been getting a lot of attention as a promising route towards artificial intelligence.

Until recently, deep learning was confined to big data centres, because the technology requires huge data sets, which only big data-mining firms such as Google, Facebook and Microsoft have access to. To put this technology in everyone's hands, a startup called Jetpac has given access to its deep learning technology to anyone with a computing device (check their app). This is exciting because so many people have mobile phones with plenty of computing power. Just see what can be done with this kind of democratization of technology in the above video.

Now, coming to the Raspberry Pi: it offers roughly 20 GFLOPS of computing power (almost the same as the Parallella board), thanks to its GPU. After Broadcom released the documentation for the GPU, Pete Warden did a great job of porting his Deep Belief image recognition SDK to the Raspberry Pi. Today, after seeing this post on the Raspberry Pi blog, I followed his instructions and successfully ran the first example on my Pi.

Instructions to install Deep Belief on Raspberry Pi

This algorithm requires at least 128 MB of RAM dedicated to the GPU. To allocate that, we need to edit /boot/config.txt, using the following command:

sudo nano /boot/config.txt

Then add the following line at the end of /boot/config.txt:

gpu_mem=128

Save and exit the editor. Now reboot the Raspberry Pi to get a 'clean' area of memory:

sudo reboot

Install git by using the following command

sudo apt-get install git

We are now ready to install Deep Belief. Just follow the instructions below:

git clone https://github.com/jetpacapp/DeepBeliefSDK.git
cd DeepBeliefSDK/RaspberryPiLibrary
sudo ./install.sh

That's it, you have installed one of the best machine learning algorithms on the Pi. To test whether everything is working, run the following commands:

cd ../examples/SimpleLinux/
make
sudo ./deepbelief 

If everything goes well, you should see the following output

0.016994    wool
0.016418    cardigan
0.010924    kimono
0.010713    miniskirt
0.014307    crayfish
0.015663    brassiere
0.014216    harp
0.017052    sandal
0.024082    holster
0.013580    velvet
0.057286    bonnet
0.018848    stole
0.028298    maillot
0.010915    gown
0.073035    wig
0.012413    hand blower
0.031052    stage
0.027875    umbrella
0.012592    sarong

Pete Warden has explained on GitHub how to use this algorithm for various applications. I would like to use it for my robotics project, so that my robot can recognize the objects around it, just like the Romo in the above video. I am planning to do this with OpenCV.
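For anyone who wants to call the SDK from their own program rather than through the prebuilt deepbelief binary, the SimpleLinux example boils down to something like the sketch below. I am writing the function names, the jetpac.ntwk network file and the paths from memory, so treat this as a rough sketch and check libjpcnn.h and the example source for the exact signatures:

// Minimal sketch of classifying one image with the Deep Belief SDK (libjpcnn).
// Paths and file names below are assumptions -- adjust them to your installation.
#include <stdio.h>
#include "libjpcnn.h"

int main(void)
{
  // Load the pre-trained network shipped with the SDK
  void* network = jpcnn_create_network("../../networks/jetpac.ntwk");
  // Load the image to classify (any JPEG/PNG file should do)
  void* image = jpcnn_create_image_buffer_from_file("lena.png");

  float* predictions;
  int predictionsLength;
  char** predictionsLabels;
  jpcnn_classify_image(network, image, 0, 0,
                       &predictions, &predictionsLength, &predictionsLabels);

  // Print every label with its probability, like the deepbelief example does
  for (int i = 0; i < predictionsLength; i++) {
    printf("%f\t%s\n", predictions[i], predictionsLabels[i]);
  }

  jpcnn_destroy_image_buffer(image);
  jpcnn_destroy_network(network);
  return 0;
}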

Note:

If you don't allocate sufficient RAM to the GPU, you may get the following error:

Unable to allocate 7778899 bytes of GPU memory
mmap error -1

References:

https://github.com/jetpacapp/DeepBeliefSDK/tree/master#getting-started-on-a-raspberry-pi

http://petewarden.com/2014/06/09/deep-learning-on-the-raspberry-pi/

http://www.raspberrypi.org/gpgpu-hacking-on-the-pi/

http://rpiplayground.wordpress.com/
