This project was done as the course project for the robotics class in the CST programme, DST, UIC.
The Motion Capture Robot Arm is a 3D printed robot arm controlled by Leap Motion. The basic logic is: Leap Motion captures the motion of the operator's hand, the fingers' vectors are converted into angles and sent to an Arduino, and the Arduino then sends control signals to the servos, which drive the fingers of the robot arm.
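The "vectors to angles" step boils down to the angle between two 3D direction vectors, which you can get from a dot product. A minimal sketch (function names are illustrative, not from the actual code):

```javascript
// Angle (in degrees) between two 3D direction vectors, via the dot product.
function angleBetween(a, b) {
  const dot = a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
  const len = (v) => Math.hypot(v[0], v[1], v[2]);
  const cos = Math.min(1, Math.max(-1, dot / (len(a) * len(b)))); // clamp for safety
  return (Math.acos(cos) * 180) / Math.PI;
}

// A straight finger points the same way as the hand (angle ≈ 0);
// a bent finger does not.
console.log(angleBetween([0, 0, -1], [0, 0, -1])); // ≈ 0
console.log(angleBetween([0, 0, -1], [0, -1, 0])); // ≈ 90
```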
There are two hardware parts in this project: the Leap Motion and the robot arm. The hardware work was mostly done by Garfield Wu.
Leap Motion is a device with two infrared cameras, launched in 2013. It can capture the motion of two human hands, and it provides well-encapsulated interfaces that help developers find the "hands" quickly.
The robot arm was 3D printed, and the model we used can be found here (a big thank-you to the author :)).
But we quickly found that the parts we made were hard to use. Due to the printer's limited precision, most of the parts we printed could not be used immediately, and prints often failed. Before we could assemble the arm, we had to sand the surfaces so that all parts could move smoothly (kind of :joy:).
The fingers are driven by servos installed inside the stand, and each finger is connected to a servo by fishing line. When we send angles to the Arduino, the standard Servo library converts them to PWM signals, and the servos turn to the requested angles.
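Under the hood, that conversion maps a servo angle (0–180°) to a PWM pulse width; the Arduino Servo library's default range is 544–2400 µs. A sketch of the mapping (in JavaScript for illustration, since our stack is Node.js):

```javascript
// Sketch of what the Arduino Servo library does internally: map a servo
// angle (0-180 degrees) to a PWM pulse width in microseconds.
// 544 and 2400 are the library's default MIN_PULSE_WIDTH / MAX_PULSE_WIDTH.
function angleToPulseWidth(angle, minUs = 544, maxUs = 2400) {
  const a = Math.min(180, Math.max(0, angle)); // clamp to the valid range
  return Math.round(minUs + (a / 180) * (maxUs - minUs));
}

console.log(angleToPulseWidth(0));   // 544
console.log(angleToPulseWidth(90));  // 1472
console.log(angleToPulseWidth(180)); // 2400
```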
The parameters for converting finger angles to servo angles are set differently for each servo, and we spent some time tuning them.
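A hypothetical shape for that per-servo calibration (the names and ranges below are assumptions for illustration, not the actual values we tuned): each servo gets its own linear map from the finger's bend range to the servo's usable range, since pulley geometry and line tension differ per finger.

```javascript
// Build a calibration function for one servo. All parameters are
// hypothetical examples; real values have to be tuned per finger.
function makeCalibration({ fingerMin, fingerMax, servoMin, servoMax }) {
  return (fingerAngle) => {
    const t = (fingerAngle - fingerMin) / (fingerMax - fingerMin);
    const clamped = Math.min(1, Math.max(0, t)); // never exceed the servo's range
    return Math.round(servoMin + clamped * (servoMax - servoMin));
  };
}

// Example: a finger that bends 0-90 degrees driving a servo over 10-170 degrees.
const indexServo = makeCalibration({ fingerMin: 0, fingerMax: 90, servoMin: 10, servoMax: 170 });
console.log(indexServo(45)); // 90
```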
On the software side, I looked at the code from acali (check it out here). He provided us with an easy way to finish this project.
We followed acali's idea and applied a tech stack based on Node.js, since I had already done a Node.js project, SE Retail Management System. The project applies the following frameworks to achieve its logic.
Since I have the most experience in software engineering, I was very happy to see this stack of libraries :).
Both LeapJS and Johnny-Five work on my computer: LeapJS gets the finger vectors, we convert them into finger and servo angles, and then send the angles over the serial port with the Johnny-Five library.
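A hedged sketch of the glue between the two libraries. The hand object below mimics LeapJS's shape (`hand.direction`, `finger.direction` are real LeapJS fields), and `Servo#to` is Johnny-Five's real API, but the wiring, pin numbers, and function names are illustrative assumptions, not our exact code:

```javascript
// Convert one LeapJS-style hand into five servo angles (0-180 degrees).
// Bend of each finger is approximated as the angle between the finger's
// pointing direction and the hand's pointing direction.
function handToServoAngles(hand) {
  const deg = (a, b) => {
    const dot = a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
    const len = (v) => Math.hypot(v[0], v[1], v[2]);
    return (Math.acos(Math.min(1, Math.max(-1, dot / (len(a) * len(b))))) * 180) / Math.PI;
  };
  return hand.fingers.map((f) => Math.round(deg(f.direction, hand.direction)));
}

// A fully extended hand: every finger parallel to the hand direction.
const openHand = {
  direction: [0, 0, -1],
  fingers: Array.from({ length: 5 }, () => ({ direction: [0, 0, -1] })),
};
console.log(handToServoAngles(openHand)); // [ 0, 0, 0, 0, 0 ]

// With Johnny-Five, each angle would then go to a servo, roughly:
//   const { Board, Servo } = require("johnny-five");
//   new Board().on("ready", () => {
//     const servos = [9, 10, 11, 5, 6].map((pin) => new Servo(pin)); // pins assumed
//     // inside the Leap.loop callback:
//     //   handToServoAngles(hand).forEach((a, i) => servos[i].to(a));
//   });
```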
At this point, the robot arm can follow exactly what the human hand does. Because all the components are decoupled, we can easily swap any of them out to achieve more functions. It can be entertaining.
Here are some components that can be switched or expanded.