Low-Cost Electronic Prosthesis with Artificial Intelligence (AI) Using a 3D Printer
Hikaru Shimada, summer break 2018
Abstract
This summer, I built a prosthetic hand using a 3D printer and Internet of Things (IoT) devices. It is intended as an alternative, for low-GDP countries, to the expensive models that already exist. Conventional prostheses are expensive because of their built-in software and mechanical parts, as well as the materials used. People in countries with a low gross domestic product (GDP) find the common commercial prostheses hard to afford; even most people in my home country (Japan) would have a difficult time paying for one. So we began to look for a way to create a cheap, automated, electronic prosthesis. To lower the price, we used a 3D printer, which prints objects in plastic. The benefit of building a prosthetic hand with a 3D printer is that any part can be replaced quickly and easily. Electronic components such as an Arduino and a Raspberry Pi control the hand, which keeps it simple and, again, inexpensive to repair. To automate the movements, we added object detection software that automatically selects the best way to grip a detected object. We hope this prosthesis can help people without a hand chase their dreams.
PROTOTYPE 7 - low-cost prosthetic hand with AI -
Introduction
Most practical prostheses cost over $15,000 and are ultimately expensive and hard to afford. There are many types of prostheses and robot hands built using 3D printers. The main reasons engineers build prostheses with 3D printers are to lower the price and to spare the user from welding or painting. The printer uses plastic, which also makes the prosthesis light. In addition, the user doesn't need to wait for shipments to arrive, which speeds up the build process. But most of the prostheses found on the internet are made as robots and are not designed for daily, practical use. After watching a YouTube video of kids making an extended arm, I became really interested in the project and built one out of cardboard. Unsurprisingly, it didn't work when I tried to grab an object. About a year later, I started to learn 3D computer-aided design (CAD) and built a model with it. One day, while browsing a website of 3D projects, I found a robot hand, and we thought we could build it with a 3D printer.
The goal of this project is to create a low-cost prosthetic hand for roughly under US$100. According to the report "Payment and Tasks of Myoelectric Wristbands in the Independence Support Law for Persons with Disabilities," 219 prostheses were ordered in Japan in 2010. Decorative prostheses made up 90%, active types 5%, work-use types 6%, and myoelectric prostheses only 2%. The main reason myoelectric prostheses account for only 2% is that users must obtain approval from the welfare office to receive funding, and approval is granted only when the stated need is judged sufficiently important.
Furthermore, this prosthesis could be supplied to lower-income countries such as Laos, Thailand, and Cambodia. According to "Thailand: Deadliest country for motorcyclists" by Yasmin Lee Arpon, 1.74 million motorcycles were bought there last year, and about 15 deaths a day are motorcycle related. From this data, we can tell that people there find motorcycles cheaper and easier than cars, and from the daily death toll we can estimate that there are also many severe injuries. After this research, we set two main goals: to build an automated prosthesis cheaply, and to automate actions such as grabbing an object. I think that by publishing this project, people who have lost a hand, or were born without one, can easily build the prosthesis and use the object detection software to automate its movements and make their lives easier.
Methods
In total, we built 7 prototypes. The first 2 were made to figure out the mechanism of a prosthesis and how it could be improved into a practical one.
The first 3 prototypes used strings to control the fingers. The benefit of string was that it made the prosthesis easy to build; the disadvantage was that once a certain amount of force was applied, the strings began to loosen, so we had to re-tension them. The way the fingers bent also wasn't natural. To fix these problems, we later switched to manual gears (which did not need calibration), which let us control each finger accurately and increased the pressure each finger could apply.
We started from the shape of a normal hand, but as we made more prototypes, we realized that a bone structure would free up much more space; this made the hand lighter and smaller. To automate the prosthesis, we added object detection with pre-trained datasets, running on a camera feed to detect objects. This makes it easier and quicker for a user to select an object.
PROTOTYPE E - low cost prosthetic hand -
PROTOTYPE 9 - prosthetic hand with AI -
PROTOTYPE 1
My first prototype used string and rubber hair bands to straighten the fingers. We created two holes on the back of each finger to thread the rubber band through, so the fingers would retract to their original position. However, the rubber in each finger varied in strength, which made the hand difficult to use. We then used micro servos, which were small yet strong enough to control the retraction of the fingers. After first using normal string, we switched to kite string, which was more durable.
PROTOTYPE 2
In Prototype 2, we used flex filament instead of the PLA filament we had been using. We chose flex filament because it behaves more like rubber and therefore applies an equal retraction force to each finger. After printing, however, we found the parts were not as flexible as we had anticipated. Nevertheless, it did give all the finger joints the same strength, so each string needed the same amount of power to pull, and we chose to keep using this filament. Finally, we added an arm to hold the servos that control the fingers.
Arduino
To control all the servos, I first used an Arduino Mega 2560, but that board would have taken up a significant amount of space inside the arm. Therefore, we used an Arduino Nano, which is about six times smaller. The program for this prototype was simple: I programmed a loop that cycles through three movements, a grip, a pinch, and a semi-pinch.
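The three-movement loop above can be sketched in Python as a table of per-finger servo targets cycled in order. The angle values and finger names here are illustrative assumptions, not the actual firmware values:

```python
# Sketch of the Prototype 2 movement loop. The servo angles (in degrees)
# and the five-finger layout are assumptions made for illustration.
import itertools

MOVEMENTS = {
    "grip":       {"thumb": 120, "index": 150, "middle": 150, "ring": 150, "pinky": 150},
    "pinch":      {"thumb": 100, "index": 110, "middle": 0,   "ring": 0,   "pinky": 0},
    "semi-pinch": {"thumb": 90,  "index": 100, "middle": 100, "ring": 0,   "pinky": 0},
}

def movement_cycle():
    """Yield the repeating sequence grip -> pinch -> semi-pinch forever."""
    return itertools.cycle(["grip", "pinch", "semi-pinch"])

def servo_targets(movement):
    """Return the per-servo target angles for one named movement."""
    return MOVEMENTS[movement]
```

On the real Arduino, each dictionary entry would correspond to a `servo.write(angle)` call inside `loop()`.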
PROTOTYPE 3
In this prototype, we changed the structure of the prosthesis to a bone structure, because we found that a human-hand-shaped model and a bone structure make no difference in movement; to save filament and weight, the bone structure was the better choice.
With the bone structure, the hand became much lighter than the previous two prototypes.
We also added a servo at the thumb joint so the thumb can rotate, making it easier to pinch with the index finger.
We changed the string to clear nylon string, which is thinner and lighter. The nylon string became practical because of the new joints, which are connected with thin metal rods.
To retract the fingers, instead of rubber or flex filament, we used a second nylon string to pull the fingers back straight.
Because the hand is built on a bone structure, the fingers sometimes bent over to the back side, since nothing limited the return string. To prevent this, we created a back cover for Prototype 3 with four stoppers that keep the fingers from bending the wrong way.
However, as we used this prototype, the strings began to loosen. We also noticed that heat and humidity made them loosen faster.
Arduino
To control each finger, I added a myoelectric sensor that detects muscle activity through three sensor pads. This allowed the user's muscles to control the servos.
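A minimal sketch of how such a sensor reading might gate the servos: smooth the last few analog samples and compare against a threshold, so a single noisy spike doesn't fire the hand. The threshold, window size, and 10-bit ADC range are assumptions, not values from the article:

```python
# Sketch of myoelectric threshold logic. THRESHOLD and the 0-1023 ADC
# range are assumptions; the real calibration depends on the sensor pads.
THRESHOLD = 400  # assumed ADC level separating "muscle flexed" from rest

def muscle_active(samples, threshold=THRESHOLD, window=5):
    """Average the last `window` readings and compare to the threshold,
    so one noisy spike does not trigger the servos."""
    recent = samples[-window:]
    return sum(recent) / len(recent) > threshold
```

A sustained contraction raises the average past the threshold, while a brief spike is averaged away.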
PROTOTYPE 4
The main difference from the first three prototypes is that we switched to a rod-and-gear mechanism to control the fingers. To solve the problem of strings loosening, the gearing moves accurately and lets the servo deliver its full output, because unlike a string, a metal rod can push the finger at maximum power.
To allow this prototype to perform a snap movement, we added a wrist using a larger servo and a connector at the bottom of the palm.
Since the fingertips are made of plastic, we added a rubber coating to them to increase friction, so pinched objects are less likely to slip.
We had planned to use 3D-printed rods to connect each servo to its finger, but the rods often cracked in half. The solution was a rod made from aluminum, with a bearing at each end that also created smoother movement. We also found that raising the infill rate to 100% makes a printed rod hard enough that it takes considerable force to fracture. So whether the user chooses the aluminum rod or the 3D-printed rod, the experience of using the hand is much the same.
Arduino & Raspberry Pi
v1
Starting from the Prototype 3 code, we made a few adjustments to the servo angles so the fingers bend at the right places, and added a sequence of grip, pinch, and semi-pinch (with the middle finger) that runs in a loop.
v2
We added a program so that when a specific word is sent and the myoelectric sensor detects muscle activity, the prosthesis performs the selected movement.
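The v2 trigger condition above, both a valid command word and a muscle signal, can be sketched as a small gate function. The exact command words are assumptions; only the three movement names come from the article:

```python
# Sketch of the v2 trigger: a movement runs only when BOTH the command
# word has arrived AND the muscle sensor fires. Command vocabulary is assumed.
VALID_MOVEMENTS = {"grip", "pinch", "semi-pinch"}

def select_movement(command_word, muscle_detected):
    """Return the movement to perform, or None if either condition fails."""
    if command_word in VALID_MOVEMENTS and muscle_detected:
        return command_word
    return None
```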
v3
In v3, we created an object detection script in Python using Google's TensorFlow model Inception v3. Detection runs on the video feed from a USB camera. The script then selects the best way to grab the detected object and sends that data to the Arduino.
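The grip-selection step after detection can be sketched as a mapping from a recognized class label to a one-byte serial command for the Arduino. The label-to-grip table and command bytes are illustrative assumptions, and the TensorFlow inference itself is omitted here (represented by the `label` argument):

```python
# Sketch of grip selection: detected label -> grip -> serial command byte.
# The table entries and command bytes are assumptions for illustration.
GRIP_FOR_LABEL = {
    "water bottle": "grip",
    "cup":          "grip",
    "pen":          "pinch",
    "coin":         "pinch",
    "smartphone":   "semi-pinch",
}
COMMAND_BYTE = {"grip": b"G", "pinch": b"P", "semi-pinch": b"S"}

def grip_command(label, default="grip"):
    """Choose a grip for the detected label; return the command byte to send."""
    grip = GRIP_FOR_LABEL.get(label, default)
    return COMMAND_BYTE[grip]
```

In the real system, the returned byte would be written to the Arduino over a serial connection.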
AI recognition test for prosthetic hand
PROTOTYPE 5
v1
This prototype was made to fix a problem in Prototype 4: when the fingers retracted, the rod connecting the finger parts came forward and blocked the way. This prototype's connector bends the finger parts backward so the rods stay clear, roughly doubling the space available to grab objects. Also, to make the fingers work like a human hand, we changed the length of each fingertip so the hand looks natural.
v2
We moved the thumb below the index finger so it retracts directly toward the index finger, which also increased its power. We also added an extra servo at the base of the thumb so it can rotate and pinch against any finger. However, rotating the entire thumb created a problem: with the servo holder on the palm, the thumb could no longer retract once it had rotated. To solve this, we created a mount on the thumb itself to hold the servo, which improved the stability of the thumb's retraction.
v3
The myoelectric sensor had trouble picking up muscle signals reliably, which was a major issue because it often produced faulty triggers. We replaced it with an infrared distance sensor, which is far more accurate.
Raspberry Pi
v1
To let the user control the hand, we built an HTML-based control panel with three buttons that drive the servos.
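A minimal sketch of such a panel, assuming a small stdlib HTTP server on the Raspberry Pi. The paths, port, and page markup are assumptions; the real panel's markup and servo interface are not shown in the article:

```python
# Sketch of an HTML control panel using only the Python standard library.
# Routes, port, and markup are assumptions made for illustration.
from http.server import BaseHTTPRequestHandler, HTTPServer

ROUTES = {"/grip": "grip", "/pinch": "pinch", "/semi-pinch": "semi-pinch"}

PAGE = (b"<html><body>"
        b'<a href="/grip">Grip</a> <a href="/pinch">Pinch</a> '
        b'<a href="/semi-pinch">Semi-pinch</a>'
        b"</body></html>")

def command_for_path(path):
    """Translate a requested URL path into a servo command (or None)."""
    return ROUTES.get(path)

class PanelHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        cmd = command_for_path(self.path)
        if cmd:
            pass  # here the real panel would forward `cmd` to the Arduino
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(PAGE)

# To run on the Pi (blocking):
# HTTPServer(("0.0.0.0", 8080), PanelHandler).serve_forever()
```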
We also added a program to reprogram the movements, so the hand remembers a grip movement exactly as the user has programmed it. This program is GUI based, with three buttons for the three main movements (grab, pinch, and semi-pinch).
v2
The problem with the previous setup was that it ran as a separate program; I wrote a script to run both programs at once, but that pushed CPU usage much higher. To solve this, we integrated a GUI program that lets the user overwrite the output for each object, which brought CPU usage back down.
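The per-object overwrite can be sketched as an override table consulted before the detector's default choice. The data structures and the fallback default are assumptions for illustration:

```python
# Sketch of the v2 override table: user-set movements replace the
# detector's defaults for specific objects. Structure is assumed.
class GripOverrides:
    def __init__(self, defaults):
        self.defaults = dict(defaults)   # detector's built-in choices
        self.overrides = {}              # user-set replacements

    def set_override(self, obj, movement):
        """Record the user's preferred movement for one object."""
        self.overrides[obj] = movement

    def movement_for(self, obj):
        """User override wins; otherwise fall back to the default mapping."""
        return self.overrides.get(obj, self.defaults.get(obj, "grip"))
```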
PROTOTYPE 6
We changed the grabbing strategy by switching the finger retraction to a four-bar linkage mechanism, which made grabbing an object a whole lot easier: the fingers now retract to just the right position, the way human fingers do. We also changed the thumb mechanism to make it more like a human hand. The thumb has one fewer joint than the other fingers, and we observed that in daily life the thumb is mostly used in combination with other fingers to grab an object. To resemble the human hand, we reused Prototype 5's thumb.
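The four-bar linkage couples the servo's input angle to the fingertip's output angle through fixed link lengths. A sketch of the kinematics, using the law of cosines; the link lengths here are illustrative, since the actual finger dimensions are not given in the article:

```python
# Sketch of four-bar linkage kinematics (open-circuit solution):
# given the four link lengths and the input crank angle, compute the
# output rocker angle. Link lengths are illustrative assumptions.
import math

def output_angle(ground, crank, coupler, rocker, theta2_deg):
    """Return the rocker angle (degrees from the ground link) for a
    crank angle theta2 measured in degrees from the same ground link."""
    t2 = math.radians(theta2_deg)
    # Diagonal from the rocker pivot to the crank tip (law of cosines).
    d = math.sqrt(crank**2 + ground**2 - 2 * crank * ground * math.cos(t2))
    # Angle between the diagonal and the ground link.
    psi = math.asin(crank * math.sin(t2) / d)
    # Angle between the diagonal and the rocker (law of cosines again).
    beta = math.acos((rocker**2 + d**2 - coupler**2) / (2 * rocker * d))
    return math.degrees(math.pi - (psi + beta))
```

A quick sanity check: a parallelogram linkage (crank and rocker equal, coupler equal to ground) keeps the output parallel to the input, so the output angle equals the input angle.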
Underactuated mechanism
To make the fingers easy to replace, we restructured each finger to work on its own by placing a servo at the base of each finger and applying the gear mechanism, so the fingers retract efficiently.
To make the fingers fit, we redesigned the palm. Since each finger now carries its own servo at its base, the servos no longer need to sit in the palm, so the Arduino can be placed inside it instead; and because the parts are modular, a broken finger can be replaced easily.
Three problems remain: the hand is not waterproof, the components overheat, and the prosthesis is heavy. For waterproofing, I plan to place all the electrical components inside the arm and seal it, so the arm cover protects them from splashes. However, if a sealed component breaks, the user has to break the seal to reach it, and resealing is difficult unless the electronic components themselves can be made water resistant. Overheating is another issue; the main causes are the high-power components such as the servos, the power supply, and the Raspberry Pi. To cool them, we have a few options: fan cooling, a heatsink, or water cooling. All of these problems need to be solved or tested before this low-cost prosthesis can be considered suitable for practical use.
Movement test for prosthetic hand
Raspberry Pi
v1
We redesigned the GUI, adding a button that turns on the object detection software so the user doesn't need to type commands to start it.
We also changed how the grab movements are reprogrammed. In Prototype 5's method, the user had to press a button at the moment the object detection software recognized the object, but we concluded this was too hard and that users would certainly struggle. To make overwriting user friendly, we decided to use the object detection software only for the overwriting step. Building this also required another GUI, so the user can choose which movement to overwrite to.
v1.5
As mentioned for the previous version of the Raspberry Pi software, I added another window that lets the user control the hand manually by pressing on-screen buttons.
PROTOTYPE 7
We built another model of Prototype 6, but this time we mostly changed the system. We printed the parts in carbon fiber to make them stiffer and lighter, which made the hand 20% lighter than Prototype 6.
Arduino & Raspberry Pi
I changed the Arduino program to make it compatible with Prototype E and with AI mode. This board has no buttons for switching modes; instead, a second board with an angle sensor is connected over 2.4 GHz transceivers, letting the user switch modes by swinging the other arm, which wears the sensor board. The controller also carries the muscle sensor, so only one cable connects to the prosthesis.
PROTOTYPE 7 - electric prosthetic hand with AI
PROTOTYPE E
Prototype E is designed for users who consider the object detection software unnecessary or cannot afford it. It requires only the 3D-printed hand parts, an Arduino, and a few electrical components. Prototype E performs the basic movements that can support users in daily tasks.
Excluding the 3D-printed parts, it costs approximately US$50. If the user later wants to upgrade to the version with object detection, they can simply print the adapter and buy a few more components.
PROTOTYPE E - electric prosthetic hand -
Arduino
v1
This program is made for users who don't need object detection. I built a circuit with three dedicated buttons for grip, pinch, and semi-pinch, plus three LEDs of different colors to show in real time which movement is running and to indicate errors.
v1.5
To shrink the breadboard circuit, we split it into two boards: one for movement control and one for the power supply. With a separate control board, the cable can be extended and the board attached anywhere on the arm, freeing up space inside it. We also changed the power input to micro USB with a lithium battery so the hand can run anywhere, and printed a case to mount the modules inside the arm.
Wireless movement controller
v2
I redesigned the circuit board for Prototype E. To make the circuit as small as possible, I used an Arduino Pro Mini instead of an Arduino Nano, since it is even smaller. Adding resistors to the switch buttons improved the accuracy of the mode-change button and eliminated false signals to the Arduino. Since we changed the battery voltage, it can no longer be plugged in the way v1.5 worked, so we added a voltage regulator to step the battery's 12 V down to 5 V.
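Even with pull resistors in hardware, a small software debounce helps reject the last of the false button signals. This is a complementary sketch, not the article's actual firmware; the stable-sample count is an assumption:

```python
# Sketch of software button debouncing: register a press only after the
# raw reading has been stable for several consecutive samples. The
# sample count (3) is an assumption for illustration.
class DebouncedButton:
    def __init__(self, stable_reads=3):
        self.stable_reads = stable_reads
        self.history = []
        self.state = False

    def update(self, raw_pressed):
        """Feed one raw sample; return True only on a clean press edge."""
        self.history = (self.history + [raw_pressed])[-self.stable_reads:]
        stable = len(self.history) == self.stable_reads and all(self.history)
        edge = stable and not self.state
        if stable:
            self.state = True
        elif not any(self.history):
            self.state = False
        return edge
```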
v3
I did some cable management and found a way to make the sensor cable sticking out of the hand disappear, by using a radio transmitter. The transmitter has a connector for the sensor plus a gyro sensor that detects when the wearer swings the arm; each swing changes the gripping mode.
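The swing-to-switch logic can be sketched as a threshold on the gyro's angular rate that toggles between the two modes mentioned earlier (Prototype E's manual mode and AI mode). The threshold value and units are assumptions:

```python
# Sketch of swing detection on the wireless controller: an angular rate
# above the threshold toggles the mode. Threshold and deg/s units are
# assumptions; the two mode names follow the article's manual/AI modes.
SWING_THRESHOLD = 250.0  # assumed angular rate (deg/s) that counts as a swing

MODES = ["manual", "ai"]

def next_mode(current_mode, gyro_rate, threshold=SWING_THRESHOLD):
    """Toggle between modes when a swing is detected; otherwise keep mode."""
    if abs(gyro_rate) >= threshold:
        i = MODES.index(current_mode)
        return MODES[(i + 1) % len(MODES)]
    return current_mode
```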
Wireless movement controller V2 & V3
Results
As we made the prototypes, we discovered 2 main limitations that need to be addressed in the future.
PROTOTYPE 9 - electric prosthetic hand with AI
Power problem
During testing, we found that the battery's voltage sagged so low that it couldn't supply what the device needed. We measured the voltage required to run the device: around 10 V at maximum load. This was easily solved by adding another 18650 lithium-ion cell in parallel with the battery, and the pack produces 12.6 V.
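The arithmetic behind the fix: cells in series add their voltages, while cells in parallel add their current capacity at the same voltage, which is what reduces voltage sag under load. The cell figures below (4.2 V full charge, 2500 mAh) are typical 18650 values assumed for illustration:

```python
# Sketch of battery-pack arithmetic. Series cells add voltage; parallel
# cells add capacity at the same voltage. Per-cell numbers are assumed
# typical 18650 values, not measured from the actual pack.
CELL_VOLTAGE_FULL = 4.2   # volts, one fully charged Li-ion cell
CELL_CAPACITY_MAH = 2500  # assumed capacity of one 18650 cell

def pack(series, parallel):
    """Return (voltage, capacity_mAh) for a series x parallel pack."""
    return (series * CELL_VOLTAGE_FULL, parallel * CELL_CAPACITY_MAH)
```

A 3-series pack gives 12.6 V fully charged, comfortably above the 10 V the hand needs at full load, and adding a parallel cell doubles the capacity so the voltage sags less under load.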
Object detection
Object detection predicts the most efficient way to grab a recognized object, and about 90% of the time it recognizes the object correctly. We don't think this is high enough for users to rely on without problems, so in the future we would like to collect the images captured by the prosthesis, analyze the data, and build a dedicated dataset for this prosthesis using a file server and deep learning.
AI on a low cost prosthetic hand
Discussion
The target market for this product is people living in low-GDP areas, particularly those who have been in accidents and those with physical disabilities. By using this prosthesis, users could contribute to raising their country's GDP, rather than working without a hand or spending a lifetime saving for an expensive practical prosthesis. Affordability is precisely why access to this product matters most in lower-GDP regions.
In future research, we would like to test this prosthesis's abilities and how it can help users with tasks such as driving, cooking, and exercise. We would also like to test its durability and battery life. Furthermore, we aim to reduce the cost further to make it more affordable, and to find volunteers who can use the prosthesis and give us feedback to improve its quality.
We would like prosthesis users to be able to reprint any parts that break. However, this creates a challenge. 3D printers are expensive and therefore, difficult to afford. A solution to this would be to provide all users with extra parts (particularly for fragile parts) when purchased.
Additionally, to fully automate the prosthesis, I'm planning to add a deep learning engine instead of a fixed pre-trained object detection program. When the prosthesis encounters a new object, the user can train the AI on it. This will help users detect a wider variety of objects, making the prosthesis more efficient to use. By building a data server and sending newly learned object data to the cloud, we could push updated data from the server back to the hand. The deep learning engine on the server can analyze the data recorded from each prosthesis and issue firmware updates to all the hands, so the program improves constantly and, eventually, a wide range of objects will already be covered by the server. Furthermore, we would like to make the object detection software and hardware universal, so it can be fitted to other prostheses and used with whichever model a user prefers.
We were quite impressed by the object recognition speed: we expected detecting one frame to take a few seconds, but it ran at around 30 frames per second. We were also satisfied with the version of the prosthesis that works with only an Arduino. The 3D-printed parts were not very stiff at first, but we found that increasing the infill rate lets printed parts absorb much more force. Therefore, from Prototype 4 on, we printed all parts at a 100% infill rate, and they were much more durable than the earlier prints; we even dropped the prosthesis a few times from around waist height without any damage. Finally, some parts of the prosthesis cannot be 3D printed, so we bought them (electronic components, bolts, and nuts) from Amazon. Some took quite a while to arrive, so we are considering making these components ourselves in order to build and deliver the prosthesis to users more quickly.
Conclusion & Resources
Conclusion
Overall, I created an alternative prosthesis that is cheaper to build, durable, and able to recognize an object and select the most efficient way to grab it. In further research, I plan to collect the images generated by the hand, analyze them, and build a dataset to improve recognition accuracy. In the future, we would like to test the hand's capabilities by asking volunteers to try the product.
Overall, we hope this prosthesis can improve users' daily lives, helping them get back the life they used to have or experience a new kind of life.
Resources
Arpon, Yasmin Lee. "Thailand: Deadliest Country for Motorcyclists." The Straits Times, 20 May 2017, www.straitstimes.com/asia/se-asia/thailand-deadliest-country-for-motorcyclists.
Kashimoto, Shuu. "Payment and Tasks of Myoelectric Wristbands in the Independence Support Law for Persons with Disabilities." 22 Apr. 2013, pp. 305-308, www.jsomt.jp/journal/pdf/061050305.pdf.
About me
My name is Hikaru Shimada. I was born on May 10th, 2003 and am currently 15 years old. I am a freshman at St. Mary's International School. I live in Tokyo, a city full of neon, and I'm grateful to have been born here because of Akihabara, a district full of electronic components.
My favorite sports are wrestling and golf. I became interested in electronic components in the first grade of elementary school, when my parents gave me a "Mini-Yonku," a small car with a motor and AAA batteries. It has no steering, and the most fun part was building it: the parts come attached to a plastic frame, and you carefully cut them out and connect everything to build the car. The car can also be customized, for example with washers that stabilize it, additional weight, or a motor with more torque.
The reason I started to program was that my dad made me learn Java. It wasn't fun at all, but then I found a YouTube video on programming with TensorFlow, which interested me so much that I learned Python and TensorFlow.
Low-Cost Electronic Prosthesis with Object Detection
The research was conducted during my 8th-grade summer, and the article was written during my high school freshman year (2019). It was published in the Young Scientist Journal, Vol. 9, May 2019, pp. 60-62, hosted by Vanderbilt University.
Please feel free to contact me with any questions at shimadahikaru1@gmail.com.