Bioengineers at UCLA have designed a glove-like device that can translate American Sign Language into English in real time via a smartphone application.
Helping deaf and hard-of-hearing people out of their isolation.
Daily life can be a challenge for deaf and hard-of-hearing people. Few places can be considered truly welcoming to them, and human interaction is often difficult because many hearing people do not know sign language. Fortunately, some researchers are committed to making the world more accessible.
“We hope this will allow people who use sign language to communicate directly with others without the need for someone else to translate,” said Jun Chen, assistant professor of bioengineering at UCLA’s Samueli School of Engineering and lead researcher. “In addition, we hope it can help more people learn sign language themselves.”
How does it work?
The system consists of a pair of gloves with thin, stretchy sensors that are the length of each of the five fingers. These sensors, made from electrically conductive wires, sense hand movements and finger placements representing letters, numbers, words and phrases.
The device then transforms the finger movements into electrical signals, which are sent to a coin-sized circuit board worn on the wrist. The board transmits these signals wirelessly to a smartphone, which translates them into speech at a rate of about one word per second.
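To give a rough sense of the pipeline described above, here is a minimal sketch of how per-finger sensor readings could be matched to words. The sensor values, templates, and vocabulary below are invented for illustration; the UCLA team's actual firmware and signal format are not public.

```python
# Illustrative only: one normalized bend value per finger
# (0.0 = straight, 1.0 = fully bent). Templates are made up.
TEMPLATES = {
    "hello": (0.1, 0.1, 0.1, 0.1, 0.1),
    "thanks": (0.2, 0.9, 0.9, 0.9, 0.9),
    "yes": (0.9, 0.9, 0.9, 0.9, 0.2),
}

def classify(reading):
    """Match a 5-sensor reading to the closest stored template (nearest neighbor)."""
    def distance(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(TEMPLATES, key=lambda word: distance(TEMPLATES[word], reading))

print(classify((0.15, 0.12, 0.08, 0.1, 0.11)))  # closest to "hello"
```

In the real device this matching would run on the smartphone after the wrist-worn board streams the signals over, with the winning word passed to a text-to-speech engine.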
A lightweight and inexpensive device
Researchers also added adhesive sensors on the testers’ faces, between the eyebrows and on one side of the mouth, to detect facial expressions that are part of American Sign Language.
Earlier wearable systems that offered translation from American Sign Language were limited by bulky, heavy designs or were uncomfortable to wear, Chen said. The device developed by the UCLA team is made from lightweight, inexpensive, yet durable and stretchable polymers. The electronic sensors are likewise flexible and cheap to produce.
Adjustments still to be made
To test the device, the researchers worked with four deaf people who use American Sign Language. The wearers repeated each hand gesture 15 times, and a customised machine-learning algorithm mapped these gestures to the letters, numbers and words they represented.
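The training step above, learning each sign from 15 repetitions, can be loosely illustrated with a nearest-centroid classifier. The UCLA team's actual algorithm is custom and not reproduced here; the two signs, their finger-bend patterns, and the noise model are all hypothetical.

```python
# Loose illustration: learn sign labels from repeated noisy examples
# by averaging the repetitions into one centroid per sign.
import random

random.seed(0)

def make_repetitions(template, n=15, noise=0.05):
    """Simulate n noisy repetitions of one gesture (one value per finger sensor)."""
    return [[v + random.uniform(-noise, noise) for v in template] for _ in range(n)]

# Hypothetical "true" finger-bend patterns for two signs.
signs = {"A": [0.9, 0.9, 0.9, 0.9, 0.1], "B": [0.1, 0.1, 0.1, 0.1, 0.9]}

# Training: average the 15 repetitions of each sign into a centroid.
centroids = {}
for label, template in signs.items():
    reps = make_repetitions(template)
    centroids[label] = [sum(col) / len(col) for col in zip(*reps)]

def predict(reading):
    """Assign a new reading to the sign with the nearest centroid."""
    return min(
        centroids,
        key=lambda l: sum((c - r) ** 2 for c, r in zip(centroids[l], reading)),
    )

print(predict([0.85, 0.92, 0.88, 0.9, 0.12]))  # → "A"
```

Repeating each gesture many times lets the averaging wash out sensor noise and natural variation in how a person signs, which is why the testers performed each gesture 15 times.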
The system recognised 660 signs, including every letter of the alphabet and the numbers from 0 to 9. UCLA has filed a patent on the technology. A commercial version would require a larger vocabulary and even faster translation, Chen said. The research is published in the journal Nature Electronics.