By The Big Magazine Staff

Smart Glove Translates Sign Language Into Spoken Word in Real Time

Tuesday, June 30, 2020


A team of UCLA bioengineers has developed a smart glove that translates sign language into speech in real time. The technology could make it easier for people who are deaf or hard of hearing to communicate directly with non-signers.



Photo: UCLA's smart glove for sign language (Jun Chen)


The glove features sensors along all four fingers and the thumb that connect to a circuit board, according to UCLA's newsroom blog. These sensors, made from electrically conducting yarns, pick up the hand motions and finger placements that identify words, phrases, or letters in American Sign Language.


The device then turns the finger movements into electrical signals, which are sent to a coin-sized circuit board worn on the wrist. The board transmits those signals wirelessly to a smartphone, which translates them into spoken words at a rate of about one word per second.
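The pipeline described above — finger sensors produce signal patterns, and software matches each pattern to a known sign — can be illustrated with a minimal sketch. This is not the UCLA team's actual software; the five-sensor readings and sign templates below are made-up values for illustration, and the matching is a simple nearest-template comparison.

```python
# Hedged sketch: mapping hypothetical stretch-sensor readings (one value per
# finger, thumb first) to a signed letter by nearest-template matching.
# Template values are illustrative, not real calibration data.
import math

TEMPLATES = {
    "A": [0.9, 0.1, 0.1, 0.1, 0.1],   # thumb extended, fingers curled
    "B": [0.1, 0.9, 0.9, 0.9, 0.9],   # four fingers extended, thumb tucked
    "L": [0.9, 0.9, 0.1, 0.1, 0.1],   # thumb and index extended
}

def classify(reading):
    """Return the label of the template closest to a five-sensor reading."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(TEMPLATES, key=lambda label: dist(TEMPLATES[label], reading))

print(classify([0.85, 0.2, 0.05, 0.1, 0.15]))  # a noisy "A"-like reading → A
```

A real system would also need to segment a continuous stream of readings into individual gestures before classifying them, which is part of why translation speed is the bottleneck the researchers mention.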


The components used to make the device are cheap, flexible, and highly durable, according to the US-based team of engineers, who plan to commercialize the technology if they can speed up the translation time even further. The device is composed of light, long-lasting, stretchable, and inexpensive polymers attached to a glove.


The researchers tested the device with four deaf users of American Sign Language. The wearers repeated each hand gesture 15 times. The system recognized 660 signs, including each letter of the alphabet and the numbers 0 through 9.
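The evaluation protocol above — every gesture repeated 15 times, with the system's output checked against the intended sign — suggests a simple per-sign accuracy tally. This is a hedged sketch of how such a tally might work, not the study's actual analysis; the example predictions are invented for illustration.

```python
# Hedged sketch: per-sign recognition accuracy over repeated attempts.
# Each wearer repeats a gesture 15 times; accuracy is the fraction of
# repeats where the system output matched the intended sign.

def accuracy(intended_sign, predictions):
    """Fraction of repeats where the system output the intended sign."""
    correct = sum(1 for p in predictions if p == intended_sign)
    return correct / len(predictions)

# Illustrative run: 14 of 15 repeats of "hello" recognized correctly.
preds = ["hello"] * 14 + ["help"]
print(f"{accuracy('hello', preds):.0%}")  # → 93%
```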




The researchers also placed adhesive sensors on the testers' faces, between their eyebrows and on one side of their mouths, to capture the facial expressions that are part of sign language.


“Our hope is that this opens up an easy way for people who use sign language to communicate directly with non-signers without needing someone else to translate for them,” said Jun Chen at the UCLA Samueli School of Engineering.
“In addition, we hope it can help more people learn sign language themselves.”

In addition to Chen, the study’s UCLA authors are co-lead author Zhihao Zhao, Kyle Chen, Songlin Zhang, Yihao Zhou and Weili Deng. All are members of Chen’s Wearable Bioelectronics Research Group at UCLA. The other corresponding author is Jin Yang, of China’s Chongqing University.


UCLA has filed for a patent on the technology, although a commercial model would require an expanded vocabulary and an even faster translation time.
