This incredibly light glove turns up the volume on sign language

After-work phone calls or Zoom hangouts with family and friends -- or even your therapist -- have become a crucial part of how we stay connected and sane during the Covid-19 pandemic. But for 30 million deaf or hard-of-hearing people in the U.S., these forms of communication can be much less accessible, especially because only 250,000 to 500,000 people in the U.S. and Canada know American Sign Language (ASL).

To help break down this barrier between signers and nonsigners, a team of researchers has designed an intelligent glove that can detect hundreds of different signs and translate them into speech in real time. Weighing in at a delicate 0.05 pounds, this lightweight device is worn around the wrist and fingers and can detect over 600 different types of signed gestures with up to 98.63 percent accuracy.

By translating these gestures into speech, the authors hope this device can break down language barriers between signers and nonsigners.

In a new study, published Monday in the journal Nature Electronics, a team of Chinese researchers describes why they believe previous ASL technology has missed the mark when it comes to effectively improving communication.

"Large-scale production and widespread use of these [sign language translation devices are] limited by a number of issues, including their structural complexity, the need for high-quality materials for fabrication, poor chemical stability, unsuitability for long-term wear, vulnerability to external environmental interference and cumbersomeness in practical use," write the authors.

For example, image-based processing devices often fail in low lighting, drastically lowering the accuracy of translations and the effectiveness of the devices in daily life.

Using conductive yarn, a team of researchers built an incredibly lightweight sign-to-speech translation device meant to make communication easier between signers and nonsigners.

Nature Electronics

Unlike these previous devices, the team's design uses conductive yarn and a wrist-mounted circuit board. The yarn is strung along each individual finger and generates electric signals via the triboelectric effect (essentially the static electricity created when two materials, like a fuzzy sock and a balloon, are pulled apart). When gestures are signed, these electric signals are transmitted to the circuit board and fed to a machine learning algorithm.
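
To make that pipeline concrete, here is a minimal sketch (in Python, and not the authors' code) of how voltage traces from the yarn sensors might be read, denoised, and handed to a classifier. The function names, sample counts, and moving-average filter are illustrative assumptions, not details from the paper.

```python
# A minimal sketch of the signal path the article describes: triboelectric
# voltages from the yarn sensors are read by the wrist-mounted board,
# cleaned up, and handed to a trained classifier.
# All names here (read_finger_voltages, gesture_classifier) are hypothetical.
import numpy as np

N_FINGERS = 5          # one conductive-yarn sensor per finger
SAMPLE_LEN = 200       # samples captured per gesture (assumed)

def read_finger_voltages() -> np.ndarray:
    """Stand-in for the board's analog readout: returns a
    (N_FINGERS, SAMPLE_LEN) array of triboelectric voltage samples."""
    return np.random.default_rng().normal(size=(N_FINGERS, SAMPLE_LEN))

def denoise(signal: np.ndarray, window: int = 5) -> np.ndarray:
    """Simple moving-average filter to suppress sensor noise."""
    kernel = np.ones(window) / window
    return np.apply_along_axis(
        lambda row: np.convolve(row, kernel, mode="same"), 1, signal)

trace = denoise(read_finger_voltages())
features = trace.flatten()   # one flattened feature vector per gesture
# A trained model (see the training sketch below) would then map this
# feature vector to one of the ~600 known signs:
# predicted_sign = gesture_classifier.predict([features])[0]
```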

This algorithm was trained on 440 ASL signs and is able to process the raw movement data to weed out extra information (like noise) and use the rules it learned from its training set to interpret novel incoming gestures. When given 220 new signals to process, the team found that the device was able to accurately translate the gestures into spoken or written English up to 98.63 percent of the time -- and in less than one second. These results were either displayed on a nearby screen as written translations or spoken by a speech synthesizer.
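
Here is a similarly hedged sketch of that train-then-test step: a classifier is fit on labelled gesture traces and then scored on gestures it has never seen. The synthetic arrays and the nearest-neighbour model are placeholders chosen only for illustration; the article does not say which model the authors actually used.

```python
# A minimal sketch (not the authors' pipeline) of training on labelled
# gesture traces and evaluating on held-out gestures.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Placeholder data: in the real system each row would be a denoised,
# flattened voltage trace and each label one of the known signs.
X_train = rng.normal(size=(440, 1000))
y_train = rng.integers(0, 10, size=440)
X_test = rng.normal(size=(220, 1000))
y_test = rng.integers(0, 10, size=220)

model = KNeighborsClassifier(n_neighbors=3)
model.fit(X_train, y_train)              # learn the gesture -> sign mapping

predictions = model.predict(X_test)      # classify novel incoming gestures
print(f"held-out accuracy: {accuracy_score(y_test, predictions):.1%}")
```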

The authors write that this real-time translation paired with the low cost of the device (costing a mere $50 to fabricate in the lab) is encouraging for the future commercialization and scaling of a device like this, including even embedding the technology into gloves and clothing.

"Compared with commercial systems of this kind (CyberGlove, $40,000; 5DT Glove, $1,990), our wearable sign-to-speech translation system is cost-effective, and this price could be further reduced through industrial mass production," write the authors.

This device still needs some improvement before it's ready to hit the market. A system this complex has multiple moving parts, and the team's next priority is building a display into the device that can speak or show the translation instead of relying on a disconnected screen. But the authors write that they're hopeful this technology will improve how we communicate with one another.

"Without prior knowledge of sign language, it is difficult for non-signers to receive and understand this conversational medium," write the authors. "We envision that the wearable sign-to-speech translation system could improve effective communications between signers and non-signers."

Abstract: Signed languages are not as pervasive a conversational medium as spoken languages due to the history of institutional suppression of the former and the linguistic hegemony of the latter. This has led to a communication barrier between signers and non-signers that could be mitigated by technology-mediated approaches. Here, we show that a wearable sign-to-speech translation system, assisted by machine learning, can accurately translate the hand gestures of American Sign Language into speech. The wearable sign-to-speech translation system is composed of yarn-based stretchable sensor arrays and a wireless printed circuit board, and offers a high sensitivity and fast response time, allowing real-time translation of signs into spoken words to be performed. By analysing 660 acquired sign language hand gesture recognition patterns, we demonstrate a recognition rate of up to 98.63% and a recognition time of less than 1 s.