Engineers from Michigan State University have devised a technology that aims to make sign language translation ubiquitous. Unlike previous translation systems, the new device is non-invasive and as portable as a tube of Chapstick.
The technology, dubbed DeepASL, uses a deep learning algorithm – machine learning inspired by the structure and function of the brain – to automatically translate signs into English.
Many deaf and hard-of-hearing people rely on American Sign Language (ASL) to communicate. Without an interpreter, they lack equal employment opportunities and are often left at a disadvantage in sensitive or high-stakes situations.
Mi Zhang, assistant professor of electrical and computer engineering, said, “Think about if you were in the hospital and needed to communicate with a doctor. You would have to wait for the hospital’s translator – if they have one – to arrive, connect with a toll-free service or rely on a family member to be present. This compromises your privacy and could worsen a health emergency. This is just one example demonstrating the critical need for sign language translation technology.”
The technology works through a three-inch tangible device, made by Leap Motion, that is outfitted with cameras to continuously capture the movements of hands and fingers.
Doctoral student Biyi Fang said, “Leap Motion converts the motions of one’s hands and fingers into skeleton-like joints. Our deep learning algorithm picks up data from the skeleton-like joints and matches it to signs of ASL.”
“Similar to setting up Siri on a new iPhone, users sign certain words to familiarize their hands and joints to the technology and sensors. They also can create custom signs for their names or non-dictionary words by spelling them out, and have more ease and comfort communicating.”
“One differentiating feature of DeepASL is that it can translate full sentences without needing users to pause after each sign. Other translators are word-for-word, requiring users to pause between signs. This limitation significantly slows down face-to-face conversations, making conversations difficult and awkward. Our technology is also non-intrusive, unlike other interpreter technologies that require signers to wear gloves, making them feel marginalized because you can literally see their disability.”
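The enrollment-and-matching workflow described above can be illustrated with a toy sketch. This is not DeepASL's actual model (which is a deep neural network); it is a minimal, hypothetical example assuming each sign arrives as a sequence of skeleton-joint coordinates, with enrolled signs matched by nearest-neighbor distance:

```python
import numpy as np

def flatten_sequence(frames):
    """Flatten a list of per-frame joint coordinates into one feature vector."""
    return np.concatenate([np.asarray(f, dtype=float).ravel() for f in frames])

def enroll(templates, label, frames):
    """Store a user-provided example of a sign (the 'Siri-like' setup step)."""
    templates[label] = flatten_sequence(frames)

def classify(templates, frames):
    """Return the enrolled sign whose template is nearest in Euclidean distance."""
    query = flatten_sequence(frames)
    return min(templates, key=lambda label: np.linalg.norm(templates[label] - query))

# Toy data: each sign is 2 frames of 3 joints with 2-D coordinates.
templates = {}
enroll(templates, "hello", [[(0, 0), (1, 0), (2, 0)], [(0, 1), (1, 1), (2, 1)]])
enroll(templates, "thanks", [[(5, 5), (6, 5), (7, 5)], [(5, 6), (6, 6), (7, 6)]])

# An observed sequence close to the "hello" template is matched to it.
observed = [[(0.1, 0.0), (1.0, 0.1), (2.0, 0.0)], [(0.0, 1.1), (1.1, 1.0), (2.0, 1.0)]]
print(classify(templates, observed))  # → hello
```

Sentence-level translation, the feature the researchers highlight, requires a sequence model that segments signs without explicit pauses, which this per-sign matcher does not attempt.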
Beyond helping the hard-of-hearing communicate with others, DeepASL can help people learning ASL remotely by giving real-time feedback on their signing. Prior technology, Zhang explained, relied on video tutorials that offered limited personal assistance.
“Our technology can gauge their signing to help them learn and improve.”
The researchers now plan to commercialize the technology, making it available to the hundreds of thousands of people who need a more accessible interpreter.