02. 23. 2007
Sign language over cell phones
While the development of cell phone technology marches on, to the point where owning a cell phone has become practically mandatory for anyone living in a modern urban society, there's only so much a hearing-impaired person can do with one. Texting is all well and good, but it comes with the usual technological and sociological limitations of SMS.
Hence the Mobile ASL project at the University of Washington, which aims to develop a new real-time video compression scheme that can transmit within existing networks, while retaining sufficient video quality to enable viewers to follow sign language movements in a video call.
The problem with existing networks is that the bandwidth is typically too low to send video that accurately captures the various sign language gestures. Working within those constraints, Mobile ASL researchers Professors Eve Riskin, Sheila Hemami and Richard Ladner have instead devised new compression software. It uses skin detection algorithms to concentrate image quality on the specific areas of the video that carry the essential movements of ASL - typically the hands, face and arms.
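The project's actual encoder isn't public here, but the skin detection idea can be sketched in a few lines. As a rough illustration (not Mobile ASL's real algorithm), one common heuristic converts each pixel to the YCbCr color space and calls it "skin" when its chrominance falls inside a fixed box; a region-of-interest encoder could then spend more bits wherever the resulting mask is dense. The thresholds below are a widely cited rule of thumb, not values from the project:

```python
def rgb_to_ycbcr(r, g, b):
    """Convert full-range RGB to YCbCr per ITU-R BT.601."""
    y  =         0.299    * r + 0.587    * g + 0.114    * b
    cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5      * b
    cr = 128.0 + 0.5      * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

def is_skin(r, g, b):
    # Classic chrominance-box heuristic (illustrative thresholds):
    # skin pixels cluster in a small Cb/Cr region regardless of brightness.
    _, cb, cr = rgb_to_ycbcr(r, g, b)
    return 77 <= cb <= 127 and 133 <= cr <= 173

def skin_mask(frame):
    """frame: 2-D list of (r, g, b) tuples -> binary mask (1 = likely skin)."""
    return [[1 if is_skin(*px) else 0 for px in row] for row in frame]
```

A mask like this marks the hand and face regions frame by frame, so the encoder can hold quality there while compressing the background more aggressively - which is how a signing video stays legible at low bit rates.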
The BBC reports that the researchers are currently in talks with handset makers and operators to make the software available on cell phones. Hopefully it won't be long before this becomes a standard accessibility feature that turns "making a phone call" into everyday parlance for the deaf.