
Sign language gets a technical boost

Last Updated 23 April 2014, 15:51 IST

There are scribblings in blue ink on Santoshi’s palm. A closer look reveals them to be some kind of mathematical calculation.

Ask her what it is about and she gleefully replies, “These are some algorithm-related calculations for my project that would take sign language for the deaf and mute to a technical level, helping them to communicate easily.” 

Santoshi, a second-year student of the BTech in Innovation with Mathematics and Information Technology at the Cluster Innovation Centre (CIC), Delhi University, along with her classmate Vikas, has created a sign language-recognition system in which the camera reads the signs and forms words as well as complete sentences.

“The computer recognises the gestures and converts them into alphabets. Since each gesture corresponds to a single alphabet as per the American Sign Language (ASL), it easily forms a word on the computer screen,” says Santoshi.

Interestingly, you just need to show the sign in front of the camera and the alphabet appears on the computer screen within a second.
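In outline, the pipeline described is: capture a frame from the camera, classify the hand gesture as a letter, and append that letter to the word being built. The sketch below shows that loop in a minimal form; OpenCV and the placeholder classify_gesture() function are assumptions for illustration, since the article does not name the tools or models the students actually used.

```python
# Minimal sketch of the kind of camera-to-text loop described in the article:
# grab webcam frames, classify each gesture as a letter, build up a word.
# OpenCV and classify_gesture() are assumptions; the article does not name
# the libraries or models the students used.
import cv2


def classify_gesture(frame):
    """Hypothetical placeholder: return a single letter ('A'-'Z') or None."""
    return None  # a real system would run a trained recogniser here


def run():
    cap = cv2.VideoCapture(0)          # open the default camera
    word = ""                          # letters recognised so far
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        letter = classify_gesture(frame)
        if letter:
            word += letter             # build the word letter by letter
        cv2.putText(frame, word, (10, 30), cv2.FONT_HERSHEY_SIMPLEX,
                    1.0, (0, 255, 0), 2)
        cv2.imshow("sign-to-text", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()


if __name__ == "__main__":
    run()
```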

“Initially we developed a glove which had various sensors in it. It was called a ‘data glove’ and had five different colours at different angles of the hand, which the camera could easily sense,” says Alok Nikhil Jha, assistant professor (IT and Innovation) at CIC, who helped Santoshi and Vikas with the project.

“But this method was not user-friendly. So, it was decided to create a technology that could work without sensors, colours and gloves; something that could capture the movement of bare hands. For this, we needed a camera intelligent enough to decode the gestures easily,” says Alok.

So, they removed the gloves and called it 'Data Hands'.  
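The article does not say how the coloured patches on the ‘data glove’ were tracked; one common approach is colour thresholding, sketched below with OpenCV. The colour names and HSV ranges are illustrative assumptions, not the project’s actual calibration.

```python
# Sketch of colour-marker tracking as the 'data glove' stage is described:
# coloured markers on the glove are isolated by colour thresholding so the
# camera can follow their positions. Colour names and HSV ranges below are
# hypothetical, not the project's real values.
import cv2
import numpy as np

MARKER_RANGES = {
    "red":    ((0, 120, 70),   (10, 255, 255)),
    "green":  ((40, 70, 70),   (80, 255, 255)),
    "blue":   ((100, 150, 0),  (140, 255, 255)),
    "yellow": ((20, 100, 100), (35, 255, 255)),
    "orange": ((10, 100, 100), (20, 255, 255)),
}


def find_markers(frame):
    """Return the pixel centre of each coloured marker found in the frame."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    centres = {}
    for name, (lo, hi) in MARKER_RANGES.items():
        mask = cv2.inRange(hsv, np.array(lo), np.array(hi))
        m = cv2.moments(mask)
        if m["m00"] > 0:  # marker visible in this frame
            centres[name] = (int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"]))
    return centres
```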

“Now, we are using the external boundaries of the gestures for recognition. The system is of no use if it is not able to make sentences out of the known gestures. So, there is a green rectangular box at the top of the screen, on which the alphabets recognised by the camera are jotted down.”
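Extracting the “external boundaries” of a gesture is typically done with contour detection. The sketch below shows one way to find the hand’s outer contour and draw a green strip with the recognised letters; OpenCV and the thresholding values are assumptions, and the step that maps a contour to a letter is not shown, as the article does not describe it.

```python
# Sketch of boundary-based recognition as the quote describes: find the
# hand's outer contour and show recognised letters in a green box on screen.
# Thresholding parameters are placeholder assumptions.
import cv2


def hand_contour(frame):
    """Return the largest external contour in the frame (taken to be the hand)."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    blur = cv2.GaussianBlur(gray, (7, 7), 0)
    _, mask = cv2.threshold(blur, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)  # OpenCV 4.x convention
    return max(contours, key=cv2.contourArea) if contours else None


def draw_output(frame, contour, text):
    """Draw the hand outline and a green bar showing the recognised letters."""
    if contour is not None:
        cv2.drawContours(frame, [contour], -1, (255, 0, 0), 2)
    cv2.rectangle(frame, (0, 0), (frame.shape[1], 40), (0, 255, 0), -1)
    cv2.putText(frame, text, (10, 28), cv2.FONT_HERSHEY_SIMPLEX,
                0.9, (0, 0, 0), 2)
    return frame
```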

Notably, it becomes a lengthy process, as a gesture for each alphabet has to be made to form a word. Therefore, the duo has been focusing on the Indian Sign Language.

“The process is obviously slow because it recognises only one alphabet at a time. What we are working on is making the system recognise a word as a whole. Therefore, we are focusing on the Indian Sign Language. If we are able to create a technology that can easily understand the Indian signs, it will be a boon for the disabled,” says Vikas.

Presently, both are working on improving the accuracy of the system. “For the system to directly read words, we need to create a huge database. Also, we want to make it work on mobile devices,” says Santoshi.

Meanwhile, the youngsters are focusing on promoting their project in different colleges where classes for the differently abled are conducted.

(Published 23 April 2014, 15:36 IST)
