It is an American initiative, but people from all over the world can lend a hand. Literally: a new open-source library of hand photos forms the basis for AI to learn sign language.
The American Society for Deaf Children (ASDC), in partnership with Hello Monday, a division of Dept, is launching the GiveAHand.ai site. It claims to be the world’s largest open-source image bank of hands.
Anyone who wants to can take a picture of a hand with a single mouse click and submit it to the site. An AI model first analyzes the hand and then draws lines indicating how the fingers are positioned. People then check and tag that data to build better hand models. That human review step is where meaning is added to the images.
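The article does not say which model GiveAHand.ai uses for this step, but detecting hand keypoints and drawing the lines between them is a standard task for off-the-shelf libraries. As a rough illustration only, here is a minimal Python sketch using Google's MediaPipe Hands (the filenames are placeholders, not part of the project):

```python
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
mp_drawing = mp.solutions.drawing_utils

# Load a submitted hand photo (hypothetical filename).
image = cv2.imread("hand_photo.jpg")

with mp_hands.Hands(static_image_mode=True, max_num_hands=1) as hands:
    # MediaPipe expects RGB input; OpenCV loads images as BGR.
    results = hands.process(cv2.cvtColor(image, cv2.COLOR_BGR2RGB))

if results.multi_hand_landmarks:
    for hand_landmarks in results.multi_hand_landmarks:
        # Draw the 21 detected keypoints and the lines connecting them,
        # i.e. the "lines that indicate how the fingers run".
        mp_drawing.draw_landmarks(image, hand_landmarks, mp_hands.HAND_CONNECTIONS)

cv2.imwrite("hand_annotated.jpg", image)
```

In a crowdsourcing setup like the one described, the automatically detected keypoints would then be shown to human reviewers, who confirm or correct them before the image enters the dataset.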
The fully crowdsourced data forms a diverse dataset of hands with different shapes, positions, colors, backgrounds and gestures. Researchers can download these tagged images and use them to improve their machine learning models. This can make recognizing and translating the full spectrum of sign language much faster.
The goal is to create a system that translates sign language in real time.
[Video: Give a Hand – Hello Monday from DEPT® on Vimeo]
GiveAHand.ai is the second collaboration between Hello Monday and the ASDC. In 2021, they launched Fingerspelling.xyz: a “hand tracker” that uses machine learning to help people learn the sign language alphabet. To date, more than 5.1 million correct signs have been registered. The ASDC now uses Fingerspelling.xyz as part of its own training materials.