Huh? If a deaf person needs to look at a screen to see the signs, then they could just as easily look at a screen to see live speech-to-text of what's being said.
Not everyone, at every age and viewing distance, can read on-screen text in a given language quickly and accurately enough.
Maybe it’s for hearing people who don’t know sign language?
Why not just do speech-to-text and display the text???
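To be fair, a bare-bones version of that is easy to wire up. A rough sketch below, assuming the Python SpeechRecognition package, a microphone via PyAudio, and Google's free web recognizer; purely an illustration, not what the app in the article does:

```python
# Rough sketch of "just display live speech-to-text".
# Assumes: pip install SpeechRecognition pyaudio  (my assumption, not from the article)
import speech_recognition as sr

recognizer = sr.Recognizer()
with sr.Microphone() as source:                      # needs PyAudio for mic access
    recognizer.adjust_for_ambient_noise(source)      # one-time noise calibration
    while True:
        audio = recognizer.listen(source, phrase_time_limit=5)  # capture ~5 s chunks
        try:
            print(recognizer.recognize_google(audio))           # show the transcript
        except sr.UnknownValueError:
            pass  # nothing intelligible in this chunk; keep listening
```

(Real captioning would want a streaming recognizer and an actual on-screen display, but the plumbing really is about that simple.)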
Sign language can be faster to interpret, just as listening is faster than reading.
It's the difference between a claw and an articulated prosthetic hand.
The claw is better than nothing
The moving hand is much nicer, while still not being perfect.
Not the best metaphor, but hopefully gets my point across?
I'm very slowly learning ASL, and signing is different from spoken English. Not everything gets signed, and it's often much quicker to sign back and forth than to speak if two hearing people are fluent.
Mentioned in the article: this supports many African languages, including African sign languages. (There were plenty of apps that already did this for, say, ASL.)
Relatedly: it's possible to be deaf and illiterate. Similar to being able to speak but not read, these folks are much better served by sign language translation than by text. (I can't find data on rates quickly, but see here for discussion of the general phenomenon. It's harder to teach the deaf to read, so it's easier for them to fall through the cracks.)