Local Miami Government Funds Test Of Real-Time, AI-based American Sign Language Messaging On Airport Digital Signage
August 19, 2024 by Dave Haynes
The Miami-Dade regional government is investing $100,000 into a UK company that will trial AI-driven text to sign language content on digital signage screens around Miami International Airport.
Signapse’s platform takes real-time data from airport information systems, converts it to text, and turns that text into overlays or picture-in-picture windows on screens, in which an interpreter relays the information in American Sign Language to hearing impaired passengers.
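The first step of that pipeline, going from a structured feed record to announcement text, can be sketched roughly like this. Everything here is illustrative: the `FlightEvent` record, the event names, and `format_announcement` are assumptions for the sketch, not Signapse's actual API.

```python
# Hypothetical sketch: turning a flight-status record into the plain-text
# message a sign-language rendering service would receive. All names here
# (FlightEvent, format_announcement) are illustrative, not Signapse's API.
from dataclasses import dataclass

@dataclass
class FlightEvent:
    flight: str
    event: str   # e.g. "GATE_CHANGE", "DELAY"
    detail: str  # e.g. new gate or revised departure time

def format_announcement(ev: FlightEvent) -> str:
    """Convert a structured feed record into announcement text."""
    templates = {
        "GATE_CHANGE": "Flight {flight}: gate changed to {detail}.",
        "DELAY": "Flight {flight}: now departing at {detail}.",
    }
    template = templates.get(ev.event, "Flight {flight}: {detail}.")
    return template.format(flight=ev.flight, detail=ev.detail)

# The resulting text would then go to the signing service, which
# returns an ASL video overlay for the signage player.
msg = format_announcement(FlightEvent("AA123", "GATE_CHANGE", "D14"))
print(msg)  # Flight AA123: gate changed to D14.
```

The text output is what the AI signing step would render as video; the signage CMS would then composite that video as an overlay or picture-in-picture window.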
The solution was one of three selected to trial as part of Miami-Dade Innovation Authority’s second Public Innovation Challenge, with testing to start this year.
The other two companies selected as winners were:
- RouteMe, which uses artificial intelligence to let passengers navigate the airport with their phone cameras;
- Mapsted, a Toronto-area company that provides airport wayfinding mapping with accessibility features and integration to smart technology devices.
The innovation authority, working in collaboration with the airport and the county’s innovation and economic development office, chose the companies out of a pool of 136 local and global technology firms.
I wrote back in April about how the UK’s Network Rail system teamed up with the Scottish branch of the British Deaf Association (BDA) to think through and develop screens now in place at the two busiest rail hubs in Scotland, Edinburgh Waverley and Glasgow Central. The screens were the first in the UK to have BSL on the main boards and sub boards as part of an ongoing investment to make Scotland’s Railway more accessible for deaf passengers.
People who are deaf can, of course, see and read departure boards, so the problems this helps solve are things like live audio announcements, such as platform or gate changes, that would otherwise be missed.
In that case, the displays are tapping into a library of messaging modules developed by Signapse. The company is using AI to generate the signing, stitching together captured visuals of interpreters to create smooth-looking videos of real people, not avatars, who relay information to hearing impaired passengers. Using a Large Language Model makes the solution scalable in a way that is not feasible, technically or financially, with staffed interpreters: staff could record many messages in a day, but not hundreds or thousands.
Cincinnati’s airport has already been using Signapse to provide live ASL updates on airport tram departures and safety information, and it is also in use at the airport in Grand Rapids, Michigan.