Prof. Carol Neidle received a grant through the NSF Convergence Accelerator program: https://beta.nsf.gov/funding/initiatives/convergence-accelerator (2022 Cohort, Phase I, Track H: Enhancing Opportunities for Persons with Disabilities). This is a collaboration with Principal Investigators Dimitris Metaxas of Rutgers University and Matt Huenerfauth of Rochester Institute of Technology. As part of NSF’s commitment to accelerating use-inspired solutions for persons with disabilities, these researchers were awarded $750,000 for development of “AI-based Tools to Enhance Access and Opportunities for the Deaf.” The research team is composed of linguists, computer scientists, deaf and hearing experts on American Sign Language (ASL), and industry partners.
ASL is the primary means of communication for over 500,000 people in the United States. Because ASL has no standard written form, its users lack parity with hearing users in the digital arena. The project aims to develop sustainable, robust AI methods to overcome obstacles to digital communication and information access faced by Deaf and Hard-of-Hearing (DHH) individuals, empowering them personally and professionally.
The project builds on prior NSF-funded AI research on linguistically-informed computer-based analysis and recognition of ASL from videos. Specific goals include:
- Privacy protection for ASL video communication. ASL signers cannot communicate anonymously about sensitive topics through videos in their native language; the Deaf community perceives this as a serious problem. The tools to be developed will enable signers to anonymize ASL videos while preserving the essential linguistic information conveyed by the hands, arms, facial expressions, and head movements.
- Video search-by-example for access to multimedia digital resources. There is currently no good way to look up a sign in a multimedia dictionary. Many ASL dictionaries enable sign look-up via English translations, but what if the user does not understand the sign, or does not know its English translation? Other dictionaries allow search based on properties of ASL signs (e.g., handshape, location, movement type), but this is cumbersome: a user must often look through hundreds of pictures of signs to find a target sign (if it is present at all in that dictionary). The tools to be developed will enable searching for a sign based on ASL input from a webcam or a video clip.
The proposed application development brings together state-of-the-art research on video anonymization, computer-based sign recognition from video, and Human-Computer Interaction, including studies with DHH users to assess the desiderata for the applications' user interfaces.
Carol Neidle is a Professor of general linguistics and French linguistics at Boston University. She received her BA from Yale College, her MA from Middlebury College, and her PhD from the Massachusetts Institute of Technology. Professor Neidle is the Director of the American Sign Language Linguistic Research Project (ASLLRP). Her research interests include syntactic theory and the linguistic structure of American Sign Language.
The NSF Convergence Accelerator issued its first set of awards in 2019. The program accelerates use-inspired, convergence research in areas of national importance. It seeks to create partnerships, bringing together people from across disciplines and industry to address societal challenges and deliver real solutions to those problems.