Boston University
American Sign Language Linguistic Research Project



Related projects at Boston University

American Sign Language Linguistic Research Project

Investigation of the syntactic structure of ASL, with particular emphasis on the hierarchical representation of functional categories. Recent work has explored clausal structure: both the manual and non-manual expressions found with question phrases, tense, aspect, negation, and agreement.

Image and Video Computing Group

Research in the computer vision fields of person, head and skin tracking, and gesture recognition.

National Center for Sign Language and Gesture Resources

The goal of this project is to make available several different types of experimental resources and analyzed data to facilitate linguistic and computational research on signed languages and the gestural components of spoken languages. Data collection from native users of ASL is ongoing, using synchronized digital video cameras to capture the signing from multiple angles.

Related projects elsewhere

Tools, formats and data models for annotation of linguistic signals:
http://www.ldc.upenn.edu/annotation/

ISLE Survey of Existing Tools, Standards and User Needs for Annotation of Natural Interaction and Multimodal Data:
http://isle.nis.sdu.dk/reports/wp11/

Gesture annotation page:
http://morph.ldc.upenn.edu/annotation/gesture/

Sign Language Transcription Conventions for the ECHO Project (mouth movements):
http://www.let.kun.nl/sign-lang/echo/docs/ECHO_transcr_mouth_SSL.pdf

Computational projects related to recognition of signed languages:

Web resources:
http://www.cse.unsw.edu.au/~waleed/gsl-rec/

Computer vision links:
http://www.cs.cmu.edu/afs/cs/project/cil/ftp/html/txtv-pubs.html

Towards American Sign Language Recognition from Visual Input (Dimitris Metaxas). See also Christian Vogler's research page.

Real-Time American Sign Language Recognition from Video Using Hidden Markov Models (Starner and Pentland):
http://www.cc.gatech.edu/fac/Thad.Starner/

Research and development on sign language recognition and synthesis in Japan

Temporal segmentation of fingerspelling

GRASP - Recognising Auslan signs using instrumented gloves

TWL - Gesture recognition (Rung-Huei Liang)

Sentence recognition and understanding (Annelies Braffort)

Gesture recognition page:
http://www.cybernet.com/~ccohen/

Dictionary projects:

Multimedia Dictionary of American Sign Language: http://www.unm.edu/~wilcox/research/MM-DASL/mmdasl.html

Written systems for signed languages:

SignWriting: http://www.SignWriting.org/

HamNoSys ("a 'phonetic' transcription system... in the tradition of Stokoe-based systems"):
http://www.sign-lang.uni-hamburg.de/Projects/HamNoSys.html

Other sites with lots of links related to signed languages

Sign Language Sites on the World Wide Web:
http://www.leidenuniv.nl/hil/sign-lang/sl-sites.html

Other sign language links

Many related links:
http://www.stud.uni-hamburg.de/users/west/Deaf/links-e.html

References

For information about ASLLRP publications, see http://www.bu.edu/asllrp/publications.html.