The American Sign Language Linguistic Research Project (ASLLRP) includes:

    • Investigation of the linguistic structure of American Sign Language, with a focus on the relationship of syntax to semantics and prosody;

    • Collaboration with computer scientists to advance computer-based recognition and generation of signed languages;

    • Development of multimedia tools to facilitate access to and analysis of primary data for sign language research.

These projects have been funded by the National Science Foundation.


Major 2017 updates:


SignStream® version 3 - NOW AVAILABLE!

A new version of our Data Access Interface (DAI 2) and a new collection of linguistically annotated data from the ASLLRP SignStream® 3 Corpus are also NOW AVAILABLE!

DAI 2 also provides access to our new ASLLRP Sign Bank.



Data Available from Various Related Projects

Terms of use for ASLLRP data

The data available from these pages may be used for research and educational purposes, but may not be redistributed without permission.

Commercial use without explicit permission is not allowed, nor are patents or copyrights based on this material.

Those making use of these data must, in resulting publications or presentations, cite the National Center for Sign Language and Gesture Resources (NCSLGR) Corpus and the following publication:

Carol Neidle and Christian Vogler (2012) "A New Web Interface to Facilitate Access to Corpora: Development of the ASLLRP Data Access Interface," Proceedings of the 5th Workshop on the Representation and Processing of Sign Languages: Interactions between Corpus and Lexicon, LREC 2012, Istanbul, Turkey.

and also include the following URLs: http://www.bu.edu/asllrp/ and http://secrets.rutgers.edu/dai/queryPages/.

By accessing data from this site, you agree to the above terms of use.

(1) Web Access to Linguistically Annotated Corpora: the ASLLRP DAI (Data Access Interface)

A. Web access to the National Center for Sign Language and Gesture Resources (NCSLGR) corpus: linguistically annotated ASL data (continuous signing), with multiple synchronized video files showing views from different angles as well as a close-up of the face, and with linguistic annotations available as XML.
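
For readers who wish to process these annotations programmatically, here is a minimal Python sketch using only the standard library. The file, tag, and attribute names (UTTERANCE, GLOSS, START, END) are hypothetical placeholders; consult the actual NCSLGR XML files for the real structure:

    import xml.etree.ElementTree as ET

    # Minimal sketch: extract gloss labels and their frame ranges from a
    # hypothetical annotation file. Tag and attribute names are invented
    # placeholders, not the actual NCSLGR schema.
    tree = ET.parse("ncslgr_utterance.xml")  # hypothetical file name
    root = tree.getroot()

    for utterance in root.iter("UTTERANCE"):
        for gloss in utterance.iter("GLOSS"):
            print(f"{gloss.text}: frames {gloss.get('START')}-{gloss.get('END')}")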

Information about the data collection and annotation, and about development of the Web interface

Annotation conventions are documented in these two reports, and an updated version will be forthcoming in Spring 2012:

C. Neidle (2002) "SignStream™ Annotation: Conventions used for the American Sign Language Linguistic Research Project," American Sign Language Linguistic Research Project, Report 11, Boston University, Boston, MA.

C. Neidle (2007), "SignStream™ Annotation: Addendum to Conventions used for the American Sign Language Linguistic Research Project," American Sign Language Linguistic Research Project, Report 13, Boston University, Boston, MA.

See also: C. Neidle and C. Vogler (2012), "A New Web Interface to Facilitate Access to Corpora: Development of the ASLLRP Data Access Interface (DAI)," Proceedings of the 5th Workshop on the Representation and Processing of Sign Languages: Interactions between Corpus and Lexicon, LREC 2012, Istanbul, Turkey, May 27, 2012.

A new version of our Data Access Interface (DAI 2) and a large new collection of linguistically annotated data have now been released as well (see above).

B. Additional data are available from the American Sign Language Lexicon Video Dataset (ASLLVD), a collection of almost 10,000 examples of about 3,000 distinct signs (each produced by between 1 and 6 ASL signers), based largely on the entries in the Gallaudet Dictionary of American Sign Language.

Handshapes (with videos showing multiple angles of the hands in motion) and our labeling conventions for handshapes:

http://www.bu.edu/asllrp/cslgr/pages/ncslgr-handshapes.html

C. Neidle (2002) "SignStream™: A Database Tool for Research on Visual-Gestural Language." In Brita Bergman, Penny Boyes-Braem, Thomas Hanke, and Elena Pizzuto, eds., Sign Transcription and Database Storage of Sign Information, a special issue of Sign Language and Linguistics 4 (2001):1/2, pp. 203-214.

D. MacLaughlin, C. Neidle, and D. Greenfield (2000) "SignStream™ User's Guide." American Sign Language Linguistic Research Project, Report 9, Boston University, Boston, MA.


Pending completion of the DAI download capabilities, see this page for access to complete sets of materials from the NCSLGR corpus (videos and annotations): http://www.bu.edu/asllrp/ncslgr-for-download/download-info.html

C. A beta version of a new, enhanced Data Access Interface, DAI 2, is also now available. It offers many new features for displaying information from SignStream® 3 files, including start and end handshapes, and it provides access to a new ASLLRP SignStream® 3 Corpus, as well as to our new ASLLRP Sign Bank (see (3) below).



(2) Software for Linguistic Annotation and Analysis of Visual Language Data

The data collection listed above in (1)-A was created using SignStream® 2.2.2 (which runs as a Classic application on older Macintosh systems) for linguistic annotation. The data listed in (1)-B, as well as the data now released through the DAI 2, include annotations carried out using SignStream® version 3, a Java application with many new features, released in August 2017.



(3) ASLLRP Sign Bank

Integrated into SignStream® version 3 and DAI 2 is a Sign Bank, where individual signs from our data collections are stored along with their morpho-phonological features. This greatly speeds annotation, as previously annotated signs can be retrieved along with their relevant properties (which can be further edited in case of variations in production). This also helps to ensure consistency in labeling.
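
As a rough illustration of the idea, retrieving a previously annotated sign amounts to a keyed lookup of its stored properties, copied so that local edits for production variants do not alter the bank itself. The Python sketch below uses invented entries and field names; it does not reflect the actual Sign Bank implementation or schema:

    # Illustrative sketch only: entries and field names are invented and
    # do not reflect the actual ASLLRP Sign Bank.
    sign_bank = {
        "BOOK": {"start_handshape": "B", "end_handshape": "B", "two_handed": True},
    }

    def lookup(gloss):
        """Return a copy of a stored sign entry, so edits for production
        variants do not alter the bank itself."""
        entry = sign_bank.get(gloss)
        return dict(entry) if entry is not None else None

    features = lookup("BOOK")
    if features is not None:
        features["end_handshape"] = "bent-B"  # edit for a variant production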


(4) Tracking of Non-manual Features (head movement, facial expressions) and Detection of Corresponding Linguistic Information

In collaboration with Dimitris Metaxas and colleagues at Rutgers University, we are conducting research on tracking and 3D modeling of non-manual events (head positions and movements, and facial expressions) that convey essential grammatical information in signed languages. Visualizations of the ASL non-manual feature tracking and detection of the associated linguistic information are available here: http://www.bu.edu/av/asllrp/NM/