The American Sign Language Linguistic Research Project (ASLLRP) includes:

    • Investigation of the linguistic structure of American Sign Language, with a focus on the relationship of syntax to semantics and prosody;

    • Collaboration with computer scientists to advance computer-based recognition and generation of signed languages;

    • Development of multimedia tools to facilitate access to and analysis of primary data for sign language research.

These projects have been funded by the National Science Foundation.

 


Overview of ASLLRP Resources:

https://www.bu.edu/asllrp/ASL-SignBank-and-other-Resources.html


Major updates:

New data!

Lots of new data has been added to our publicly accessible corpus of continuous signing, available through our Data Access Interface (DAI 2), and to our ASLLRP ASL Sign Bank. New options to search and download ASLLRP ASL Sign Bank data have also been added!



May 2025: A new update for SignStream®, version 3.5.1, is now AVAILABLE!

It includes many new enhancements and bug fixes. The ASL Sign Bank, accessible from within the application, also allows easy direct access to the online version, which has many more examples for each sign.


A new version of our Data Access Interface (DAI 2) and a new, recently expanded collection of linguistically annotated data from the ASLLRP SignStream® 3 Corpus are also NOW AVAILABLE!

DAI 2 also provides access to our new ASLLRP ASL Sign Bank.

See especially:   

 Neidle, Carol, Augustine Opoku, Carey Ballard, Yang Zhou, Xiaoxiao He & Dimitris Metaxas. 2024. New Capability to Look Up an ASL Sign from a Video Example. arXiv 2407.13571 [cs.CV]. 1-11. https://arxiv.org/abs/2407.13571.

Neidle, Carol & Augustine Opoku. A Guide to the ASLLRP Sign Bank – New Search Feature. ASLLRP Report No. 25, Boston University, Boston, MA. https://www.bu.edu/asllrp/rpt25/asllrp25.pdf.

and the references below.



Data Available from Various Related Projects

 

 

Please read the terms of use for ASLLRP continuous signing data carefully:

http://www.bu.edu/asllrp/dai-terms.html

Note, in particular, that the data available from these pages can be used for research and educational purposes, but cannot be redistributed without permission.

Commercial use without explicit permission is not allowed, nor are any patents or copyrights based on this material.

By accessing data from this site, you agree to the above terms of use.

 

 

(1) Web Access to Linguistically Annotated Corpora: the ASLLRP DAI (Data Access Interface)

A. Web access to the National Center for Sign Language and Gesture Resources (NCSLGR) corpus: linguistically annotated ASL data (continuous signing), with multiple synchronized video files showing views from different angles and a close-up of the face, and with linguistic annotations available as XML (a brief sketch of one way to inspect the downloaded XML appears after the reference below).

Information about the data collection and annotation, and about development of the Web interface

Information about download: http://www.bu.edu/asllrp/ncslgr-for-download/download-info.html .

See also: C. Neidle and C. Vogler (2012), "A New Web Interface to Facilitate Access to Corpora: Development of the ASLLRP Data Access Interface (DAI)," 5th Workshop on the Representation and Processing of Sign Languages: Interactions between Corpus and Lexicon, LREC 2012, Istanbul, Turkey, May 27, 2012.
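
As a rough illustration only (the actual file layout is described in the download documentation linked above), one way to get oriented in the downloaded annotation files is to walk the XML tree and count the element types it contains. The short Python sketch below makes no assumptions about the NCSLGR schema, and the filename is a placeholder.

```python
# Schema-agnostic way to get oriented in a downloaded annotation file:
# walk the XML tree and count the element types it contains.
# "ncslgr_example.xml" is a placeholder filename; see the download
# documentation above for the actual file names and structure.
from collections import Counter
import xml.etree.ElementTree as ET

tree = ET.parse("ncslgr_example.xml")   # placeholder path to a downloaded file
root = tree.getroot()

tag_counts = Counter(elem.tag for elem in root.iter())
for tag, count in tag_counts.most_common():
    print(f"{tag}: {count}")
```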

B. Handshapes (with videos showing multiple angles of the hands in motion) and our labeling conventions for handshapes:

http://www.bu.edu/asllrp/cslgr/pages/ncslgr-handshapes.html.

C. A new, enhanced Data Access Interface, DAI 2, is also now available. This provides many new features for displaying information from SignStream® 3 files, including start and end handshapes. This interface provides access to a new ASLLRP SignStream® 3 Corpus. DAI 2 also provides access to our new ASLLRP ASL Sign Bank.

See ASLLRP Project Report No. 18:   

Neidle, C. and A. Opoku (2020), A User's Guide to the American Sign Language Linguistic Research Project (ASLLRP) Data Access Interface (DAI) 2 — Version 2 [pdf - 11 MB]

New data is now available.

See: Neidle, C., A. Opoku, and D. Metaxas (2022), ASL Video Corpora & Sign Bank: Resources Available through the American Sign Language Linguistic Research Project (ASLLRP). https://arxiv.org/abs/2201.07899.



(2) Software for Linguistic Annotation and Analysis of Visual Language Data

The data collection listed above in (1)-A was created using SignStream® 2.2.2 (which runs as a Classic application on older Macintosh systems) for linguistic annotation. More recent data, released through the DAI 2, include annotations that were carried out using SignStream® version 3: http://www.bu.edu/asllrp/SignStream/3/, a Java application with many new features, first released in August 2017.

For user guides, see: https://www.bu.edu/asllrp/reports.html

(3) Annotation Conventions

Annotation conventions are documented in these two reports:

C. Neidle (2002) "SignStream™ Annotation: Conventions used for the American Sign Language Linguistic Research Project," American Sign Language Linguistic Research Project, Report 11, Boston University, Boston, MA.

C. Neidle (2007), "SignStream™ Annotation: Addendum to Conventions used for the American Sign Language Linguistic Research Project," American Sign Language Linguistic Research Project, Report 13, Boston University, Boston, MA.



(4) ASLLRP ASL Sign Bank

Integrated into SignStream® version 3 and DAI 2 is a Sign Bank, where individual signs from our data collections are stored, along with their morpho-phonological features. This greatly speeds annotation, as previously annotated signs can be retrieved along with their relevant properties (which can be further edited in case of variations in production). This also helps to ensure consistency in labeling.
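
As a purely illustrative sketch of this idea (not the actual SignStream®/DAI 2 implementation), the Sign Bank can be thought of as a lookup table keyed by gloss label: retrieving a stored entry returns the properties previously recorded for that sign, which an annotator can then accept or adjust for a particular production. The field names below (start_handshape, end_handshape, two_handed) and the handshape labels are hypothetical.

```python
# Illustrative sketch of the sign-bank idea: a gloss-keyed store of previously
# annotated signs and their properties. Field names and values are hypothetical,
# not the actual SignStream®/DAI 2 schema.
from copy import deepcopy

sign_bank = {
    "BOOK": {"start_handshape": "B", "end_handshape": "B", "two_handed": True},
}

def annotate(gloss, overrides=None):
    """Reuse the stored entry for a gloss, optionally editing properties
    to reflect variation in a particular production."""
    entry = deepcopy(sign_bank.get(gloss, {}))
    entry.update(overrides or {})
    return {"gloss": gloss, **entry}

# Reusing stored entries keeps labels consistent across files and annotators.
print(annotate("BOOK"))
print(annotate("BOOK", overrides={"end_handshape": "5"}))  # edited variant
```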

The Sign Bank incorporates the data from the American Sign Language Lexicon Video Dataset (ASLLVD), a collection of almost 10,000 examples (of about 3,000 distinct signs, each produced by between 1 and 6 ASL signers) based largely on the entries in the Gallaudet Dictionary of American Sign Language. It also incorporates examples from the continuous signing data in the ASLLRP SignStream® 3 Corpus.

See: Carol Neidle, Augustine Opoku, and Dimitris Metaxas (2022), ASL Video Corpora & Sign Bank: Resources Available through the American Sign Language Linguistic Research Project (ASLLRP). arXiv: https://arxiv.org/abs/2201.07899. 1-20.

BU Open Access: https://hdl.handle.net/2144/44189


(5) Tracking of Non-manual Features (head movement, facial expressions) and Detection of Corresponding Linguistic Information

In collaboration with Dimitris Metaxas and colleagues at Rutgers University, we are conducting research on tracking and 3D modeling of non-manual events (head positions and movements, and facial expressions) that convey essential grammatical information in signed languages. Visualizations of the ASL non-manual feature tracking and detection of the associated linguistic information are available here: http://www.bu.edu/av/asllrp/NM/.

 


 

For lots more information about ASLLRP research and publicly shared resources, see:

http://www.bu.edu/asllrp/talks.html

http://www.bu.edu/asllrp/reports.html