Amy M. Lieberman
Amy Lieberman is an Assistant Professor in the Boston University Wheelock College of Education & Human Development and Director of the Language Acquisition and Visual Attention Lab. Her research focuses on the acquisition and processing of American Sign Language (ASL) in deaf individuals, and the development of visual attention in deaf children. Her research employs multiple approaches to studying language and attention, ranging from naturalistic observations of parent-child interactions to the development of a novel eye-tracking paradigm to investigate real-time processing of ASL in deaf children and adults.
Prior to joining the BU faculty, she was at the University of California, San Diego, where she worked as a Research Scientist at the Center for Research on Language and in the Mayberry Lab for Multimodal Language Development. She previously worked as an Early Childhood teacher at the California School for the Deaf, Fremont, and at Kendall Demonstration Elementary School at Gallaudet University’s Clerc Center.
Appointments, Affiliations & Committees:
- BU Department of Speech, Language, and Hearing Sciences at Sargent College
- Affiliated faculty, BU Linguistics Department
- Early Childhood Task Force, Massachusetts Commission for the Deaf and Hard of Hearing and MA Department of Elementary and Secondary Education
- Steering Committee, Massachusetts Commission for the Deaf and Hard of Hearing and MA Department of Elementary and Secondary Education
- Boston University Conference on Language Development (faculty advisor)
Ph.D. in Special Education, University of California, Berkeley and San Francisco State University
M.A. in Education (Cognition and Development), University of California, Berkeley
B.A. in Human Biology, Stanford University
SED DE 575: Language and the Deaf Child
SED DE 576: Advanced Language and the Deaf Child
Children learn new words when they can connect the language they hear to the objects and events they see and touch in the surrounding world. For deaf children learning sign language, both language input and information about the world are perceived through the visual modality. This means that deaf children must learn to skillfully alternate their gaze between people and things. My research aims to understand how deaf children acquire the ability to alternate visual attention in a way that maximizes language learning. I also investigate how the amount and quality of parents’ language input impacts children’s sign language acquisition.
It is well established that deaf children without access to language from an early age are at risk for delays in language, literacy, and other academic outcomes throughout their school years and beyond. Despite the recognized importance of early language acquisition as a foundation for later learning, little is known about how sign language is processed by deaf children, and how processing efficiency may affect the development of vocabulary and other linguistic skills. These issues are particularly important in light of the fact that the vast majority of deaf children are born to hearing parents and thus are not exposed to sign language from birth. My research aims to understand how sign language is acquired and processed both by typical learners and by those who acquire language under atypical circumstances. My work is funded by the National Institutes of Health/National Institute on Deafness and Other Communication Disorders (R01DC015272).
Read more about my research at my lab, the Language Acquisition and Visual Attention Lab.
What does joint attention look like in interactions between deaf children and their parents?
Joint attention refers to moments within an interaction when children and their parent or caregiver are both attending to the same thing at the same time. In spoken language, joint attention occurs when parents and children are looking at an object, and parents label or talk about it. How does this multi-modal process adapt when all input (language and information about the world) is perceived visually? We examine interactions between deaf children and their deaf and hearing caregivers to see how joint visual attention is achieved. We do this by recording parents and children during naturalistic play, and then coding these interactions to understand how eye gaze, handling objects, and language input are coordinated.
How do deaf children learn new ASL signs?
Early childhood is a time of rapid word learning. How do children map new labels to new objects? We study the process of word learning in ASL. In particular, we investigate how deaf children learn to manage their eye gaze and visual attention so that they can connect language and objects. Our studies use eye-tracking technology, which allows us to monitor children’s gaze as they perceive signs, pictures, and videos on a computer. We also record parents and children interacting with novel objects to determine how new labels are introduced during naturalistic play.
How do we know if deaf children are reaching their language milestones?
Vocabulary development in young childhood is an important predictor of later language outcomes. Yet, few measures exist to track the development of ASL in very young children. With our collaborators Dr. Naomi Caselli and Dr. Jennie Pyers, we are developing measures of productive and receptive language for use with infants, toddlers, and children learning ASL.
How do adult ASL-signers process language?
We study ASL production, comprehension, and processing in deaf adults from a range of backgrounds. We are interested in how adults process ASL as they are perceiving signs; how ASL phonology and semantics influence comprehension; and how signers choose to express various concepts in ASL. We use a range of approaches, primarily eye-tracking, to understand adult ASL perception and processing.
Professional Development for Teachers of D/HH Students
The education of multilingual students has become increasingly important and relevant given the changing demographics of American schools. For Deaf and Hard-of-Hearing students, schools are increasingly recognizing the benefits of a bilingual or dual-language approach, in which students are instructed with the goal of achieving proficiency in both ASL and English. In collaboration with a large-scale project led by Dr. Kara Viesca at the University of Nebraska-Lincoln, we are developing new workshops designed specifically for teachers and other professionals working with deaf and hard-of-hearing students from a bilingual or dual-language perspective. More information about this project is available at http://cehs.unl.edu/icmee/.
Lieberman, A. M., Borovsky, A., & Mayberry, R. I. (2017). Prediction in a visual language: Real-time sentence processing in American Sign Language across development. Language, Cognition and Neuroscience.
Higgins, M., & Lieberman, A. M. (2016). Deaf Students as a Linguistic and Cultural Minority: Shifting Perspectives and Implications for Teaching and Learning. Journal of Education, 96(1), 9-18.
Lieberman, A. M., Borovsky, A., Hatrak, M., & Mayberry, R. I. (2015). Real-time processing of ASL signs: Delayed first language acquisition affects organization of the mental lexicon. Journal of Experimental Psychology: Learning, Memory, and Cognition, 41(4), 1130-1139.
Lieberman, A. M., & Mayberry, R. I. (2015). Studying sign language acquisition. In E. Orfanidou, B. Woll, & G. Morgan (Eds.) Research Methods in Sign Language Studies: A Practical Guide (pp. 281-299). Malden, MA: Wiley.
Lieberman, A. M., Hatrak, M., & Mayberry, R. I. (2014). Learning to look for language: Development of joint attention in young deaf children. Language Learning and Development, 10, 19-35.
Lieberman, A. M. (2014). Attention-getting skills of deaf children using American Sign Language in a preschool classroom. Applied Psycholinguistics, 1-19. doi: 10.1017/S0142716413000532.
Ferjan Ramirez, N., Lieberman, A.M., & Mayberry, R.I. (2013). The initial stages of language acquisition begun in adolescence: When late looks early. Journal of Child Language, 40(2), 391-414.
Mayberry, R. I., del Giudice, A. A., & Lieberman, A. M. (2011). Reading achievement in relation to phonological coding and awareness in deaf readers: A meta-analysis. Journal of Deaf Studies and Deaf Education, 16(2), 164-188.
Lieberman, A. M. (2018). Learning Language in the Visual World: How Interaction Shapes Early Word Learning in Young Deaf Children. Presented at the Chicago Education Workshop Lecture Series, University of Chicago, Chicago, IL
Caselli, N., Pyers, J., & Lieberman, A. M. (2018). ASL Vocabulary Assessment. Paper presented at the 43rd Boston University Conference on Language Development.
Fitch, A., Arunachalam, S., & Lieberman, A. (2018). Learning words from context in ASL: Evidence from a Human Simulation Paradigm. Poster presented at the 43rd Boston University Conference on Language Development.
Brown, L., Lieberman, A. M., & Gagne, D. (2018). Modality of communication in parents & their deaf children using both ASL & Spoken English. Paper presented at the American Speech-Language-Hearing Association (ASHA) Convention, Boston, MA.
Johnson, E., Schotter, E., & Lieberman, A. (2018). Investigating the Sources of Deaf Signers’ Enhanced Peripheral Attention: ASL Experience and Deafness. Poster presented at the 59th Annual Psychonomic Society Meeting, New Orleans, LA.
Lieberman, A. M. & Wienholz, A. (2018). Semantic processing of American Sign Language sentences: effects of ambiguity and word order. Poster presented at the Workshop on Reading, Language, and Deafness, San Sebastian, Spain.
Lieberman, A. M., Borovsky, A., Bottoms, A., & Fieldsteel, Z. (2018). Referential cues support novel sign learning in young deaf children. Paper presented as part of the symposium Looking for Language: How hearing and deaf infants navigate the visual world to learn language. International Congress of Infant Studies, Philadelphia, PA.
Bottoms, A., Fieldsteel, Z., Spurgeon, E., & Lieberman, A. (2017). Object and event labelling in American Sign Language input to young deaf children. Poster presented at the 42nd Boston University Conference on Language Development, Boston, MA.
Lieberman, A. M. (2016). Development of gaze control for integration of language and visual information in deaf children. NSF-funded U.S.-Swedish Workshop on Assessment of Multimodal Multilingual Outcomes in Deaf and Hard-of-Hearing Children, Stockholm, Sweden.
Lieberman, A. M., Borovsky, A., & Mayberry, R. I. (2015). The predictive nature of American Sign Language verbs during real-time sentence processing in deaf adults and children. Paper presented at the 40th Boston University Conference on Language Development.
Ferjan Ramirez, N., Lieberman, A. M., & Mayberry, R. I. (2013). How far and how fast? A longitudinal study of ASL acquisition in adolescent home signers. Paper presented at the Theoretical Issues in Sign Language Research Conference (TISLR 11), London.