How listener experiences and expectations influence spoken word recognition


***In mid-June I joined the Brandeis University Memory and Cognition Lab (PI: Arthur Wingfield) as a postdoc.***

Dissertation successfully defended on May 10!

I am interested in how listeners apply what they know about language, and the world around them, to understanding spoken words. An individual differences approach allows for a nuanced investigation of how cognitive abilities and expectations interact with language comprehension, and my dissertation investigates how such interactions might change in the aging population due to sensory declines. I am also exploring how listeners successfully comprehend words with non-standard phonemes, as produced by unfamiliar talkers. I have experience using EEG, eye tracking, and reaction time measures to investigate the role of higher cognitive processes in word recognition.

Contact: alexis.johns@uconn.edu

Academic Training

I am a member of both the Language and Brain Lab (PI: Emily Myers) and the Computational Cognitive Neuroscience of Language lab (PI: Jim Magnuson). This dual lab training has emphasized the multiple processes involved in word recognition, from sound-to-phoneme mapping to word-level competition and response selection. Prior to entering graduate school I worked with David Ostry at Haskins Laboratories, investigating how orofacial movement affects listeners’ perception of speech and non-speech sounds. I was an undergraduate research assistant for Steven Luck at the University of California, Davis, where I learned EEG/ERP data collection methods. I also drove buses.

Dissertation committee

My dissertation committee comprises:

Emily Myers – click here to go to the LAB lab

Jim Magnuson – click here to go to the CCNL lab

Erika Skoe – click here to go to the Skoe lab

Eiling Yee – click here to go to the YAL lab

Rachel Theodore – click here to go to the SLAP lab