During vocal communication, vocalized sounds are heard by both the intended recipients and the individual producing them. Neural encoding of this vocal feedback is thought to be crucial for monitoring one's own voice, and may play a role in feedback-dependent control of vocalization in both animals and humans. We study the role of the auditory cortex (AC) in sound analysis and its functional interaction with a region of the frontal lobe, the frontal auditory field (FAF), in producing and controlling vocalizations. In echolocating bats, audio-vocal integration is rapid, reliable, and precise, and biosonar provides a solid theoretical framework around which to build and test new hypotheses.
Our goal is to understand how the AC (a sensory center) and the FAF (a sensorimotor center) interact in real time to support vocal behavior and communication.
To address this question we are investigating:
How is sound information represented in the auditory cortex? How do local interactions between different neuron types within a cortical column contribute to sound processing?
What is the functional role of neural activity in the frontal auditory field in sound processing and vocalization?
What is the functional role of the auditory-frontal connection in sound processing and vocalization? How do cortico-cortical interactions modulate both sound processing and vocal activity?