Seung-Hee Lee and Decision-Making in the Multisensory Universe
- Published 12 Dec 2022
- Author Calli McMurray
- Source BrainFacts/SfN
We perceive our world through our senses. And when the environment activates multiple senses at once, our internal machinery tips the scale on which sense(s) to give more attention to: If you walk towards a dog and don’t see it move, but hear it begin to growl, your auditory and visual systems may have a brief battle as you decide whether to approach or avoid the animal.
Seung-Hee Lee, associate professor at the Korea Advanced Institute of Science and Technology (KAIST), studies this kind of sensory perception. Her lab examines the brain circuits that allow animals to prioritize one sense over another within a multisensory environment, influencing their actions and decision-making. Lee spoke with BrainFacts.org about how she became interested in this particular area of neuroscience and what her lab at KAIST is investigating.
How did you get into neuroscience?
As an undergrad, I majored in biology and education at Seoul National University. At a student-organized academic symposium in my third year, the topic was memory. My future PhD mentor, Bong-Kiun Kaang, came to give a talk on the molecular mechanisms of learning and memory. In education, when we learned about memory, the focus was on teaching people. So when I heard this talk looking at memory at the molecular level, I thought it was fascinating. I look back on that talk as my "aha" moment for wanting to get into science. After my undergraduate studies, I was fortunate to join Kaang's lab, and from there I got into neuroscience in graduate school.
How did you come to study perception?
After my graduate studies, which focused on the molecular mechanisms of learning and memory, I wanted to do more systems-level research. I believe that understanding the brain requires understanding both brain circuits and brain systems. So, in 2007, after earning my PhD at Seoul National University in Korea, I attended a RIKEN summer course in Japan. The summer course gathers students from all over the world for about two weeks and hosts workshops and talks each day. There, I was drawn to topics on perception. But by that time, I had switched fields so many times that I did not necessarily know the basics of systems neuroscience and perception. Luckily, Yang Dan was brave enough to accept me despite all the switches and offered me a postdoc position in her lab working on visual perception. Now, I run my own lab at KAIST studying sensory perception.
What are you focusing on in your lab?
Our aim is to understand how animals' brains take in sensory information about the external world, like sights and sounds, and how that process can switch their behavioral states from inattentive to attentive. Using the mouse model, we study percepts, the sensory stimuli that animals register and focus on, to see how specific brain circuits process, alter, and integrate sensory information, influencing cognitive and perceptual behaviors. We're also looking into how multisensory information is integrated in mouse models of autism and schizophrenia.
Recently, our lab has been investigating how mice integrate auditory and visual information for decision-making. In a natural environment, animals take in and filter out many kinds of stimuli to make decisions. For humans, the McGurk effect shows that when we receive conflicting audio and visual information, the brain often resolves the conflict by relying more on vision. But this can depend on age and the individual; children tend to resolve the conflict by relying more on their hearing. So we examined how such changes in multisensory integration happen in mice.
How did your lab study this in mice?
We trained mice to make decisions that led to rewards in response to different sets of auditory and visual stimuli. We then tested which sensory modality the mice relied on when the two stimuli were presented together and cued conflicting decisions for rewards.
In a head-fixed condition, we found that mice tend to rely more on their hearing than their vision when making these decisions. We traced neurons from their visual and auditory cortices and found that the posterior parietal cortex receives and integrates both forms of sensory information in mice. And we found that one important way mice weight sounds over sights is through inhibitory interneurons suppressing visual information. This inhibition by cortical interneurons plays a large role in how animals integrate and prioritize information across different senses, helping their learning and decision-making in a lab setting.
But in a similar experiment where mice could move freely, some of our new research suggests that they rely more on their vision. Together with our preliminary findings, these results suggest that animals' behavioral states are critical for resolving multisensory conflicts, letting them make more optimal decisions depending on their current state. We've recently come to better understand how frontal cortical regions associated with vision, notably the anterior cingulate cortex (ACC), help coordinate and execute accurate decisions about future movements by suppressing nearby motor neurons in mice. In the future, we plan to do more experiments on how the parietal cortex interacts with the frontal cortex to make decisions based on visual information in a freely moving state.
References
BBC. (2010). Try this bizarre audio illusion! 👁️👂😮 - BBC. [Video]. YouTube. https://www.youtube.com/watch?v=G-lN8vWm3m0
Kim, J. H., Jung, A. H., Jeong, D., Choi, I., Kim, K., Shin, S., Kim, S. J., & Lee, S. H. (2016). Selectivity of Neuromodulatory Projections from the Basal Forebrain and Locus Ceruleus to Primary Sensory Cortices. The Journal of Neuroscience, 36(19), 5314–5327. https://doi.org/10.1523/JNEUROSCI.4333-15.2016
Kim, J. H., Ma, D. H., Jung, E. et al. (2021). Gated feedforward inhibition in the frontal cortex releases goal-directed action. Nat Neurosci, 24, 1452–1464. https://doi.org/10.1038/s41593-021-00910-9
Song, Y. H., Hwang, Y. S., Kim, K., Lee, H. R., Kim, J. H., Maclachlan, C., Dubois, A., Jung, M. W., Petersen, C., Knott, G., Lee, S. H., & Lee, S. H. (2020). Somatostatin enhances visual processing and perception by suppressing excitatory inputs to parvalbumin-positive interneurons in V1. Science advances, 6(17), eaaz0517. https://doi.org/10.1126/sciadv.aaz0517
Song, Y. H., Kim, J. H., Jeong, H. W., Choi, I., Jeong, D., Kim, K., & Lee, S. H. (2017). A Neural Circuit for Auditory Dominance over Visual Perception. Neuron, 93(4), 940–954.e6. https://doi.org/10.1016/j.neuron.2017.01.006