Brain Awareness Video Contest

Why Do We Gesture While Speaking?

  • Published 5 Oct 2021
  • Source BrainFacts/SfN

We all tend to gesture when we speak. As it turns out, this spontaneous behavior, which has intrigued scientists for decades, helps both the speaker and the listener.

This is a video from the 2021 Brain Awareness Video Contest.

Created by Stephen Theron-Grimaldi


Although we see speech as our main way to interact, nonverbal behaviors also play a key role in human communication.

Hi! Today, I am going to introduce you to one example of nonverbal behavior: co-speech gestures. So, what are these exactly? Well, I am producing some right now! Simply put, co-speech gestures are spontaneous hand movements that accompany speech.

Scientists have investigated communication in many different groups of people, and one particularly interesting finding came from observing the interactions of congenitally blind individuals. Because these individuals were born blind, they never had the opportunity to observe and learn gestures from others. Nevertheless, studies found that although congenitally blind people gesture less than sighted people, they still gesture. By showing that gesturing is likely driven by an innate instinct, these studies suggest that gestures must serve a function.

But why are we meant to gesture? One established benefit of co-speech gestures is that they help the person producing them. Preventing speakers from gesturing during communication reduces their speech fluency and vocabulary compared to speakers who are allowed to gesture. Therefore, producing gestures helps facilitate speech. When we communicate, the areas involved in speech production, such as Broca's area, are strongly connected with the areas involved in motor control, located here in the motor cortex. The exact mechanism behind this phenomenon is still up for debate, but it has been suggested that gestures facilitate conceptualization, our ability to put our thoughts into words.

To test this theory, researchers investigated how gesture production relates to conceptualization difficulty. In one experiment, children performed a "Piagetian conservation task" in which containers and their volume were manipulated. They were asked either to describe the objects or to explain the consequences of the manipulations, the latter requiring more complex conceptualization. The explanations elicited more gestures than the descriptions, despite involving the same amount of speech. The authors suggested that, by gesturing, the children explored ways to organize their thoughts, which may have facilitated their speech production.

So far, we have discussed how gestures benefit the speaker, but is that their only role? To answer this question, let's look at a few more studies. Speakers produce fewer gestures when listeners cannot see them. This suggests that some gestures are meant to be seen, but for what purpose? Gestures can hold information, for example visuospatial information such as a location, a size, or a shape, and can thus convey additional information to listeners and complement what is being said. Supporting this notion, researchers found that presenters tend to gesture more when lecturing to a novice audience than to an audience of experts.

But what are the concrete benefits for the listeners? Listeners who received information through both speech and gesture showed better comprehension than those who received information through speech alone. Looking at the brain, the speech sounds and the gesture movements are processed at the same time and then integrated in regions such as the right auditory cortex and the left posterior superior temporal sulcus. Combining these streams of information allows listeners to form a global understanding.

Using gestures alongside speech may also be beneficial because the information is transmitted through different modalities: speech is carried by the auditory modality, while gestures are carried by the visual modality. The benefit of receiving information through multiple modalities has yet to be properly investigated in gesture research. Still, we can draw our own conclusions about how pleasant it is to have something to watch while listening. Personally, I much prefer having visuals; otherwise, I would just have done a podcast.
