Visuospatial Working Memory and Understanding Co-Speech Iconic Gestures: Do Gestures Help to Paint a Mental Picture?
Published in: Discourse Processes, Vol. 59, No. 4, pp. 275-297
Format: Journal Article
Language: English
Published: Routledge, 21-04-2022
Summary: Multi-modal discourse comprehension requires listeners to combine information from speech and gestures. To date, little research has addressed the cognitive resources that underlie these processes. Here we used a dual-task paradigm to test the relative importance of verbal and visuospatial working memory in speech-gesture comprehension. Healthy, college-aged participants encoded either a series of digits (verbal load) or a series of dot locations in a grid (visuospatial load) and rehearsed them (secondary memory task) as they performed a (primary) multi-modal discourse comprehension task. Regardless of the secondary task, performance on the discourse comprehension task was better when the speaker's gestures and speech were congruent than when they were incongruent. However, the congruity advantage was smaller when the concurrent memory task involved a visuospatial load than when it involved a verbal load. Results suggest that taxing the visuospatial working memory system reduced participants' ability to benefit from the information in congruent iconic gestures. A control experiment demonstrated that results were not an artifact of the difficulty of the visuospatial load task. Overall, these data suggest listeners recruit visuospatial working memory to interpret gestures about concrete visual scenes.
ISSN: 0163-853X, 1532-6950
DOI: 10.1080/0163853X.2022.2028087