Spatial attention and perceptuo-motor representations of demonstratives: A fast fMRI study using naturalistic auditory stimuli

Research output: Contribution to conference › Poster › Research › Peer-reviewed

Spatial demonstratives, i.e., words like "this" and "that", are lexical items used to indicate physical or contextual distance. In spite of their minimal semantic specificity, they can trigger attentional shifts and establish a joint focus of attention on referents in the physical environment, thus functioning as interfaces between linguistic representations, attention, and perceptual processes. While the behavioral correlates of spatial demonstratives are receiving increasing attention in the experimental literature, no research has examined how this intertwining of linguistic, attentional, and perceptual processes is implemented in the brain. This may be because studying demonstratives raises methodological challenges: as their meaning hinges on the context of utterance, investigating their neural underpinnings requires simulating a rich linguistic and physical environment within the constraints intrinsic to neuroimaging.

With these challenges in mind, we conducted a naturalistic fMRI experiment (N = 28) in which participants listened to specially crafted dialogues containing a controlled number of spatial demonstratives (as well as of other function words). The dialogues involved two synthesized voices, each recorded onto a separate channel of a stereo track, which allowed us to embed the target words both in a rich linguistic context and in a 3D-like spatial setting. A fast acquisition sequence (TR = 388 ms, multi-band EPI) was used to capture signal changes at word-level temporal resolution, building on evidence for high-frequency components in the BOLD signal (Lewis et al., 2016). We isolated regions involved in processing spatial demonstratives via random-effects univariate analyses. To avoid strong assumptions about the shape of the hemodynamic response under sustained linguistic stimulation, we modelled the neural response using finite impulse response (FIR) models, and we used RETROICOR cardiac and respiratory models for denoising.

We found bilateral posterior superior parietal activation in response to spatial demonstratives, in areas associated with attentional orienting and the functional representation of space, with significantly stronger activation for distal than for proximal demonstratives. These results are compatible with behavioral evidence suggesting that spatial demonstratives encode the attentional status of the referent as well as its functional perceptual-motor features (e.g., graspability). To assess the reliability of these patterns, we computed inter-subject correlations both on the full time series and within shorter time windows, which allowed us to explore which events generate activation peaks within our regions of interest and thus to elucidate the functional profile of these areas under sustained linguistic stimulation. Additionally, we submitted the parameter estimates from the univariate model to multivariate pattern analysis, so as to identify patterns specific to the representation of spatial demonstratives as compared with other types of referring expressions. Our results help ground the neural representation of spatial demonstratives in non-linguistic perceptual and attentional resources. They also help validate fast fMRI paradigms with naturalistic auditory stimuli as a reliable experimental procedure for investigating language phenomena at short time scales, within rich contexts, and at a computationally sustainable cost.
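The two modelling choices described above (FIR regressors, which avoid committing to a canonical hemodynamic response shape, and time-windowed inter-subject correlations) can be illustrated with a minimal sketch. The Python snippet below is not the authors' pipeline: the onsets, scan count, FIR order, window length, and the leave-one-out ISC variant are all illustrative assumptions; only the TR of 388 ms comes from the abstract.

```python
# Minimal sketch of an FIR GLM and windowed inter-subject correlation (ISC)
# for a fast-TR naturalistic fMRI design. Data here are simulated noise.
import numpy as np

TR = 0.388        # repetition time in seconds (TR = 388 ms, from the abstract)
N_SCANS = 2000    # hypothetical number of volumes in one run
FIR_ORDER = 20    # hypothetical number of post-onset lags (~7.8 s coverage)

rng = np.random.default_rng(0)

# Hypothetical onsets (seconds) for one condition, e.g. distal demonstratives,
# as they would come from an annotation of the dialogue.
onsets = np.array([12.4, 55.1, 98.7, 143.2, 210.9])

def fir_design(onsets, n_scans, tr, order):
    """Build an FIR design matrix: one indicator column per post-onset lag,
    so the model makes no assumption about the hemodynamic response shape."""
    X = np.zeros((n_scans, order))
    scans = np.round(onsets / tr).astype(int)
    for lag in range(order):
        idx = scans + lag
        idx = idx[idx < n_scans]
        X[idx, lag] = 1.0
    return X

X = fir_design(onsets, N_SCANS, TR, FIR_ORDER)
X = np.column_stack([X, np.ones(N_SCANS)])  # add an intercept column

# Simulated single-voxel time series standing in for denoised BOLD data.
y = rng.standard_normal(N_SCANS)

# Ordinary least squares: beta[:FIR_ORDER] traces the estimated response
# at each lag after a demonstrative is heard.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

def windowed_isc(data, win):
    """Leave-one-out ISC in consecutive windows: correlate each subject's
    windowed time course with the mean time course of all other subjects."""
    n_subj, n_time = data.shape
    starts = list(range(0, n_time - win + 1, win))
    isc = np.zeros((n_subj, len(starts)))
    for s in range(n_subj):
        others = data[np.arange(n_subj) != s].mean(axis=0)
        for w, t0 in enumerate(starts):
            isc[s, w] = np.corrcoef(data[s, t0:t0 + win],
                                    others[t0:t0 + win])[0, 1]
    return isc

# 28 simulated subjects' ROI-averaged time courses, 30-second windows.
group = rng.standard_normal((28, N_SCANS))
isc = windowed_isc(group, win=int(30 / TR))
print(beta[:5], isc.mean(axis=0)[:3])
```

The design choice the sketch illustrates is the one motivated in the abstract: an FIR model estimates one free parameter per post-onset lag, so the recovered beta series traces the response shape directly rather than assuming it, which suits sustained linguistic stimulation at short TRs; windowed ISC then localizes which stretches of the stimulus drive shared responses.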
Original language: English
Publication date: 19 Aug 2018
Number of pages: 1
Event: SNL 2018 - Society for the Neurobiology of Language, Quebec City, Canada
Publication status: Published - 19 Aug 2018
Duration: 16 Aug 2018 - 18 Aug 2018

Conference

Conference: SNL 2018 - Society for the Neurobiology of Language
Country: Canada
City: Quebec City
Period: 16/08/2018 - 18/08/2018


ID: 131662673