Use this identifier to reference this record: https://hdl.handle.net/1822/79501

Title: Spatial and temporal (non)binding of audio-visual stimuli: effects on motor tracking and underlying neural sensory processing
Author(s): Lapenta, Olivia Morgan
Clibborn, Ashleigh
Hammoud, Ayah
Keller, Peter E.
Nozaradan, Sylvie
Varlet, Manuel
Date: May 2022
Citation: Lapenta, O. M.; Clibborn, A.; Hammoud, A.; Keller, P. E.; Nozaradan, S.; Varlet, M. Spatial and temporal (non)binding of audio-visual stimuli: effects on motor tracking and underlying neural sensory processing. In: International Conference of Cognitive Neuroscience 2020, Helsinki, Finland, 2022.
Abstract(s): Objectives: To compare the steady-state evoked potentials (SSEPs) elicited by spatially or temporally congruent and incongruent audio-visual stimuli, and to evaluate how congruency affects the motion tracking of visual stimuli. Research question: Does spatial or temporal congruency of audio-visual stimuli affect motion tracking and evoke differential SSEPs?
Methods: We used EEG frequency-tagging to investigate the selective neural processing and integration of visual and auditory information during the tracking of a moving stimulus, and how spatial and temporal (in)congruency between the two modalities modulates these sensory neural processes and synchronization performance. Participants were instructed to track, by moving their index finger, a red dot flickering at 15 Hz that oscillated horizontally with a complex trajectory on a computer screen. An auditory pure tone with continuous pitch modulation at 32 Hz was presented with lateralised amplitude modulations in the left and right audio channels (panning) that were, in Experiment 1, either spatially congruent or incongruent with the oscillating visual stimulus (same direction vs. opposite direction vs. no panning) and, in Experiment 2, either temporally congruent or incongruent with it (no delay vs. medium or large delay).
Results: Both experiments yielded significant EEG responses at the visual (15 Hz) and auditory (32 Hz) tagging frequencies. In Experiment 1, participants showed lower tracking performance and larger EEG amplitudes at the auditory frequency in the no-panning condition; no significant correlation between the two measures was found. In Experiment 2, no changes in the amplitude of the EEG responses or in performance were found.
Conclusion: Movement synchronization performance and the neural processing of visual and auditory information were not influenced by the temporal (phase) congruency manipulation. For spatial congruency, moving auditory stimuli led to better performance than the non-moving sound, irrespective of congruency. Importantly, there were no significant responses at 17 and 47 Hz, the intermodulation frequencies of 15 and 32 Hz, suggesting an absence of global integration of visual and auditory information. These results encourage further exploration of the conditions that may yield selective processing of visual and auditory information and their integration in the motor tracking of moving environmental objects.
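The frequency-tagging logic above can be illustrated with a minimal sketch: the intermodulation frequencies are the difference and sum of the two tagging frequencies (32 − 15 = 17 Hz, 32 + 15 = 47 Hz), and responses are read out as spectral amplitude at those exact bins. This is purely illustrative on synthetic data, not the authors' analysis pipeline; the sampling rate, epoch length, and signal composition are assumptions.

```python
# Illustrative frequency-tagging sketch (synthetic data; not the study's pipeline).
# Builds a toy "EEG" containing the two tagged oscillations plus noise, then
# measures single-sided FFT amplitude at the tagged and intermodulation bins.
import numpy as np

F_VISUAL, F_AUDITORY = 15.0, 32.0            # tagging frequencies (Hz)
intermodulation = (F_AUDITORY - F_VISUAL,    # 17 Hz (difference)
                   F_AUDITORY + F_VISUAL)    # 47 Hz (sum)

fs, duration = 512, 4.0                      # assumed sampling rate / epoch length
t = np.arange(0, duration, 1 / fs)           # 2048 samples -> 0.25 Hz resolution
rng = np.random.default_rng(0)
signal = (np.sin(2 * np.pi * F_VISUAL * t)         # visual tag, amplitude 1.0
          + 0.5 * np.sin(2 * np.pi * F_AUDITORY * t)  # auditory tag, amplitude 0.5
          + 0.2 * rng.standard_normal(t.size))     # noise; no intermodulation term

spectrum = np.abs(np.fft.rfft(signal)) / t.size * 2   # single-sided amplitude
freqs = np.fft.rfftfreq(t.size, 1 / fs)

def amplitude_at(f):
    """Amplitude at the spectral bin closest to frequency f (Hz)."""
    return spectrum[np.argmin(np.abs(freqs - f))]

for f in (F_VISUAL, F_AUDITORY) + intermodulation:
    print(f"{f:4.0f} Hz: amplitude {amplitude_at(f):.3f}")
```

Because no intermodulation component is present in the synthetic signal, the 17 and 47 Hz bins stay at the noise floor while the 15 and 32 Hz bins show clear peaks, mirroring the pattern reported in the abstract.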
Type: Conference poster
URI: https://hdl.handle.net/1822/79501
Peer reviewed: yes
Access: Open access
Appears in collections: CIPsi - Comunicações

Files in this record:
File: Print_ICON_postersMT_final.pdf — Description: Conference poster — Size: 506.42 kB — Format: Adobe PDF
