A new article is out!
Marucci, M., Di Flumeri, G., Borghini, G., Sciaraffa, N., Scandola, M., Pavone, E.F., Babiloni, F., Betti, V., & Aricò, P. (2021). The impact of multisensory integration and perceptual load in virtual reality settings on performance, workload and presence. Scientific Reports, 11, 4831. https://doi.org/10.1038/s41598-021-84196-8
Real-world experience is typically multimodal. Evidence indicates that the facilitation in the detection of multisensory stimuli is modulated by perceptual load, the amount of information involved in processing the stimuli. Here, we used a realistic virtual reality environment while concomitantly acquiring Electroencephalography (EEG) and Galvanic Skin Response (GSR) to investigate how multisensory signals impact target detection in two conditions, high and low perceptual load. Different multimodal stimuli (auditory and vibrotactile) were presented, alone or in combination with the visual target. Results showed that only in the high load condition did multisensory stimuli significantly improve performance compared to visual stimulation alone. Multisensory stimulation also decreased the EEG-based workload. In contrast, the perceived workload, as measured by the NASA Task Load Index questionnaire, was reduced only by the trimodal condition (i.e., visual, auditory, tactile). This trimodal stimulation was more effective in enhancing the sense of presence, that is, the feeling of being in the virtual environment, compared to bimodal or unimodal stimulation. We also show that in the high load task, the GSR components were higher than in the low load condition. Finally, the multimodal stimulations (Visual-Audio-Tactile, VAT, and Visual-Audio, VA) induced a significant decrease in latency and a significant increase in amplitude of the P300 potentials with respect to the unimodal (visual) and bimodal (visual-tactile) stimulation, suggesting faster and more effective processing and detection of stimuli when auditory stimulation is included. Overall, these findings provide insights into the relationship between multisensory integration and human behavior and cognition.
Our latest paper is finally out: Virtual Reality, Electroencephalography (EEG), and Galvanic Skin Response (GSR) to study how multisensory signals impact target detection in a naturalistic driving task https://t.co/zXCGwhDg5u @SciReports @dip_psi @SantaLuciaIRCCS @CosyncLab https://t.co/oJCGriDjSu

— Viviana Betti (@Viviana_Betti) March 6, 2021