TY - JOUR
T1 - Multimodal decoding of error processing in a virtual reality flight simulation
AU - Wimmer, Michael
AU - Weidinger, Nicole
AU - Veas, Eduardo
AU - Müller-Putz, Gernot R.
N1 - Publisher Copyright:
© The Author(s) 2024.
PY - 2024/12
Y1 - 2024/12
N2 - Technological advances in head-mounted displays (HMDs) facilitate the acquisition of physiological data of the user, such as gaze, pupil size, or heart rate. Still, interactions with such systems can be prone to errors, including unintended behavior or unexpected changes in the presented virtual environments. In this study, we investigated if multimodal physiological data can be used to decode error processing, which has been studied, to date, with brain signals only. We examined the feasibility of decoding errors solely with pupil size data and proposed a hybrid decoding approach combining electroencephalographic (EEG) and pupillometric signals. Moreover, we analyzed if hybrid approaches can improve existing EEG-based classification approaches and focused on setups that offer increased usability for practical applications, such as the presented game-like virtual reality flight simulation. Our results indicate that classifiers trained with pupil size data can decode errors above chance. Moreover, hybrid approaches yielded improved performance compared to EEG-based decoders in setups with a reduced number of channels, which is crucial for many out-of-the-lab scenarios. These findings contribute to the development of hybrid brain-computer interfaces, particularly in combination with wearable devices, which allow for easy acquisition of additional physiological data.
UR - http://www.scopus.com/inward/record.url?scp=85191030178&partnerID=8YFLogxK
U2 - 10.1038/s41598-024-59278-y
DO - 10.1038/s41598-024-59278-y
M3 - Article
AN - SCOPUS:85191030178
SN - 2045-2322
VL - 14
JO - Scientific Reports
JF - Scientific Reports
IS - 1
M1 - 9221
ER -