Abstract
A better understanding of how movements are encoded in electroencephalography (EEG) signals is required to develop more natural control for motor neuroprostheses. We decoded imagined hand-closing and supination movements from seven healthy subjects and investigated the influence of the visual input. We found that motor imagination of these movements can be decoded from low-frequency time-domain EEG signals with a maximum average classification accuracy of 57.3 ± 5.0%. The simultaneous observation of congruent hand movements increased the classification accuracy to 64.1 ± 8.3%. Furthermore, the sole observation of hand movements also yielded discriminable brain patterns (61.9 ± 5.5%). These findings show that for low-frequency time-domain EEG signals, the type of visual input during classifier training affects the performance and has to be considered in future studies.
Original language | English |
---|---|
Publication status | Published - 18 Sept 2017 |
Event | 7th Graz BCI Conference 2017: From Vision to Reality - Graz, Austria; Duration: 18 Sept 2017 → 22 Sept 2017 |
Conference
Conference | 7th Graz BCI Conference 2017 |
---|---|
Country/Territory | Austria |
City | Graz |
Period | 18/09/17 → 22/09/17 |
Fields of Expertise
- Human- & Biotechnology
Treatment code (detailed classification)
- Basic - Fundamental (basic research)