Abstract
A better understanding of how movements are encoded in electroencephalography (EEG) signals is required to develop a more natural control for motor neuroprostheses. We decoded imagined hand-close and supination movements from seven healthy subjects and investigated the influence of the visual input. We found that motor imagination of these movements can be decoded from low-frequency time-domain EEG signals with a maximum average classification accuracy of 57.3 ± 5.0%. The simultaneous observation of congruent hand movements increased the classification accuracy to 64.1 ± 8.3%. Furthermore, the sole observation of hand movements yielded discriminable brain patterns (61.9 ± 5.5%). These findings show that for low-frequency time-domain EEG signals, the type of visual input during classifier training affects the performance and has to be considered in future studies.
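The abstract does not name the classifier used, so the following is only an illustrative sketch of the general approach described: discriminating two movement classes from low-frequency time-domain EEG features with a linear classifier. Shrinkage-regularized LDA is a common choice for such EEG decoding problems, but its use here is an assumption; the data below are synthetic stand-ins, not from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for low-frequency time-domain EEG features:
# trials x features (e.g., amplitudes from several channels/time points).
n_trials, n_features = 200, 30
class_shift = 0.4  # small class separation, mimicking near-chance EEG decoding
X0 = rng.normal(0.0, 1.0, (n_trials, n_features))          # class 0: e.g., hand close
X1 = rng.normal(class_shift, 1.0, (n_trials, n_features))  # class 1: e.g., supination
X = np.vstack([X0, X1])
y = np.r_[np.zeros(n_trials, int), np.ones(n_trials, int)]

def fit_lda(X, y, shrinkage=0.1):
    """Two-class LDA with simple covariance shrinkage toward the identity."""
    m0, m1 = X[y == 0].mean(0), X[y == 1].mean(0)
    S = np.cov(X[y == 0], rowvar=False) + np.cov(X[y == 1], rowvar=False)
    # Shrink the pooled covariance toward a scaled identity for stability.
    S = (1 - shrinkage) * S + shrinkage * (np.trace(S) / S.shape[0]) * np.eye(S.shape[0])
    w = np.linalg.solve(S, m1 - m0)   # projection weights
    b = -w @ (m0 + m1) / 2            # threshold at the midpoint of the class means
    return w, b

def predict(X, w, b):
    return (X @ w + b > 0).astype(int)

# Simple hold-out split; a study-grade pipeline would cross-validate per subject.
idx = rng.permutation(len(y))
train, test = idx[:300], idx[300:]
w, b = fit_lda(X[train], y[train])
acc = (predict(X[test], w, b) == y[test]).mean()
print(f"test accuracy: {acc:.2f}")
```

With small class separations like this, accuracies land well below ceiling, which is consistent in spirit with the 57-64% range reported in the abstract, though the numbers here are purely synthetic.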
Original language | English |
---|---|
Publication status | Published - 18 Sept 2017 |
Event | 7th Graz BCI Conference 2017: From Vision to Reality - Graz, Austria (18 Sept 2017 → 22 Sept 2017) |
Conference
Conference | 7th Graz BCI Conference 2017 |
---|---|
Country/Territory | Austria |
City | Graz |
Period | 18/09/17 → 22/09/17 |
Fields of Expertise
- Human- & Biotechnology
Treatment code (detailed classification)
- Basic - Fundamental research