
A face-machine interface utilizing EEG artifacts from a neuroheadset for simulated wheelchair control

Open Access
Jul 2021

Abstract

Many people suffer from movement disabilities and would benefit from an assistive mobility device with practical control. This paper demonstrates a face-machine interface system that uses motion artifacts in electroencephalogram (EEG) signals for mobility enhancement in people with quadriplegia. We employed an Emotiv EPOC X neuroheadset to acquire the EEG signals. With the proposed system, we evaluated the preprocessing approach, feature extraction algorithms, and control modalities. By incorporating eye winks and jaw movements, the system achieved an average accuracy of 96.9% across four commands. Moreover, online control of a simulated power wheelchair showed high efficiency in terms of task completion time. The combination of winking and jaw chewing yields a steering time on the same order of magnitude as joystick-based control, though still about twice as long. We will further improve the efficiency and implement the proposed face-machine interface system on a real power wheelchair.

Language: English
Page range: 1 - 10
Submitted on: Apr 15, 2021
Published on: Jul 28, 2021
Published by: Professor Subhas Chandra Mukhopadhyay
In partnership with: Paradigm Publishing Services
Publication frequency: 1 issue per year

© 2021 Theerat Saichoo, Poonpong Boonbrahm, Yunyong Punsawad, published by Professor Subhas Chandra Mukhopadhyay
This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License.