
Publication

Navigating a Smart Wheelchair with a Brain-Computer Interface Interpreting Steady-State Visual Evoked Potentials

Christian Mandel; Thorsten Lüth; Tim Laue; Thomas Röfer; Axel Gräser; Bernd Krieg-Brückner
In: Proceedings of the 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS-2009), October 11-15, St. Louis, Missouri, USA, IEEE Xplore, 2009.

Abstract

In order to allow severely disabled people who cannot move their arms and legs to steer an automated wheelchair, this work proposes the combination of a non-invasive EEG-based human-robot interface and an autonomous navigation system that safely executes the issued commands. The robust classification of steady-state visual evoked potentials in brain activity allows for the seamless projection of qualitative directional navigation commands onto a frequently updated route graph representation of the environment. The deduced metrical target locations are then approached using an extended version of the well-established Nearness Diagram Navigation method. The applicability of the proposed system is demonstrated in a real-world pilot study in which nine untrained subjects successfully navigated an automated wheelchair after only about ten minutes of preparation.
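
The sketch below illustrates the kind of projection the abstract describes: a qualitative directional command (as would be obtained from SSVEP classification) is mapped onto a route graph by selecting the node whose bearing in the robot frame best matches the commanded direction; that node's metric position then serves as the target for the local planner. This is a minimal illustration under assumed data structures (`Node`, `select_target`, `COMMAND_BEARINGS` are hypothetical names), not the paper's actual implementation.

```python
import math
from dataclasses import dataclass

@dataclass
class Node:
    node_id: int
    x: float  # metric position in the map frame [m]
    y: float

# Nominal headings (robot frame) assumed for four SSVEP-driven commands.
COMMAND_BEARINGS = {
    "forward": 0.0,
    "left": math.pi / 2,
    "back": math.pi,
    "right": -math.pi / 2,
}

def select_target(nodes, robot_x, robot_y, robot_theta, command):
    """Project a qualitative directional command onto the route graph:
    return the node whose bearing relative to the robot's heading is
    closest to the commanded direction."""
    desired = COMMAND_BEARINGS[command]
    best_node, best_error = None, float("inf")
    for n in nodes:
        dx, dy = n.x - robot_x, n.y - robot_y
        if dx == 0.0 and dy == 0.0:
            continue  # skip the node the robot is standing on
        # Bearing of the node in the robot frame, normalised to (-pi, pi].
        bearing = math.atan2(dy, dx) - robot_theta
        bearing = math.atan2(math.sin(bearing), math.cos(bearing))
        # Angular distance to the commanded direction, with wrap-around.
        error = abs(bearing - desired)
        error = min(error, 2 * math.pi - error)
        if error < best_error:
            best_node, best_error = n, error
    return best_node  # metric target handed to the local navigation method

if __name__ == "__main__":
    graph = [Node(0, 2.0, 0.0), Node(1, 0.0, 2.0), Node(2, -2.0, 0.0)]
    target = select_target(graph, 0.0, 0.0, 0.0, "left")
    print(target)  # Node 1, roughly 90 degrees to the robot's left
```

In the paper's pipeline, the resulting metric target would then be passed to the (extended) Nearness Diagram Navigation method for collision-free local motion; that stage is not sketched here.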