Abstract

Brain–Computer Interfaces (BCIs) interpret brain activity to provide people with disabilities with an alternative or augmentative interaction path. BCIs can therefore be regarded as an enabling technology in many fields, including the control of Active and Assisted Living (AAL) systems. Indeed, interaction barriers could be removed, enabling users with severe motor impairments to gain control over a wide range of AAL features. In this paper, a cost-effective BCI solution, targeted at (but not limited to) AAL system control, is presented. A custom hardware module is briefly reviewed, while the signal processing techniques are covered in more depth. Steady-state visual evoked potentials (SSVEPs) are exploited in this work as the operating BCI protocol. In contrast with most common SSVEP-BCI approaches, we propose a prediction confidence indicator, which is shown to improve overall classification accuracy. The confidence indicator is derived without any subject-specific procedure and is stable across users: it can thus be defined once and then shared between different persons, enabling a form of plug-and-play interaction. Furthermore, by modelling rest/idle periods with the confidence indicator, active control periods can be detected and separated from “background activity”, which is essential for real-time, self-paced operation. Finally, the indicator also allows the most appropriate observation window length to be chosen dynamically, improving the system’s responsiveness and the user’s comfort. Good results are achieved under such operating conditions, including, for instance, a false positive rate of 0.16 min⁻¹, which outperforms current literature findings.
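To make the idea concrete, the following is a minimal, illustrative sketch of an SSVEP classifier with a confidence indicator and idle-state rejection. It is not the paper's exact method: the stimulus frequencies, the variance-explained score (a simple stand-in for the CCA correlation commonly used in SSVEP work), the ratio-based confidence, and the threshold value are all assumptions introduced for illustration.

```python
import numpy as np

FS = 250                               # sampling rate in Hz (assumed)
STIM_FREQS = [8.0, 10.0, 12.0, 15.0]   # stimulus frequencies (assumed)
IDLE_THRESHOLD = 1.3                   # confidence ratio below which we report idle (assumed)

def reference_set(freq, n_samples, n_harmonics=2):
    """Sine/cosine references at the stimulus frequency and its harmonics."""
    t = np.arange(n_samples) / FS
    refs = []
    for h in range(1, n_harmonics + 1):
        refs.append(np.sin(2 * np.pi * h * freq * t))
        refs.append(np.cos(2 * np.pi * h * freq * t))
    return np.column_stack(refs)       # shape: (n_samples, 2 * n_harmonics)

def score(eeg, freq):
    """Fraction of signal variance explained by the reference set
    (a simple proxy for the CCA correlation used in SSVEP pipelines)."""
    Y = reference_set(freq, len(eeg))
    coef, *_ = np.linalg.lstsq(Y, eeg, rcond=None)
    fitted = Y @ coef
    return float(np.dot(fitted, fitted) / np.dot(eeg, eeg))

def classify(eeg):
    """Return (stimulus frequency or None, confidence); None means idle/rest."""
    ranked = sorted(((score(eeg, f), f) for f in STIM_FREQS), reverse=True)
    best, runner_up = ranked[0], ranked[1]
    confidence = best[0] / max(runner_up[0], 1e-12)
    if confidence < IDLE_THRESHOLD:
        return None, confidence        # low confidence: treat as background activity
    return best[1], confidence
```

With this kind of indicator, the system can start from a short observation window and extend it only until the confidence exceeds the threshold, which is one way to realize the dynamic window-length selection described above.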
