Academic Paper

Unsupervised adaptive transfer learning for Steady-State Visual Evoked Potential brain-computer interfaces
Document Type
Conference
Source
2016 IEEE International Conference on Systems, Man, and Cybernetics (SMC), pp. 4135-4140, Oct. 2016
Subject
Computing and Processing
Robotics and Control Systems
Transportation
Electroencephalography
Correlation
Training data
Calibration
Standards
Visualization
Conferences
Language
English
Abstract
Recent advances in signal processing for the detection of Steady-State Visual Evoked Potentials (SSVEPs) have moved away from traditional calibrationless methods, such as canonical correlation analysis (CCA), and toward algorithms that require substantial training data. In general, this has improved detection rates, but SSVEP-based brain-computer interfaces (BCIs) now suffer from the requirement of costly calibration sessions. Here, we address this issue by applying transfer learning techniques to SSVEP detection. Our novel Adaptive-C3A method incorporates an unsupervised adaptation algorithm that requires no calibration data. Our approach learns SSVEP templates for the target user and provides robust class separation in feature space, leading to increased classification accuracy. Our method achieves significant improvements in performance over a standard CCA method, as well as over a transfer variant of the state-of-the-art Combined-CCA method for calibrationless SSVEP detection.
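
For context, the standard CCA baseline named in the abstract correlates multi-channel EEG with sine/cosine reference signals at each candidate stimulus frequency and selects the frequency yielding the highest canonical correlation. Below is a minimal sketch of that baseline (not of the paper's Adaptive-C3A or Combined-CCA methods); the sampling rate, stimulus frequencies, harmonic count, and synthetic test data are illustrative assumptions, not details taken from the paper.

# Minimal sketch of standard CCA-based SSVEP detection (the calibrationless
# baseline mentioned in the abstract). All numeric settings below are assumptions.
import numpy as np
from sklearn.cross_decomposition import CCA

def make_reference(freq, fs, n_samples, n_harmonics=2):
    """Build sine/cosine reference signals for one stimulus frequency."""
    t = np.arange(n_samples) / fs
    refs = []
    for h in range(1, n_harmonics + 1):
        refs.append(np.sin(2 * np.pi * h * freq * t))
        refs.append(np.cos(2 * np.pi * h * freq * t))
    return np.stack(refs, axis=1)            # shape: (n_samples, 2 * n_harmonics)

def cca_classify(eeg, freqs, fs):
    """Return the stimulus frequency whose reference maximizes canonical correlation.

    eeg: array of shape (n_samples, n_channels), one multi-channel trial.
    """
    scores = []
    for f in freqs:
        Y = make_reference(f, fs, eeg.shape[0])
        cca = CCA(n_components=1)
        u, v = cca.fit_transform(eeg, Y)      # first pair of canonical variates
        scores.append(np.corrcoef(u[:, 0], v[:, 0])[0, 1])
    return freqs[int(np.argmax(scores))], scores

# Usage example with synthetic data: a 12 Hz SSVEP-like component plus noise.
fs, n_samples = 250, 1000                     # assumed sampling rate and window length
freqs = [8.0, 10.0, 12.0, 15.0]               # assumed stimulus frequencies
t = np.arange(n_samples) / fs
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 12.0 * t)[:, None] + 0.5 * rng.standard_normal((n_samples, 8))
print(cca_classify(eeg, freqs, fs)[0])        # expected output: 12.0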