Special Section on Clinical Near-Infrared Spectroscopy and Imaging

Toward more intuitive brain–computer interfacing: classification of binary covert intentions using functional near-infrared spectroscopy

Author Affiliations
Han-Jeong Hwang

Kumoh National Institute of Technology, Department of Medical IT Convergence Engineering, 61 Daehak-ro, Gumi, Gyeongbuk 730-701, Republic of Korea

Han Choi, Jeong-Youn Kim, Won-Du Chang, Chang-Hwan Im

Hanyang University, Department of Biomedical Engineering, 222 Wangsimni-ro, Seongdong-gu, Seoul 04763, Republic of Korea

Do-Won Kim

Hanyang University, Department of Biomedical Engineering, 222 Wangsimni-ro, Seongdong-gu, Seoul 04763, Republic of Korea

Berlin Institute of Technology, Machine Learning Group, Marchstraße 23, 10587 Berlin, Germany

Kiwoong Kim

Korea Research Institute of Standard and Science, 267 Gajeong-ro, Yuseong-gu, Daejeon 34113, Republic of Korea

Sungho Jo

Korea Advanced Institute of Science and Technology, Department of Computer Science, Daehak-ro, Yuseong-gu, Daejeon 34141, Republic of Korea

J. Biomed. Opt. 21(9), 091303 (Apr 05, 2016). doi:10.1117/1.JBO.21.9.091303
History: Received November 5, 2015; Accepted March 7, 2016

Abstract.  In traditional brain–computer interface (BCI) studies, binary communication systems have generally been implemented using two mental tasks arbitrarily assigned to “yes” or “no” intentions (e.g., mental arithmetic calculation for “yes”). A recent pilot study performed with one paralyzed patient showed the possibility of a more intuitive paradigm for binary BCI communications, in which the patient’s internal yes/no intentions were directly decoded from functional near-infrared spectroscopy (fNIRS). We investigated whether such an “fNIRS-based direct intention decoding” paradigm can be reliably used for practical BCI communications. Eight healthy subjects participated in this study, and each participant was administered 70 disjunctive questions. Brain hemodynamic responses were recorded using a multichannel fNIRS device while the participants were internally expressing “yes” or “no” intentions to each question. Different feature types, feature numbers, and time window sizes were tested to investigate optimal conditions for classifying the internal binary intentions. About 75% of the answers were correctly classified when the individual best feature set was employed (75.89±1.39% and 74.08±2.87% for oxygenated and deoxygenated hemoglobin responses, respectively), which was significantly higher than the random chance level (68.57% at p<0.001). The kurtosis feature showed the highest mean classification accuracy among all feature types. The grand-averaged hemodynamic responses showed that broad brain regions are associated with the processing of binary covert intentions. Our experimental results demonstrated that direct decoding of internal binary intentions has the potential to be used for implementing more intuitive and user-friendly communication systems for patients with motor disabilities.
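The reported chance level follows from a binomial null model: with 70 yes/no questions per subject and two equiprobable answers, the lowest accuracy that reaches significance at p<0.001 is 48/70 ≈ 68.57%, which matches the figure in the abstract, assuming the threshold was obtained as a binomial percentile. The sketch below (plain Python, not the authors' code) reproduces that threshold and also shows the excess-kurtosis computation underlying the best-performing feature type; the exact windowing and feature pipeline of the study are not specified here and remain assumptions.

```python
from math import comb

def excess_kurtosis(x):
    """Excess kurtosis of one fNIRS time window (population moments).
    Values near 0 indicate a Gaussian-like window; peaked hemodynamic
    responses deviate, which is what makes this usable as a feature."""
    n = len(x)
    m = sum(x) / n
    m2 = sum((v - m) ** 2 for v in x) / n   # variance
    m4 = sum((v - m) ** 4 for v in x) / n   # fourth central moment
    return m4 / m2 ** 2 - 3.0

def chance_threshold(n_trials, alpha=0.001, p=0.5):
    """Smallest accuracy significant at level alpha under a binomial
    null of random guessing (inverse-CDF / percentile approach)."""
    cdf = 0.0
    for k in range(n_trials + 1):
        cdf += comb(n_trials, k) * p ** k * (1 - p) ** (n_trials - k)
        if cdf >= 1.0 - alpha:
            return k / n_trials
    return 1.0

print(round(chance_threshold(70) * 100, 2))  # -> 68.57
```

With 70 trials, the loop stops at k=48 (P(X ≤ 48) ≥ 0.999 under random guessing), so 48/70 ≈ 68.57% is the significance threshold against which the ~75% accuracies were compared.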

© 2016 Society of Photo-Optical Instrumentation Engineers


