Decoding Behavior Tasks From Brain Activity Using Deep Transfer Learning

Bibliographic Details
Published in: IEEE Access, Vol. 7, pp. 43222-43232
Main Authors: Gao, Yufei, Zhang, Yameng, Wang, Hailing, Guo, Xiaojuan, Zhang, Jiacai
Format: Journal Article
Language: English
Published: Piscataway: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 2019
Description
Summary: Recently, advances in noninvasive detection techniques have shown that it is possible to decode visual information from measurable brain activity. However, these studies typically focused on the mapping between neural activity and visual information, such as an image or video stimulus, at the individual level. Here, common decoding models that classify behavior tasks from brain signals across individuals were investigated. We proposed a cross-subject decoding approach using deep transfer learning (DTL) to decipher behavior tasks from functional magnetic resonance imaging (fMRI) recordings acquired while subjects performed different tasks. We connected parts of state-of-the-art networks pre-trained on the ImageNet dataset to our adaptation layers to classify the behavior tasks from fMRI data. Our experiments on the Human Connectome Project (HCP) dataset showed that the proposed method achieved higher cross-subject decoding accuracy than previous studies. We also conducted an experiment on five subsets of the HCP data, which further demonstrated that our DTL approach is more effective on small datasets than traditional methods.
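The record does not include the authors' implementation, but the transfer-learning setup the abstract describes (a pre-trained ImageNet backbone feeding newly defined adaptation layers) can be sketched. The following is a minimal, hypothetical PyTorch sketch, not the paper's actual architecture: it assumes a ResNet-18 backbone (the abstract says only "parts of state-of-the-art networks"), fMRI activity maps rendered as 3-channel 224x224 images, and the seven HCP task paradigms as class labels; the layer sizes and dropout rate are illustrative.

```python
import torch
import torch.nn as nn
from torchvision import models

class TaskDecoder(nn.Module):
    """Transfer-learning decoder: frozen ImageNet backbone + trainable adaptation head."""

    def __init__(self, num_tasks: int):
        super().__init__()
        # Pre-trained convolutional backbone (ResNet-18 chosen here as a
        # stand-in; the paper does not commit to this exact network).
        backbone = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
        # Drop the final ImageNet classification layer, keep conv stack + avgpool.
        self.features = nn.Sequential(*list(backbone.children())[:-1])
        for p in self.features.parameters():
            p.requires_grad = False  # keep the ImageNet weights fixed
        # Adaptation layers mapping 512-d image features to behavior-task logits
        # (sizes are assumptions, not taken from the paper).
        self.adapt = nn.Sequential(
            nn.Flatten(),
            nn.Linear(512, 128),
            nn.ReLU(),
            nn.Dropout(0.5),
            nn.Linear(128, num_tasks),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: fMRI activity maps rendered as images, shape (N, 3, 224, 224)
        return self.adapt(self.features(x))

model = TaskDecoder(num_tasks=7)  # assuming the 7 HCP task-fMRI paradigms
logits = model(torch.randn(4, 3, 224, 224))
print(logits.shape)  # torch.Size([4, 7])
```

Freezing the backbone and training only the small adaptation head is the standard transfer-learning recipe, and is consistent with the abstract's finding that DTL helps most on small datasets, where training a deep network from scratch would overfit.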
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2019.2907040