Motor Intention Decoding from the Upper Limb by Graph Convolutional Network Based on Functional Connectivity

    https://doi.org/10.1142/S0129065721500477 (Cited by: 22, Source: Crossref)

    Decoding motor intention from noninvasively measured neural signals has recently become a hot topic in brain-computer interface (BCI) research. Motor commands for the movements of fine body parts can increase the controllable degrees of freedom and be applied to external equipment without external stimuli. In the decoding process, the classifier is one of the key factors, yet most studies have ignored the graph structure of EEG signals. In this paper, a graph convolutional network (GCN) based on functional connectivity was proposed to decode the motor intention of four fine upper-limb movements (shoulder, elbow, wrist, hand). First, event-related desynchronization was analyzed to reveal the differences among the four classes. Second, functional connectivity was constructed using synchronization likelihood (SL), phase-locking value (PLV), H index (H), mutual information (MI), and weighted phase-lag index (WPLI) to identify the electrode pairs showing discriminative differences. Subsequently, a GCN and a convolutional neural network (CNN) were applied to the functional topological structure and the time series, respectively. The results demonstrated that the proposed method achieved a decoding accuracy of up to 92.81% in the four-class task. Moreover, the combination of GCN and functional connectivity can promote the development of BCI.
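
To make the pipeline concrete, the sketch below (not the authors' code) illustrates two of the abstract's steps under simplified assumptions: building a PLV-based functional-connectivity matrix from multichannel EEG, and passing channel features through a single graph-convolution layer that uses that matrix as the adjacency. The array shapes, the 0.3 threshold, and the random weights are illustrative only.

```python
# Minimal sketch, assuming synthetic EEG of shape (n_channels, n_samples).
# PLV construction and the GCN layer follow standard formulations, not the
# paper's exact implementation.
import numpy as np
from scipy.signal import hilbert

def plv_matrix(eeg):
    """Phase-locking value between every pair of channels."""
    phase = np.angle(hilbert(eeg, axis=1))      # instantaneous phase per channel
    n = eeg.shape[0]
    plv = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            dphi = phase[i] - phase[j]
            plv[i, j] = np.abs(np.mean(np.exp(1j * dphi)))
    return plv

def gcn_layer(adj, features, weights):
    """One graph convolution: symmetrically normalized adjacency, linear map, ReLU."""
    a_hat = adj + np.eye(adj.shape[0])          # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(a_hat.sum(axis=1)))
    a_norm = d_inv_sqrt @ a_hat @ d_inv_sqrt
    return np.maximum(a_norm @ features @ weights, 0.0)

rng = np.random.default_rng(0)
eeg = rng.standard_normal((32, 1000))           # 32 channels, 1000 time samples
adj = plv_matrix(eeg)
adj = (adj > 0.3).astype(float)                 # illustrative connectivity threshold
h = gcn_layer(adj, features=eeg, weights=rng.standard_normal((1000, 16)))
print(h.shape)                                  # (32, 16) per-channel embeddings
```

In practice the output embeddings would be pooled and fed to a classifier over the four movement classes, and the other connectivity measures named in the abstract (SL, H, MI, WPLI) could be substituted for PLV when forming the adjacency.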