
Towards a Brain-Computer Interface for Dexterous Control of a Multi-Fingered Prosthetic Hand

By: Acharya, S.; Thakor, N.V.; Schieber, M.H.; Etienne-Cummings, R.; Shin, H.-C.; Tenore, F.; Aggarwal, V.

2007 / IEEE / 1-4244-0791-5


Recent advances in brain-computer interfaces (BCIs) have enabled direct neural control of robotic and prosthetic devices. However, it remains unknown whether cortical signals can be decoded in real time to replicate dexterous movements of the individual fingers and wrist. In this study, single-unit activity from 115 task-related neurons in the primary motor cortex (M1) of a trained rhesus monkey was recorded as the animal performed individuated movements of the fingers and wrist of the right hand. Virtual multi-unit ensembles, or voxels, were created by randomly selecting contiguous subpopulations of these neurons. Non-linear hierarchical filters based on artificial neural networks (ANNs) were designed to asynchronously decode the activity of multiple virtual ensembles in real time. The decoded output was then used to actuate individual fingers of a robotic hand. Average real-time decoding accuracy exceeded 95% with randomly placed voxels containing 48 neurons, and reached up to 80% with as few as 25 neurons. These results suggest that dexterous control of the individual digits and wrist of a prosthetic hand can be achieved by real-time decoding of neuronal ensembles from the M1 hand area in primates.
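To make the decoding idea concrete, the following is a minimal, self-contained sketch of the general approach described in the abstract: classifying which finger moved from the spike counts of a simulated neuronal ensemble, using a small feed-forward neural network. Everything here is an illustrative assumption, not the paper's actual method or data: the ensemble size (48 neurons), the number of movement classes (5), the Poisson firing model, and the single-hidden-layer network trained with plain gradient descent are all stand-ins for the recorded M1 voxels and hierarchical ANN filters used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not the paper's data):
N_NEURONS = 48   # one simulated "virtual ensemble" (voxel)
N_CLASSES = 5    # e.g. one movement class per finger
N_TRIALS = 400

# Each movement class drives the ensemble with a distinct mean-rate pattern.
class_tuning = rng.uniform(5.0, 30.0, size=(N_CLASSES, N_NEURONS))

labels = rng.integers(0, N_CLASSES, size=N_TRIALS)
# Jittered rates, then Poisson spike counts in a 100 ms bin.
rates = np.maximum(class_tuning[labels] + rng.normal(0, 2.0, (N_TRIALS, N_NEURONS)), 0)
X = rng.poisson(rates * 0.1).astype(float)
X = (X - X.mean(0)) / (X.std(0) + 1e-8)   # z-score each neuron's counts
Y = np.eye(N_CLASSES)[labels]             # one-hot targets

# One hidden layer, tanh activation, softmax output.
H = 32
W1 = rng.normal(0, 0.1, (N_NEURONS, H)); b1 = np.zeros(H)
W2 = rng.normal(0, 0.1, (H, N_CLASSES)); b2 = np.zeros(N_CLASSES)

def forward(X):
    h = np.tanh(X @ W1 + b1)
    logits = h @ W2 + b2
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    return h, p / p.sum(axis=1, keepdims=True)

# Full-batch gradient descent on the softmax cross-entropy loss.
lr = 0.5
for _ in range(300):
    h, p = forward(X)
    g = (p - Y) / N_TRIALS            # gradient of loss w.r.t. logits
    gW2 = h.T @ g; gb2 = g.sum(0)
    gh = (g @ W2.T) * (1 - h**2)      # backprop through tanh
    gW1 = X.T @ gh; gb1 = gh.sum(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

_, p = forward(X)
accuracy = float((p.argmax(1) == labels).mean())
print(f"training accuracy on simulated ensemble: {accuracy:.2f}")
```

In the actual study, the decoded class at each time step would drive the corresponding finger of the robotic hand; here the output is simply a predicted movement label per trial.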