Can we use muscle-movement commands (tasks) from one animal's brain to elicit goal-directed limb movement in another animal? If so, could people with spinal cord damage (with no information flowing from the brain to the body) use such a neural prosthesis to carry out everyday activities? A recent study suggests that we are at the cusp of a revolution in brain–machine interfaces.
Harvard researchers Shanechi MM, Hu RC and Williams ZM used two adult rhesus monkeys (Macaca mulatta), designated as a master and an avatar. The master controlled movement based on cortically recorded neural activity, while the other, sedated monkey served as the avatar and generated movement in response to distal spinal cord and/or muscle stimulation. The conscious master had a brain chip implanted that could monitor the activity of up to 100 neurons, while the avatar had spinal implants (36 electrodes). During training, the master's physical actions were matched with the patterns of electrical activity in these neurons. The decoded activity of premotor populations (which encode the intent to move) and their adaptive responses were then used to direct the avatar's limb to distinct targets variably displayed on a screen. The sedated avatar held a joystick while the master thought about moving a cursor up and down. In 98 per cent of tests, the master correctly controlled the avatar's arm.
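To make the decoding idea concrete, here is a toy sketch (not the authors' actual decoder) of how an intended target ("up" vs "down") could be read out from a population of premotor neurons with a simple linear discriminant. All neuron counts, firing rates and weights below are illustrative assumptions.

```python
# Toy population decoder: neurons tuned to "up" or "down" fire at a
# higher mean rate when their preferred target is intended; a
# winner-take-all linear readout recovers the intent. Rates are assumed.
import random

random.seed(0)

N_NEURONS = 100  # the master's array recorded up to ~100 units
# Hypothetical tuning: each neuron prefers one of the two targets.
pref = [random.choice(("up", "down")) for _ in range(N_NEURONS)]

def simulate_spike_counts(intent):
    """Crude Poisson-like spike counts: preferred-target neurons fire
    at mean rate 20, others at 10 (illustrative values only)."""
    counts = []
    for p in pref:
        lam = 20 if p == intent else 10
        counts.append(sum(1 for _ in range(100) if random.random() < lam / 100))
    return counts

def decode(counts):
    """Linear readout: weight +1 for 'up'-preferring neurons, -1 for
    'down'-preferring ones; the sign of the score gives the intent."""
    score = sum(c if p == "up" else -c for c, p in zip(counts, pref))
    return "up" if score > 0 else "down"

# Estimate decoding accuracy over many simulated trials.
trials = 200
intents = [random.choice(("up", "down")) for _ in range(trials)]
correct = sum(decode(simulate_spike_counts(t)) == t for t in intents)
print(f"decoded {correct}/{trials} trials correctly")
```

With enough tuned neurons, even this naive readout is nearly perfect, which is why population-level recordings are so much more reliable than single cells.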
During each session, the master was seated in a primate chair placed within a radiofrequency-shielded recording enclosure. Simultaneous multiple-unit recordings were made from the master's premotor cortex using chronically implanted planar multielectrode arrays, and the signals were digitised and processed to extract action potentials in real time. The avatar was fully sedated (using a combination of ketamine, xylazine and atropine) and seated in a separate enclosure. The avatar's limb was attached to a planar, free range-of-motion, spring-loaded joystick that controlled a cursor displayed on the master's screen. Each trial began with the presentation of a small circular target positioned randomly at one of two locations on the screen. The master had to reach the displayed target by directing the cursor from the centre of the screen to the target, using stimulation-elicited limb movement in the avatar.
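The trial structure above is a closed control loop: decode the master's intent, stimulate the avatar, observe the resulting cursor movement, repeat until the target is reached. A minimal sketch of that loop, with hypothetical stand-ins for the decoder and the stimulation step (this is illustrative pseudologic, not the study's control code):

```python
# Closed-loop trial sketch: each control cycle decodes intent and maps
# it to a stimulation command that nudges the joystick-driven cursor.

def run_trial(target_pos, decode_intent, stimulate, cursor_start=0.0,
              tolerance=0.5, max_cycles=100):
    """Drive the cursor from the screen centre toward target_pos.

    decode_intent(cursor, target) -> +1/-1 : stand-in for the neural decoder.
    stimulate(direction) -> float          : stand-in for spinal/muscle
                                             stimulation; returns cursor delta.
    """
    cursor = cursor_start
    for _ in range(max_cycles):
        direction = decode_intent(cursor, target_pos)  # master's decoded intent
        cursor += stimulate(direction)                 # avatar's evoked movement
        if abs(cursor - target_pos) < tolerance:
            return True, cursor                        # target acquired
    return False, cursor                               # trial timed out

# Idealised stand-ins: a perfect decoder and a noiseless 0.3-unit step.
hit, final = run_trial(
    target_pos=5.0,
    decode_intent=lambda c, t: 1 if t > c else -1,
    stimulate=lambda d: 0.3 * d,
)
print(hit, final)
```

In the real system the decoder is noisy and the evoked movement variable, which is why the adaptive decoding the authors describe matters; this sketch only shows the loop's shape.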
These are simple movements; the control and execution of natural movements are far more complex. Moving a cursor up and down is a long way from the dexterous movement needed for daily activities. However, this proof-of-concept study shows that it is theoretically possible.
Interesting work in this field:
Pais-Vieira, M., Lebedev, M., Kunicki, C., Wang, J. & Nicolelis, M. A. A brain-to-brain interface for real-time sharing of sensorimotor information. Sci. Rep. 3, 1319 (2013).
Yoo, S. S., Kim, H., Filandrianos, E., Taghados, S. J. & Park, S. Non-invasive brain-to-brain interface (BBI): establishing functional links between two brains. PLoS One 8, e60410 (2013).
Related TED talk: Miguel Nicolelis: A monkey that controls a robot with its thoughts. No, really. (Neuronal activity related to the desire to move or perform a task can be recorded and, using computational micro-engineering, sent anywhere, i.e. to a new body such as a robotic arm or exoskeleton, to execute the task.)
Summary of the article:
Shanechi MM, Hu RC, Williams ZM. A cortical-spinal prosthesis for targeted limb movement in paralysed recipients. Nat Commun. 2014 Feb 18;5:3237.