Primate electrophysiology and decoding of eye-movement planning

Eye movement brain-computer interface (BCI)

This project aims to develop the first eye-movement brain-computer interface (BCI), using a novel hardware configuration and software interface that rely on neural activity from the monkey eye-movement system.

Our BCI decodes intended eye-movement direction from neural activity recorded by three 32-channel planar electrode arrays chronically implanted in the supplementary eye field (SEF), frontal eye field (FEF), and prefrontal cortex (PFC) of a non-human primate. Currently, we can anticipate the monkey’s actual eye movements by predicting its intended saccades to spatial locations on a computer screen. In other words, we can successfully produce “virtual eye movements” to on-screen targets.
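The source does not specify the decoding algorithm, so the following is only an illustrative sketch of the general idea: classifying intended saccade direction from multi-channel spike counts. The channel count (96 = 3 × 32-channel arrays), tuning means, Poisson trial model, and nearest-centroid rule are all hypothetical stand-ins, not the project's actual method.

```python
import numpy as np

rng = np.random.default_rng(0)
N_CHANNELS = 96  # three hypothetical 32-channel arrays (SEF, FEF, PFC)
DIRECTIONS = ["up", "down", "left", "right"]

# Hypothetical tuning: each direction has a distinct mean firing-rate pattern.
means = {d: rng.uniform(5, 20, N_CHANNELS) for d in DIRECTIONS}

def simulate_trial(direction):
    """Simulated Poisson spike counts around the direction's mean rates."""
    return rng.poisson(means[direction])

# "Train" one centroid per direction from simulated delay-period trials.
centroids = {d: np.mean([simulate_trial(d) for _ in range(50)], axis=0)
             for d in DIRECTIONS}

def decode(counts):
    """Nearest-centroid decode of intended saccade direction."""
    return min(DIRECTIONS, key=lambda d: np.linalg.norm(counts - centroids[d]))

# Held-out simulated trials give an accuracy estimate for this toy setup.
correct = sum(decode(simulate_trial(d)) == d
              for d in DIRECTIONS for _ in range(25))
accuracy = correct / 100
```

With well-separated synthetic tuning curves such a toy decoder classifies nearly every trial correctly; real neural data would of course be far noisier.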

The next steps in this project are:

  • To improve the learnability of the decoder by investigating the effects of providing richer feedback to the monkey, using an enhanced feedback paradigm (based on an analog representation of the decoded saccade direction, as opposed to the current discrete target representation) and a continuous feedback paradigm (in which the decoded target is displayed continuously throughout the delay period, driven by a real-time decoder).
  • To address compatibility with the Unlock Project software framework by implementing a task that more closely mimics the “up, down, left, right” control scheme used in the Unlock Project BMI.
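The contrast between the current discrete feedback and the proposed analog representation can be sketched as follows. This is a hypothetical illustration, not the project's implementation: it assumes the decoder emits a probability per direction and maps those probabilities to a cursor offset via a population-vector-style weighted average.

```python
import math

# Unit vector for each of the four Unlock-style control directions.
DIRECTION_VECTORS = {
    "up": (0.0, 1.0),
    "down": (0.0, -1.0),
    "left": (-1.0, 0.0),
    "right": (1.0, 0.0),
}

def discrete_feedback(probs):
    """Current discrete paradigm: show only the most likely target."""
    return max(probs, key=probs.get)

def analog_feedback(probs):
    """Proposed analog paradigm (hypothetical mapping): probability-weighted
    average of direction vectors gives a graded (x, y) cursor offset."""
    x = sum(p * DIRECTION_VECTORS[d][0] for d, p in probs.items())
    y = sum(p * DIRECTION_VECTORS[d][1] for d, p in probs.items())
    return x, y
```

Under this mapping an uncertain decode (e.g. 70% "up", 10% each elsewhere) moves the cursor only partway upward, whereas the discrete scheme would jump all the way to the "up" target; that graded error signal is the kind of richer feedback the enhanced paradigm is meant to provide.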

Full compatibility of the eye-movement BMI with the Unlock Project software will provide a faster, more accurate alternative to our current EEG-based system for locked-in patients who are willing to undergo neurosurgery for electrode implantation.

Main personnel: Misha Panko, Scott Brincatt (MIT), Nan Jia
Collaborators: Earl Miller (MIT), Frank Guenther, Jon Brumberg
Funding: CELEST: Developing new microelectrodes and analysis techniques for use in Brain Machine Interface applications