I am interested in studying many different aspects of how we look and reach
so that we can build an optimal neural prosthetic. Specifically, my lab will study how certain areas of the brain encode movement goals to
arbitrary locations in space under complex conditions that mimic natural behaviour.
We will then attempt to decode these movement plans in real-time in order to develop devices that can be controlled by a variety of cognitive signals.
Some of our current projects involve the following:
The effects of body movements on the motor plan.
Even though eye movements are stereotyped, coupled to the hand,
and fixation tends to land on reach targets, eye movements that affect the reach plan
are problematic for prosthetic systems. Recall that we want to decode goals,
so we must make sure these goals are invariant to shifts in the reference frame.
We will use saccades and smooth pursuit eye movements to statically and dynamically
shift the location of the visual target during reaching. Because spatial updating requires
information about the ongoing eye movement, we predict that activity related to smooth
pursuit will be found in brain areas involved in spatially updating reach targets.
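As a toy illustration of the invariance problem (a minimal sketch in Python; the coordinate frames and numbers are hypothetical, and this is not a model of any particular brain area), consider a world-fixed reach goal represented in eye-centered coordinates. Every eye movement changes that representation even though the goal has not moved, so a decoder reading out a gaze-centered code must combine it with eye position to recover a stable goal:

    import numpy as np

    def to_eye_centered(target_world, eye_position):
        # Express a world-fixed target in eye-centered (gaze-relative) coordinates.
        return target_world - eye_position

    target_world = np.array([10.0, 5.0])   # world-fixed reach goal (arbitrary units)
    eye_before = np.array([0.0, 0.0])      # gaze direction before a saccade
    eye_after = np.array([8.0, 2.0])       # gaze direction after a saccade

    print(to_eye_centered(target_world, eye_before))  # [10.  5.]
    print(to_eye_centered(target_world, eye_after))   # [2. 3.]

    # The eye-centered code changes across the saccade even though the goal
    # did not move; adding eye position back recovers the invariant goal.
    print(to_eye_centered(target_world, eye_after) + eye_after)  # [10.  5.]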
Amplitude encoding in the parietal cortex.
Whenever human or non-human primates plan a reach, both the direction
and the amplitude of the reach must be specified in the motor plan
to accurately acquire targets at arbitrary locations. Both variables
must therefore be known to decode motor plans to arbitrary targets in space.
We have preliminary data showing that the parietal reach region (PRR) does carry
information about the amplitude of a reach.
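As a sketch of why both variables matter for decoding, the following Python simulates a hypothetical population of cosine-tuned cells whose gain scales with reach amplitude; this encoding is an illustrative assumption, not the measured PRR code (which is what the experiments are designed to characterize). Direction is read out with a population vector and amplitude from the mean rate, and both are needed to place the reach endpoint:

    import numpy as np

    # Hypothetical population: evenly spaced preferred directions, with the
    # gain of the cosine tuning growing with reach amplitude (assumed, not measured).
    n_cells = 64
    preferred = np.linspace(0.0, 2.0 * np.pi, n_cells, endpoint=False)
    baseline, amp_gain, dir_gain = 5.0, 1.5, 4.0

    def rates(direction, amplitude):
        # Simulated firing rates for a planned reach of the given
        # direction (radians) and amplitude (e.g. centimetres).
        return (baseline + amp_gain * amplitude
                + dir_gain * amplitude * np.cos(direction - preferred))

    def decode(r):
        # Population vector recovers direction; mean rate above baseline
        # recovers amplitude. Either alone underdetermines the endpoint.
        x = np.sum((r - r.mean()) * np.cos(preferred))
        y = np.sum((r - r.mean()) * np.sin(preferred))
        return np.arctan2(y, x), (r.mean() - baseline) / amp_gain

    true_dir, true_amp = np.deg2rad(30.0), 12.0  # a 12 cm reach at 30 degrees
    d_hat, a_hat = decode(rates(true_dir, true_amp))
    print(np.rad2deg(d_hat), a_hat)  # ~30.0, ~12.0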
Effect of reward on reaches and its relation to plasticity.
Testing new electrode arrays in rats.
Brain control trials under natural behaviour.
Effect of vestibular stimulation on motor plans.
Linearity of the vestibular system.