AI for Impact

Deep Networks for Decoding Natural Images from Retinal Signals

Neural Networks for Efficient Bayesian Decoding of Natural Images from Retinal Neurons

NIPS 2017 · CCN 2017

Advisor: Dr. Liam Paninski

Nikhil Parthasarathy*, Eleanor Batty*, William Falcon, Thomas Rutten, Mohit Rajpal, E.J. Chichilnisky, Liam Paninski




Decoding sensory stimuli from neural signals can reveal how we sense our physical environment, and is valuable for the design of brain-machine interfaces. However, existing linear techniques for neural decoding may not fully reveal or exploit the fidelity of the neural signal. Here we develop a new approximate Bayesian method for decoding natural images from the spiking activity of populations of retinal ganglion cells (RGCs).

We sidestep known computational challenges with Bayesian inference by exploiting artificial neural networks developed for computer vision, enabling fast nonlinear decoding that incorporates natural scene statistics implicitly. Our decoder architecture first linearly reconstructs an image from RGC spikes, then applies a convolutional autoencoder to enhance the image. The resulting decoder, trained on natural images and simulated neural responses, significantly outperforms linear decoding, as well as simple point-wise nonlinear decoding.
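The two-stage pipeline can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation: the dimensions, the random stand-in retinal encoding, and the fixed smoothing kernel (standing in for the learned convolutional autoencoder) are all hypothetical assumptions for the sake of a runnable example. Only the structure is faithful: stage 1 fits a linear decoder from spike counts to pixels by regularized least squares; stage 2 applies a convolutional enhancement to the linear reconstruction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (hypothetical; the paper uses natural-image patches and
# simulated responses from large RGC populations).
n_cells, h, w = 50, 8, 8
n_train, n_pix = 500, h * w

# Simulated training data: images X and spike counts R produced by a
# random linear stand-in for the retinal encoding (an assumption).
X = rng.random((n_train, n_pix))
encoding = rng.random((n_pix, n_cells))
R = X @ encoding + 0.1 * rng.standard_normal((n_train, n_cells))

# Stage 1: linear decoder W fit by ridge-regularized least squares, X ~ R @ W.
lam = 1e-2
W = np.linalg.solve(R.T @ R + lam * np.eye(n_cells), R.T @ X)

def linear_decode(r):
    """Reconstruct an (h x w) image from one spike-count vector."""
    return (r @ W).reshape(h, w)

# Stage 2 (stand-in): the paper trains a convolutional autoencoder to enhance
# the linear reconstruction; a fixed 3x3 averaging kernel here merely
# illustrates the convolutional post-processing structure.
def enhance(img, k=np.full((3, 3), 1.0 / 9.0)):
    pad = np.pad(img, 1, mode="edge")
    out = np.zeros_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = (pad[i:i + 3, j:j + 3] * k).sum()
    return out

recon = enhance(linear_decode(R[0]))
```

In the actual system, `enhance` is replaced by a convolutional autoencoder trained end to end on natural images, which is what lets the decoder exploit natural scene statistics that a purely linear map cannot capture.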

These results provide a tool for the assessment and optimization of retinal prosthesis technologies, and reveal that the retina may provide a more accurate representation of the visual scene than previously appreciated.