The goal of research in the NT lab is to understand the contributions of sensory neuron activity to perception and behavior. The underlying premise of our research is that such understanding necessarily involves interpreting large-scale recordings in sufficiently complex (“natural”) experimental contexts, ideally ones that engage goal-directed behavior. As a result, we specialize in statistical modeling, which provides a means to directly test hypotheses about neural computations and physiological elements of the system using recorded neural activity. The scope of our work has expanded dramatically in the last few years, enabled by technological advances in both recording technology (large-scale physiological recordings) and machine learning tools. With our novel approaches now largely established, we are pursuing the experimental collaborations most likely to yield insights into fundamental aspects of sensory (and sensorimotor) brain function that have so far been largely inaccessible.
Current projects
- Cortical processing of vision
  Projects studying representations and computations in the primary visual cortex, currently with Bevil Conway (NEI) and several past collaborators.
- Natural vision as a sensorimotor process
  Projects with Jake Yates (UC Berkeley), Jude Mitchell (U Rochester), and Farran Briggs (NEI).
- Contextual modulation of sensory processing
  Projects with Hendrikje Nienborg (NEI) and Alex Huk (UCLA).
Computational methods
- Statistical models
  Building predictive computational models of recorded neurons and neural populations (a minimal illustrative sketch follows this list).
- NeuroAI: interpreting deep neural networks through the lens of biological processing
  Deep neural networks (DNNs) can solve visual tasks through solutions that resemble those of the biological brain. Without new principles for interpreting deep networks, however, we cannot fully understand either biological or artificial computation. Our approach is to constrain artificial systems to resemble biological computations, while leveraging the access gained by having a "digital twin" that can be studied in silico. As a result, projects in this area are not separate from the experiment-based projects described above.
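As a rough, self-contained illustration of what a predictive model of neural activity looks like in practice, the sketch below fits a linear-nonlinear Poisson (LNP) model, one of the simplest predictive models of a single neuron's spike counts, to simulated data. It uses only numpy/scipy; the data, variable names, and parameter choices are hypothetical placeholders and do not reflect the lab's actual modeling code or toolboxes.

```python
# Minimal sketch (illustrative only): fit a linear-nonlinear Poisson (LNP) model
# to a neuron's spike counts by maximum likelihood. All data are simulated.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Simulated data (hypothetical): T time bins, D-dimensional stimulus, spike counts y
T, D = 5000, 20
X = rng.standard_normal((T, D))                  # stimulus design matrix
k_true = 0.5 * np.sin(np.linspace(0, np.pi, D))  # "ground-truth" stimulus filter
y = rng.poisson(np.exp(X @ k_true - 1.0))        # Poisson spike counts per bin

def neg_log_likelihood(params, X, y):
    """Negative Poisson log-likelihood of the LNP model (up to a constant)."""
    k, b = params[:-1], params[-1]
    lin = np.clip(X @ k + b, -30, 30)   # clip for numerical stability
    rate = np.exp(lin)                  # predicted firing rate per bin
    return np.sum(rate - y * lin)

# Fit the stimulus filter and offset by maximum likelihood
init = np.zeros(D + 1)
fit = minimize(neg_log_likelihood, init, args=(X, y), method="L-BFGS-B")
k_hat, b_hat = fit.x[:-1], fit.x[-1]

# Compare predicted rates to observed counts (cross-validation omitted for brevity)
pred_rate = np.exp(X @ k_hat + b_hat)
print("corr(predicted rate, spike counts):",
      round(float(np.corrcoef(pred_rate, y)[0, 1]), 3))
```

Population-scale models build on this same recipe, typically adding shared nonlinear stages, regularization, and cross-validated evaluation rather than the single filter and training-data comparison shown here.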
Previous projects
- Computation in pre-cortical vision and its underlying circuits (retina and LGN)
  Projects with Josh Singer (UMD) and Jonathan Demb (Yale).
- Sensorimotor processing of touch
  Project with Scott Pluta (Purdue U).