Neural Circuits & Dynamics
Applying information theory and statistical physics to understand how the brain encodes, compresses, and communicates information.
How much information does a population of neurons actually encode? How does the brain’s wiring shape what it can represent? My work aims to bring the theoretical machinery developed for machine learning to bear on these questions in real biological circuits.
Current projects focus on characterizing the low-dimensional structure of sensory neural codes, understanding how functional connectivity evolves in response to sensory context, and connecting macroscopic signatures (power laws, criticality) to the underlying computational principles.
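To give a concrete flavor of the dimensionality question, one common linear proxy is the participation ratio of the covariance eigenvalues, PR = (Σᵢ λᵢ)² / Σᵢ λᵢ². The sketch below is purely illustrative (synthetic data, assumed setup), not the analysis pipeline from any of the projects above: it embeds a low-dimensional latent signal in a larger simulated population and recovers the latent dimensionality.

```python
# Illustrative sketch (assumed setup, synthetic data): the participation
# ratio PR = (sum_i lambda_i)^2 / sum_i lambda_i^2 as a linear estimate
# of the dimensionality of a population code.
import numpy as np

rng = np.random.default_rng(1)

# Embed a 5-dimensional latent signal in 100 simulated "neurons"
# observed over 2000 time bins, plus small independent noise.
T, N, d = 2000, 100, 5
latents = rng.standard_normal((T, d))
mixing = rng.standard_normal((d, N))
activity = latents @ mixing + 0.1 * rng.standard_normal((T, N))

# Eigenvalues of the empirical neuron-by-neuron covariance.
eigvals = np.linalg.eigvalsh(np.cov(activity.T))

# Participation ratio: ~d when d modes dominate, ~N for isotropic noise.
pr = eigvals.sum() ** 2 / (eigvals ** 2).sum()
print(f"participation ratio: {pr:.1f}")  # typically close to the 5 latent dims
```

Linear measures like this set a baseline; the nonlinear information-theoretic characterization mentioned below is aimed at cases where the code's geometry is curved and the covariance spectrum alone is misleading.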
Key results: Characterizing measurement effects on critical scaling signatures in neural data (Frontiers in Computational Neuroscience 2025); power-law structure in empirical eigenvalue spectra of neural recordings (Entropy 2026); a nonlinear information-theoretic characterization of the dimensionality of visual neural codes.
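As a minimal illustration of the kind of spectral analysis the power-law work involves (a sketch under assumed conditions, not the published method), one can generate synthetic activity with a built-in 1/n spectrum and recover the exponent by log-log regression on the sorted covariance eigenvalues:

```python
# Illustrative sketch (synthetic data, assumed setup): fitting a
# power-law exponent to the eigenvalue spectrum of a neural covariance
# matrix via log-log linear regression over a mid-rank window.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "recording": T time bins of N independent modes whose
# variances follow n^(-alpha), so the true spectrum is a power law.
N, T, alpha = 200, 5000, 1.0
mode_sd = np.arange(1, N + 1) ** (-alpha / 2)
activity = rng.standard_normal((T, N)) * mode_sd

# Eigenvalues of the empirical covariance, sorted in decreasing order.
eigvals = np.sort(np.linalg.eigvalsh(np.cov(activity.T)))[::-1]

# Fit log(lambda_n) ~ -alpha_hat * log(n), skipping the extreme ranks
# where finite-sampling distortions are strongest.
ranks = np.arange(1, N + 1)
window = slice(4, N // 2)
slope, _ = np.polyfit(np.log(ranks[window]), np.log(eigvals[window]), 1)
alpha_hat = -slope
print(f"fitted spectral exponent: {alpha_hat:.2f}")
```

The mid-rank window matters: finite recording length biases the smallest empirical eigenvalues, which is one face of the measurement effects on critical scaling signatures discussed above.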