Yichen Wang
PhD Student, PhD Program in Computational Neuroscience
Grossman Center for Quantitative Biology and Human Behavior
Department of Neurobiology
The University of Chicago
5812 South Ellis Ave. MC 0912, Suite P-400, Chicago, IL 60637
ude [tod] ogacihcu [ta] wnehciy

I study how cellular biophysics constrains correlation and variability in neuronal networks, advised by Dr. Brent Doiron. As an undergraduate, I worked at the Flatiron Institute, mentored by Dr. Chi-Ning Chou and Dr. SueYeon Chung. I received my B.S. degrees in Physics and in Mathematics with a Specialization in Computing from UCLA.

Publications · Posters · Code  |  Google Scholar · GitHub · X · Bluesky · LinkedIn

Publications (also on Google Scholar)

Chi-Ning Chou*, Hang Le*, Yichen Wang & SueYeon Chung (2025). Feature Learning beyond the Lazy-Rich Dichotomy: Insights from Representational Geometry. International Conference on Machine Learning (ICML) 2025. Spotlight Presentation. [ arXiv | conference | abstract | bibtex ]

Integrating task-relevant information into neural representations is a fundamental ability of both biological and artificial intelligence systems. Recent theories have categorized learning into two regimes: the rich regime, where neural networks actively learn task-relevant features, and the lazy regime, where networks behave like random feature models. Yet this simple lazy-rich dichotomy overlooks a diverse underlying taxonomy of feature learning, shaped by differences in learning algorithms, network architectures, and data properties. To address this gap, we introduce an analysis framework to study feature learning via the geometry of neural representations. Rather than inspecting individual learned features, we characterize how task-relevant representational manifolds evolve throughout the learning process. We show, in both theoretical and empirical settings, that as networks learn features, task-relevant manifolds untangle, with changes in manifold geometry revealing distinct learning stages and strategies beyond the lazy-rich dichotomy. This framework provides novel insights into feature learning across neuroscience and machine learning, shedding light on structural inductive biases in neural circuits and the mechanisms underlying out-of-distribution generalization.
@inproceedings{chou2025feature,
  title={Feature Learning beyond the Lazy-Rich Dichotomy: Insights from Representational Geometry},
  author={Chou, Chi-Ning and Le, Hang and Wang, Yichen and Chung, SueYeon},
  booktitle={International Conference on Machine Learning (ICML)},
  year={2025}
}

Posters

Chi-Ning Chou*, Hang Le*, Yichen Wang & SueYeon Chung (2024). Understanding Feature Learning in Neural Networks via Manifold Capacity and Effective Geometry. Cognitive and Computational Neuroscience Conference (CCN) 2024. [ pdf ]

Yichen Wang, Chi-Ning Chou, Artem Kirsanov & SueYeon Chung (2024). Studying How Neural Networks Learn Computationally Hierarchical Representations Using Manifold Capacity. Flatiron Institute Summer Poster Session 2024. [ pdf ]

Yichen Wang, Chi-Ning Chou & SueYeon Chung (2023). Exploring the Geometry of Neural Manifolds with Latent Structures. Flatiron Institute Summer Poster Session 2023. [ pdf ]

Y. Wang, S. Brosler, P. Ahmadi, T. Kuliński, S. Sarkar, T. Suresh & H. Yan (2022). Frequency-Specific Features Representing Categorical Face Perception. Neuromatch Academy Computational Neuroscience Summer Program 2022.

Yichen Wang, Gabriel Ordonez, David Murillo, Alejandro Gutierrez, Cristian Chavez, Diego Espino, Mira Khosla, Kyle Tsujimoto, Aaron Blaisdell & Katsushi Arisaka (2022). Correlation between Alpha Wave Frequency and Eccentricity-Dependent Reaction Time. UCLA Undergraduate Research Week 2022. [ pdf ]

Code

See GitHub for projects and code.