A summary of recent research projects at the intersection of biology, learning, and computation.

In-context denoising with one-layer transformers

In-context learning refers to the ability of models to generalize from examples presented only at inference time — without parameter updates. We study this behavior through the lens of associative memory, where a system retrieves stored patterns based on partial input.

ICML 2025 - Selected for Spotlight Presentation
We introduce in-context denoising as a probe task to analyze how attention heads process information. We connect this behavior to associative memory (Hopfield networks and their modern, dense variants), with the trained attention mechanism acting as a single gradient step in a context-dependent associative memory landscape.
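The retrieval view can be made concrete in a few lines. The sketch below is an illustration of the general idea, not the paper's trained model: the context tokens act as stored patterns, and one softmax attention step — the same computation as a single update of a modern (dense) Hopfield network — denoises a corrupted query. The inverse temperature `beta` and all sizes are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Context: N stored patterns of dimension d, acting as both keys and values.
N, d = 8, 64
patterns = rng.standard_normal((N, d))
patterns /= np.linalg.norm(patterns, axis=1, keepdims=True)

# Query: a noise-corrupted copy of one stored pattern.
query = patterns[3] + 0.2 * rng.standard_normal(d)

# One softmax attention step at inverse temperature beta -- equivalent to
# a single update of a modern (dense) Hopfield network.
beta = 10.0
scores = beta * patterns @ query           # attention logits
weights = np.exp(scores - scores.max())
weights /= weights.sum()                   # softmax over the context
denoised = weights @ patterns              # attention readout

err_before = np.linalg.norm(query - patterns[3])
err_after = np.linalg.norm(denoised - patterns[3])
print(err_before, err_after)               # retrieval reduces the error
```

With sharp enough attention, the output collapses onto the nearest stored pattern; lowering `beta` instead blends patterns, which is the usual trade-off in dense associative memories.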


Minimal motifs for habituating systems

Habituation is a basic form of non-associative learning that enables organisms to suppress repetitive stimuli — yet it lacks a broadly accepted mathematical framework.

PNAS, 2024
We characterize this phenomenon and its canonical hallmarks mathematically, and identify the simplest dynamical circuits (state-space models) that can reproduce them.
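As a toy illustration of the state-space idea (our own minimal sketch, not the circuit identified in the paper), a two-variable discrete-time system with one slow "memory" state already shows the core hallmarks: a shrinking response to repeated stimulation and recovery after rest. The parameters `alpha` and `beta` are arbitrary illustrative values.

```python
import numpy as np

def habituate(u, alpha=0.2, beta=0.02):
    """Two-variable habituation motif: a slow memory m gates the response y."""
    m, ys = 0.0, []
    for u_t in u:
        ys.append(u_t / (1.0 + m))   # response shrinks as memory accumulates
        m += alpha * u_t - beta * m  # memory builds under stimulation, decays at rest
    return np.array(ys)

pulse = [1.0] + [0.0] * 19           # one unit pulse every 20 time steps
u = np.array(pulse * 10 + [0.0] * 400 + pulse)
y = habituate(u)

peaks = y[u > 0]                     # response at each of the 11 pulses
print(peaks)                         # decrement over the train, recovery after rest
```

The first ten peaks decrease monotonically toward a plateau, and the final pulse, delivered after a long rest, evokes a near-full response again — the signature that distinguishes habituation from fatigue.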

IEEE Conference on Decision and Control (CDC), 2024
We demonstrate that this elementary form of learning, observed across living systems, can be implemented in a simple analog (electrical) circuit.
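To illustrate how a passive electrical element can show a habituation-like response decrement (a generic textbook example; the circuit in the paper may differ), the forward-Euler simulation below treats a high-pass RC filter as the adapting element: the capacitor slowly charges toward a periodic stimulus, so the output taken across the resistor shrinks with each successive presentation.

```python
import numpy as np

R, C, dt = 1e3, 1e-3, 1e-4           # 1 kOhm, 1 mF -> time constant tau = R*C = 1 s
tau = R * C

n = 50000                            # 5 s of simulation at dt = 1e-4 s
steps_per_period = 10000             # 1 s stimulus period, 50% duty cycle
u = ((np.arange(n) % steps_per_period) < steps_per_period // 2).astype(float)

v_c = np.zeros(n)                    # capacitor voltage
for i in range(1, n):
    # forward-Euler step of dv_c/dt = (u - v_c) / tau
    v_c[i] = v_c[i - 1] + dt * (u[i - 1] - v_c[i - 1]) / tau
y = u - v_c                          # output voltage across the resistor

onsets = [k * steps_per_period for k in range(5)]
print([round(float(y[i]), 3) for i in onsets])   # onset response shrinks each cycle
```

Because the capacitor never fully discharges between pulses, the accumulated charge plays the role of the memory variable in the state-space picture above.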


Growing networks of coupled oscillators

In biological systems, cellular division is governed by internal oscillations and modulated by interactions with neighboring cells. Motivated by this process, we propose a new class of growing dynamical systems where individual oscillators replicate upon completing a cycle, recursively shaping the collective network.

Development, 2023
We introduce a minimal model in which coupled relaxation oscillators split in two upon completing each full cycle, generating structured networks of interacting oscillators. Despite having only three tunable parameters, the system reproduces the majority of germline cyst structures observed across invertebrates, offering a dynamical-systems perspective on multicellular self-organization.
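A minimal version of such a growth rule can be sketched as follows (an illustrative reimplementation of the general idea; the Kuramoto-style coupling, parameter values, and division-noise model are our own assumptions, not the paper's). Each oscillator advances its phase, is weakly pulled toward its network neighbors, and spawns a linked daughter whenever it completes a cycle — so the division history itself builds a tree-structured network.

```python
import numpy as np

rng = np.random.default_rng(1)

phases = [0.0]                        # founding oscillator
omegas = [1.0]                        # intrinsic angular rates
edges = []                            # (mother, daughter) coupling links
dt, K, n_max = 0.01, 0.1, 16

while len(phases) < n_max:
    # Kuramoto-style step: linked oscillators pull on each other's phase.
    new_phases = phases.copy()
    for i, j in edges:
        new_phases[i] += dt * K * np.sin(phases[j] - phases[i])
        new_phases[j] += dt * K * np.sin(phases[i] - phases[j])
    for i in range(len(phases)):
        new_phases[i] += dt * omegas[i]
    phases = new_phases
    # An oscillator that completes a cycle divides into mother + daughter.
    for i in range(len(phases)):
        if phases[i] >= 2 * np.pi:
            phases[i] -= 2 * np.pi
            phases.append(phases[i])             # daughter inherits the phase
            omegas.append(omegas[i] * (1 + 0.05 * rng.standard_normal()))
            edges.append((i, len(phases) - 1))

# Each division adds one oscillator and one link: a tree with n - 1 edges.
print(len(phases), len(edges))
```

Varying the coupling strength and the heterogeneity of the daughters' rates changes whether divisions stay synchronized, which is what shapes the resulting network topology.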


Collective gene-expression patterns in multicellular systems

How do multicellular systems, which are made up of genetically identical cells, achieve and maintain diverse patterned states while remaining responsive to intercellular and environmental signals?

Cell Reports Physical Science, 2023
We develop a model of interacting cell types based on gene regulatory dynamics and intercellular communication. Cells are represented as associative memory networks, and tissues as graphs of identical cells. The model reveals self-organized behaviors — including multi-stability of tissue-level attractors — emerging from internal gene regulation and cell-cell interactions.
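The sketch below illustrates the general construction (an assumed simplified form, not the paper's exact model): every cell runs the same Hopfield network storing a few cell-type patterns, the cells sit on a tissue graph (here a 1-D chain), and each update combines a cell's internal recurrent field with the signed expression state of its neighbors.

```python
import numpy as np

rng = np.random.default_rng(0)
g, n_types = 32, 3
types = rng.choice([-1, 1], size=(n_types, g))       # stored cell-type patterns
W = types.T @ types / g                              # Hebbian weights, shared by all cells
np.fill_diagonal(W, 0)

# A 1-D "tissue": six cells in a chain, each coupled to its neighbors.
n_cells = 6
neighbors = {i: [j for j in (i - 1, i + 1) if 0 <= j < n_cells]
             for i in range(n_cells)}
states = types[rng.integers(n_types, size=n_cells)].astype(float)
states += 0.5 * rng.standard_normal(states.shape)    # noisy initial expression

lam = 0.3                                            # cell-cell coupling strength
for _ in range(50):
    for i in range(n_cells):
        field = W @ np.sign(states[i])               # internal gene regulation
        field += lam * np.mean([np.sign(states[j])   # signal from neighbors
                                for j in neighbors[i]], axis=0)
        states[i] = np.sign(field)

# Each cell settles near one of the stored cell-type attractors.
overlaps = np.abs(np.sign(states) @ types.T) / g     # |overlap| with each type
print(overlaps.max(axis=1))
```

Raising `lam` biases neighboring cells toward matching types; at low `lam` each cell relaxes to its own attractor, which is one way tissue-level multi-stability can appear in such models.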


Connecting Hopfield networks to Restricted Boltzmann Machines

ICLR 2021 - Selected for Oral Presentation (top 2%)
We formally show that Hopfield networks with correlated patterns can be mapped to Restricted Boltzmann Machines (RBMs) with orthogonal weights. Experiments on the MNIST dataset demonstrate that this mapping provides a useful deterministic initialization to the RBM weights. Furthermore, the connection to interpretable Hopfield networks offers a lens for understanding the learning dynamics of RBMs.