Sarthak Chandra

@sarthakc

Interested in neuroscience, development and dynamical systems | Faculty member @ ICTS | Previously: @MIT, @UMD, @IITK

132 Followers · 51 Following · 25 Posts · Joined 14.01.2025

Latest posts by Sarthak Chandra @sarthakc

How do brain areas control each other? 🧠🎛️

✨ In our NeurIPS 2025 Spotlight paper, we introduce a data-driven framework to answer this question using deep learning, nonlinear control, and differential geometry. 🧵⬇️

26.11.2025 19:32 👍 89 🔁 30 💬 1 📌 3

Our next paper on comparing dynamical systems (with special interest to artificial and biological neural networks) is out!! Joint work with @annhuang42.bsky.social , as well as @satpreetsingh.bsky.social , @leokoz8.bsky.social , Ila Fiete, and @kanakarajanphd.bsky.social : arxiv.org/pdf/2510.25943

10.11.2025 16:16 👍 70 🔁 24 💬 4 📌 5
Closed-loop modulation of remote hippocampal representations with neurofeedback Animal models of memory retrieval trigger retrieval with cues and measure retrieval using behavior. Coulter etΒ al. developed a neurofeedback paradigm that rewards hippocampal activity patterns associa...

Two cool papers out in Neuron today on memory.

1. Animals can be trained to activate specific remote memories at will:

www.cell.com/neuron/fullt...

2. Interactions between cortex and amygdala during reactivation in NREM sleep help enhance perceptual memories:

www.cell.com/neuron/fullt...

🧠 📈 🧪

20.03.2025 15:12 👍 86 🔁 28 💬 2 📌 0

8/8
TL;DR: Peak Selection is a novel mechanism for the emergence of modularity in a variety of systems. Applied to grid cells, it makes testable predictions at the molecular, circuit, and functional levels, and matches observed period ratios better than any existing model!

19.02.2025 23:22 👍 2 🔁 0 💬 0 📌 0

7/
Peak Selection applies broadly to module emergence.
The same mechanism can also explain:
🌱 Emergent ecological niches
🐠 Coral spawning synchrony
🤖 Modularity in optimization & learning

19.02.2025 23:22 👍 1 🔁 0 💬 1 📌 0

6b/ (cont'd)
Central results and predictions:
•Self-scaling with organism size
•Topologically robust: insensitive to almost all param variations, activity perturbations; also robust to weight heterogeneity! (no need for exactly symmetric interactions in CANs)

19.02.2025 23:22 👍 1 🔁 0 💬 1 📌 0

6/
Central results and predictions:
•Nearly **any** interaction shape can form grid cell patterning (Mexican-hat kernels not needed!)
•Grid cells involve two scales of interactions, one spatially varying and one fixed.
•Functional modularity can emerge without molecular modularity.

19.02.2025 23:21 👍 1 🔁 0 💬 1 📌 0

5b/ (cont’d)
Grid modularity from Peak Selection!
•Discrete jumps in grid period without discrete precursors.
•Novel period ratio prediction: ratios of adjacent periods vary as ratios of integers (3/2, 4/3, 5/4, …).
•Excellent agreement with data (R^2 ~0.99)!

19.02.2025 23:21 👍 1 🔁 0 💬 1 📌 0
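The integer-ratio prediction in the post above is easy to play with numerically. A toy sketch in Python (my own illustration, not the paper's model; `period_ratios` and `cumulative_periods` are hypothetical names):

```python
# Toy sketch: if ratios of adjacent grid periods follow the predicted
# integer ratios 3/2, 4/3, 5/4, ..., the product of ratios telescopes,
# giving the m-th period as base * (m + 1) / 2.
from fractions import Fraction

def period_ratios(n_modules):
    """Predicted adjacent-period ratios (k+1)/k for k = 2 .. n_modules."""
    return [Fraction(k + 1, k) for k in range(2, n_modules + 1)]

def cumulative_periods(base_period, n_modules):
    """Module periods implied by the integer-ratio prediction."""
    periods = [Fraction(base_period)]
    for r in period_ratios(n_modules):
        periods.append(periods[-1] * r)
    return periods

print(period_ratios(5))          # 3/2, 4/3, 5/4, 6/5
print(cumulative_periods(1, 5))  # 1, 3/2, 2, 5/2, 3
```

Note that the predicted ratios shrink toward 1 for later modules, unlike models that assume a single constant ratio between adjacent modules.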

5/
Grid modularity from Peak Selection!
•Two forms of local interactions: one spatially varying smoothly in scale, the other held fixed.
•These spontaneously generate local patterning and global modularity!

19.02.2025 23:21 👍 1 🔁 0 💬 1 📌 0

4/
Two classic ideas for structure emergence in biology:
•Positional hypothesis: genes apply discrete thresholds, but this requires discrete gene expression
•Turing hypothesis: local interactions drive patterns, but at a single scale
But grid modules are multiscale, arising from presumably continuous precursors

19.02.2025 23:21 👍 1 🔁 0 💬 1 📌 0

3/
Various measured cellular and circuit properties vary smoothly across grid cells. Yet, grid cells are organized into discrete modules with different spatial periods. How does discrete organization arise from smooth gradients?

19.02.2025 23:21 👍 1 🔁 0 💬 1 📌 0

2/ The work introduces “Peak Selection”: a general mechanism by which local interactions and smooth gradients give rise to global modules. We first focus on a classic example of modularity, grid cells in the brain.

19.02.2025 23:21 👍 3 🔁 0 💬 1 📌 0
Global modules robustly emerge from local interactions and smooth gradients - Nature The principle of peak selection is described, by which local interactions and smooth gradients drive self-organization of discrete global modules.

1/ Our paper appeared in @Nature today! www.nature.com/articles/s41... w/ Fiete Lab and @khonamikail.bsky.social .
Explains the emergence of multiple grid cell modules, with an excellent match to data! A novel mechanism that applies across systems from development to ecosystems. 🧵👇

19.02.2025 23:20 👍 98 🔁 33 💬 2 📌 2

Thanks! Yes, in its current form it doesn't have recency or primacy effects. We have some thoughts on including recency with some weight decay to reduce the importance of older memories. But how to build in primacy and other forms of memory salience in this model is something to think more about!

18.01.2025 07:50 👍 1 🔁 0 💬 0 📌 0

Thanks Sreeparna!

18.01.2025 07:45 👍 1 🔁 0 💬 0 📌 0
Episodic and associative memory from spatial scaffolds in the hippocampus - Nature A neocortical–entorhinal–hippocampal network model based on grid cell states recapitulates experimental results and reconciles the spatial, associative and episodic memory roles of the hippocampus.

10/10 🚀 TL;DR: VectorHaSH provides a unifying framework for efficient spatial, episodic, and associative memory in the hippocampus. Curious? Read the paper www.nature.com/articles/s41... #Memory #Neuroscience #Hippocampus

17.01.2025 00:10 👍 9 🔁 0 💬 0 📌 0

9b/ (cont’d)
Hippocampal cells remap by direction/context 📍➡️⬅️
Memory consolidation of multiple memory traces 📚
Model thus bridges experiments and theory!

17.01.2025 00:10 👍 4 🔁 0 💬 1 📌 0

9/ Experimental alignment: 🧠🔬
VectorHaSH mirrors entorhinal-hippocampal phenomena:
Grid cells demonstrate stable periodicity, rapid phase resets, robust velocity integration 🌐
Recreate correlation statistics of grid cells and place cells 📊

17.01.2025 00:10 👍 2 🔁 0 💬 1 📌 0

8/ 🏰 Memory palaces explained!
Why does imagining a spatial walk supercharge memory?
VectorHaSH shows how recall of familiar locations acts as a secondary scaffold.
Result: Even approximate recall of locations reliably supports one-shot storage of arbitrary, high-fidelity memories. 💡

17.01.2025 00:10 👍 2 🔁 0 💬 2 📌 0

7/ How does VectorHaSH implement efficient episodic/sequence memory? Conventional models recall entire high-dim states ➡️ fail quickly. VectorHaSH reduces the problem to recalling low-dim velocity vectors on a scaffold. Result: Long sequences stored & recalled with precision! 🔥

17.01.2025 00:09 👍 2 🔁 0 💬 1 📌 0
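The velocity-based storage idea above can be sketched in a few lines (a minimal sketch under my own simplifications; `encode_sequence` and `replay` are hypothetical names, with 2D positions standing in for scaffold states):

```python
# Sketch: store a trajectory not as full high-dimensional states but as
# its start plus low-dimensional per-step velocities, then replay the
# whole sequence by integrating those velocities from the start.

def encode_sequence(positions):
    """Compress a trajectory to its start state plus 2D velocity steps."""
    start = positions[0]
    velocities = [(b[0] - a[0], b[1] - a[1])
                  for a, b in zip(positions, positions[1:])]
    return start, velocities

def replay(start, velocities):
    """Reconstruct the full trajectory by integrating the velocities."""
    pos, out = start, [start]
    for vx, vy in velocities:
        pos = (pos[0] + vx, pos[1] + vy)
        out.append(pos)
    return out

traj = [(0, 0), (1, 0), (1, 1), (2, 1)]
start, vels = encode_sequence(traj)
print(replay(start, vels) == traj)  # True: lossless replay from 2D steps
```

Each stored step is 2D regardless of how high-dimensional the underlying neural state is, which is the source of the capacity gain.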

6/ Spatial memory at scale? VectorHaSH links scaffold states to sensory cues via the hippocampus. This leads to independent non-interfering learned maps (landmark-location associations) in multiple rooms. Metric grid structure supports zero-shot inference along novel paths 🚶‍♀️

17.01.2025 00:09 👍 3 🔁 0 💬 1 📌 0

5a/ (cont’d)
VectorHaSH then stores memories by heteroassociation of inputs with these scaffold states, enabling graceful degradation of memory detail as the number of stored memories grows, over a vast number of inputs.

17.01.2025 00:09 👍 3 🔁 0 💬 1 📌 0

5/ Memory without cliffs? Hopfield and other models crash 📉 after reaching capacity, completely losing all previous memories. VectorHaSH avoids this by first using grid cells to create a scaffold of exponentially many large-basin fixed points.

17.01.2025 00:09 👍 3 🔁 0 💬 1 📌 0
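The "exponentially many fixed points" claim has a quick back-of-envelope version (my illustration with hypothetical module sizes, not the paper's construction): modules with pairwise-coprime periods behave like a residue number system, so joint states multiply.

```python
# Back-of-envelope sketch: the number of distinct joint phase states of
# grid modules with pairwise-coprime periods is the product of the
# periods -- exponential in the number of modules, while cell count
# only grows as their sum.
from math import prod

def n_scaffold_states(periods):
    """Distinct joint states of modules with pairwise-coprime periods."""
    return prod(periods)

periods = [3, 5, 7, 11]            # hypothetical module sizes
print(n_scaffold_states(periods))  # 1155 states from 3+5+7+11 = 26 cells
```

With similar period sizes, doubling the number of modules roughly squares the number of scaffold states while only doubling the number of cells.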

4a/ (cont’d)

(3) Episodic memory, using low-dimensional transitions in the grid space to support massive sequence capacity 🎞️
(4) Method of Loci, explaining the paradox of why adding to the memory task (associating items with spatial locations) boosts performance 🏰

17.01.2025 00:09 👍 2 🔁 0 💬 1 📌 0

4/
VectorHaSH supports: (1) Item memory, avoiding memory cliffs of Hopfield nets (2) Spatial memory, learning landmark-location associations over many maps 🌍& minimizing catastrophic forgetting

17.01.2025 00:09 👍 2 🔁 0 💬 1 📌 0

3/ Key ideas 🔑 Hippocampal and grid cells create a fixed "scaffold" that serves as a robust, error-correcting memory foundation. External inputs are "hooked" onto the scaffold through heteroassociation. Low-dimensional transitions in grid space enable large sequence memory.

17.01.2025 00:08 👍 3 🔁 0 💬 1 📌 0
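The "hooking" step can be sketched as textbook heteroassociation via summed outer products (a minimal sketch assuming random ±1 vectors as stand-ins for scaffold fixed points; this is my illustration, not the paper's actual circuitry):

```python
import numpy as np

rng = np.random.default_rng(0)
n_scaffold, n_input, n_patterns = 400, 50, 5

# Random ±1 vectors stand in for scaffold fixed points and sensory inputs.
scaffold = rng.choice([-1.0, 1.0], size=(n_patterns, n_scaffold))
inputs = rng.choice([-1.0, 1.0], size=(n_patterns, n_input))

# Heteroassociative weights: sum of outer products input_i scaffold_i^T.
W = inputs.T @ scaffold / n_scaffold

# Recall an input by projecting its scaffold state through W; crosstalk
# is small because random high-dimensional ±1 states are nearly orthogonal.
recalled = np.sign(scaffold @ W.T)
print((recalled == inputs).mean())  # 1.0: every input recalled exactly
```

Recall quality in this scheme degrades gradually as more patterns are added, since crosstalk noise grows smoothly rather than hitting a capacity cliff.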

2/ Why are spatial & episodic memory co-localized in the hippocampus? How do memory palaces allow memorization of decks of cards?
Our model, VectorHaSH, shows how the hippocampus, together with grid cells, integrates these roles to support memory storage, sequence recall, and memory palaces 🏰

17.01.2025 00:08 👍 3 🔁 0 💬 1 📌 0

1/ Super-excited to share our new work “Episodic and associative memory from spatial scaffolds in the hippocampus”, which just appeared in Nature! www.nature.com/articles/s41... Key insights and ideas 👇 #tweeprint

17.01.2025 00:08 👍 74 🔁 28 💬 5 📌 3