Rotskoff Group @ Stanford
I'm hiring a postdoc with a flexible start date (any time in 2026). Come work with us on topics at the interface of machine learning, biophysics, and nonequilibrium statistical mechanics. If interested, send me a CV and a short summary of why you think you'd be a good fit.
statmech.stanford.edu
17.12.2025 06:08
Small angle X-ray scattering
19.06.2025 20:49
Big fan of this perspective:
07.05.2025 18:46
The plan at FutureHouse has been to build scientific agents for discoveries. We've spent the last year researching the best way to make agents. We've made a ton of progress and now we've engineered them to be used at scale, by anyone. Free and on API.
01.05.2025 16:06
What an incredibly cool paper! While knot theory strictly applies to closed curves, Tommy, @smnlssn.bsky.social , and @paulrobustelli.bsky.social show that writhe, a knot "non-invariant" that changes with smooth deformations, provides a meaningful descriptor for flexible conformations.
01.05.2025 05:49
Figure from the paper illustrating sequence-ensemble-function relationships for disordered proteins. ML prediction (black) and design (orange) approaches are highlighted on the connecting arrows. Prediction of properties/functions from sequence (or vice versa, design) can include biophysics approaches via structural ensembles, or bioinformatics approaches via other heterogeneous sources. The lower panels show examples of properties and functions of IDRs for predictions or design targets. ML, machine learning; IDRs, intrinsically disordered proteins and regions.
Our review on machine learning methods to study sequence-ensemble-function relationships in disordered proteins is now out in COSB
authors.elsevier.com/sd/article/S...
Led by @sobuelow.bsky.social and Giulio Tesei
12.03.2025 21:37
Amazingly, this trick works. Thanks to improved algorithms for learning the score that come from generative modeling, the applicability of this approach is very broad. I had spent several years making false starts on implementing the Malliavin calculus, but in the end, we found a route around it ;)
04.03.2025 18:45
Jérémie Klinger found a simple trick to get back to Girsanov: take the perturbation of the diffusion at the level of the Fokker-Planck equation and rewrite it so that it is included in the drift. The resulting drift then has a term proportional to \nabla \log \rho(x,t), what machine learners call the score!
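A sketch of that rewriting, in generic notation I'm choosing here (a scalar Itô diffusion; the paper's conventions may differ):

```latex
% Ito SDE: dX_t = b(X_t)\,dt + \sqrt{2 D(X_t)}\,dW_t, with Fokker--Planck
\partial_t \rho = -\nabla\cdot(b\,\rho) + \nabla^2 (D\,\rho)
% Perturb the diffusivity, D \to D + \epsilon\,\delta D. The new term is
\epsilon\,\nabla^2(\delta D\,\rho)
  = \epsilon\,\nabla\cdot\!\left[\rho\left(\nabla \delta D + \delta D\,\nabla \log\rho\right)\right],
% which is a pure drift contribution: the same density solves the equation
% with unchanged diffusivity and the shifted drift
b \;\to\; b - \epsilon\left(\nabla \delta D + \delta D\,\nabla \log \rho(x,t)\right).
```

The \nabla \log \rho term is the score, so a learned score model makes the shifted drift computable.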
04.03.2025 18:45
The “classical” strategy for diffusion sensitivities comes from financial mathematics and is called the Malliavin calculus. It's very explicit for simple models like Black-Scholes, but for a general Langevin equation it is no easy feat to compute the sensitivity.
04.03.2025 18:45
In many biological and active systems, diffusivity is highly spatially dependent, and the theory for perturbations in such cases is rather limited, largely based on beautiful work by Leticia Cugliandolo and work by Falasco and Baiesi, among many others.
04.03.2025 18:45
Far from equilibrium, it is not so easy: one needs to understand the dynamics, and this requires working with dynamical trajectories and their associated path measures. Classically, we do this using the Girsanov theorem, which constructs a “relative path measure” as we perturb the drift term.
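For reference, the standard Girsanov weight for a drift change, written here in my own notation for dX_t = b\,dt + \sqrt{2D}\,dW_t with constant D:

```latex
\frac{d\mathbb{P}^{\,b'}}{d\mathbb{P}^{\,b}}\bigg|_{[0,T]}
  = \exp\left( \int_0^T (b' - b)^{\top} (2D)^{-1} \left( dX_t - b\,dt \right)
    \;-\; \frac{1}{2}\int_0^T (b' - b)^{\top} (2D)^{-1} (b' - b)\,dt \right)
```

Note this only covers perturbations of the drift: changing D itself makes the two path measures mutually singular, so no such density exists.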
04.03.2025 18:45
Computing response functions or “sensitivities” requires understanding how an external perturbation drives the change in some observable. For equilibrium systems, Onsager taught us that this can be understood with correlation functions.
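The equilibrium statement is the Kubo linear-response formula: for a perturbation H \to H - h(t)\,B, the response of an observable A is set entirely by an unperturbed correlation function,

```latex
R_{AB}(t - t') \;\equiv\; \left.\frac{\delta \langle A(t)\rangle}{\delta h(t')}\right|_{h=0}
  = -\,\beta\,\frac{\partial}{\partial t'} \left\langle A(t)\, B(t') \right\rangle_{\mathrm{eq}},
  \qquad t > t'.
```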
04.03.2025 18:45
Excited to see our paper “Computing Nonequilibrium Responses with Score-Shifted Stochastic Differential Equations” in Physical Review Letters this morning as an Editor's Suggestion! We use ideas from generative modeling to unravel a rather technical problem. 🧵 journals.aps.org/prl/abstract...
04.03.2025 18:45
Applications for the FutureHouse Independent Postdoctoral Fellowship are due in two weeks! $125k annual stipend, full access to our resources, co-advising by world-class professors, and the chance to apply our AI science agents to make new discoveries. Apply!
Details here: www.futurehouse.org/fellowship
31.01.2025 15:08
Ten simple rules for developing good reading habits during graduate school and beyond
To me, the most important are:
Read often, read broadly (incl. older papers and outside your field), and learn to read some papers in detail and others more superficially (and quickly)
26.01.2025 10:16
I am hiring a postdoctoral scholar with a start date of summer or fall 2025. Projects will be focused on thermodynamically consistent generative models, broadly defined. If you're interested, please send a CV and one paragraph about why you think you'd be a good fit to rotskoff@stanford.edu
23.12.2024 17:31
Lots of cool stuff in here. Consistent with my working hypothesis that the main scientific utility of LLMs at the moment is plain old NLP
22.12.2024 17:30
Really cool opportunity via futurehouse. Come work with them and collaborate with us at Stanford!
19.12.2024 17:55
If you didn't see our poster at NeurIPS on how to make diffusion model inference fast, you can always read the paper here: arxiv.org/abs/2405.15986
13.12.2024 16:22
Can verify that the code works, too :)
07.12.2024 05:23
@franknoe.bsky.social presented this very impressive work at a fantastic @cecamevents.bsky.social workshop this week. I'm very excited to take a deep dive into the details this weekend!
06.12.2024 16:34
NeurIPS Poster: Accelerating Diffusion Models with Parallel Sampling: Inference at Sub-Linear Time Complexity (NeurIPS 2024)
If you're at NeurIPS next week come see our spotlight poster led by Yinuo Ren and Haoxuan Chen! We use the parallel sampling technique to rigorously establish a big acceleration for diffusion model inference! neurips.cc/virtual/2024...
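A toy illustration of the parallel-in-time idea, not the paper's actual algorithm: a Picard (fixed-point) sweep updates every step of a deterministic Euler sampler at once, and repeated sweeps converge to the sequential answer. The function names and the test ODE are mine.

```python
import numpy as np

def euler_sequential(f, x0, dt, n):
    """Ordinary Euler integration: n dependent steps, one after another."""
    x = np.empty(n + 1)
    x[0] = x0
    for i in range(n):
        x[i + 1] = x[i] + dt * f(x[i])
    return x

def euler_picard_parallel(f, x0, dt, n, sweeps):
    """Picard iteration: every grid point is updated simultaneously from
    the previous sweep, so each sweep parallelizes across all n steps
    (the prefix sum itself has O(log n) parallel depth)."""
    x = np.full(n + 1, x0, dtype=float)
    for _ in range(sweeps):
        incr = dt * f(x[:-1])  # f evaluated at every grid point at once
        x = x0 + np.concatenate(([0.0], np.cumsum(incr)))
    return x

# After at most n sweeps, the fixed point reproduces sequential Euler;
# in practice far fewer sweeps suffice, which is the source of the speedup.
seq = euler_sequential(lambda x: -x, 1.0, 0.02, 50)
par = euler_picard_parallel(lambda x: -x, 1.0, 0.02, 50, 50)
```

The paper's contribution is a rigorous sub-linear bound for the stochastic (diffusion-model) version of this scheme; the sketch above only conveys the dependency structure that makes parallelization possible.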
03.12.2024 21:55
There's no easy way to do this in general, but computing the stationary distribution for a nonequilibrium dynamics might be possible in some low-dimensional systems or systems with special structure (ones where you can represent the distribution with tensor networks). Simulations otherwise...
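Concretely, the brute-force simulation route is just a long trajectory plus a histogram. A minimal sketch, with a toy model of my choosing (an OU process, whose stationary variance is 1, so it doubles as a sanity check):

```python
import numpy as np

def euler_maruyama(drift, diffusivity, x0, dt, n_steps, rng):
    """Simulate dX = drift(X) dt + sqrt(2 * diffusivity(X)) dW (Ito).
    For an ergodic dynamics, the long-time trajectory samples the
    stationary distribution after a burn-in is discarded."""
    x = np.empty(n_steps + 1)
    x[0] = x0
    for i in range(n_steps):
        x[i + 1] = (x[i] + drift(x[i]) * dt
                    + np.sqrt(2.0 * diffusivity(x[i]) * dt) * rng.standard_normal())
    return x

# Toy check: OU process dX = -X dt + sqrt(2) dW has stationary variance 1.
rng = np.random.default_rng(0)
traj = euler_maruyama(lambda x: -x, lambda x: 1.0, 0.0, 0.01, 200_000, rng)
stationary_samples = traj[20_000:]  # drop burn-in
```

For genuinely state-dependent diffusivity, the Itô-vs-Stratonovich convention and the associated spurious drift matter; this sketch fixes the Itô convention.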
02.12.2024 22:38
Thanks for looping me in @erikhthiede.bsky.social If by order you mean that there's some parameter that stabilizes to a stationary or periodic steady state, then the most general solution is simply solving for the stationary distribution or the long-time limit of the expectation of the order parameter
02.12.2024 22:38
Accurate and Efficient Structure Elucidation from Routine One-Dimensional NMR Spectra Using Multitask Machine Learning
Rapid determination of molecular structures can greatly accelerate workflows across many chemical disciplines. However, elucidating structure using only one-dimensional (1D) NMR spectra, the most readily accessible data, remains an extremely challenging problem because of the combinatorial explosion of the number of possible molecules as the number of constituent atoms is increased. Here, we introduce a multitask machine learning framework that predicts the molecular structure (formula and connectivity) of an unknown compound solely based on its 1D 1H and/or 13C NMR spectra. First, we show how a transformer architecture can be constructed to efficiently solve the task, traditionally performed by chemists, of assembling large numbers of molecular fragments into molecular structures. Integrating this capability with a convolutional neural network, we build an end-to-end model for predicting structure from spectra that is fast and accurate. We demonstrate the effectiveness of this framework on molecules with up to 19 heavy (non-hydrogen) atoms, a size for which there are trillions of possible structures. Without relying on any prior chemical knowledge such as the molecular formula, we show that our approach predicts the exact molecule 69.6% of the time within the first 15 predictions, reducing the search space by up to 11 orders of magnitude.
Chemists use NMR spectroscopy to identify molecules, but interpreting spectra is laborious and error prone. We show the process can be automated end-to-end using a well-designed Molecular GPT. Importantly, we also make predictions of substructures for interpretability. pubs.acs.org/doi/10.1021/...
13.11.2024 18:22