you need to think with AI, the future is text
my long bet to not bother to learn vim commands has finally paid off #sloperator
thank you adrian, that's a very poignant quote
congrats to you two! well deserved!
i may be biased but this is a very nice example of a universal MLIP paper
those are the shiniest potatoes i've ever seen
reality is proving @adrianhill.de more right by the minute -- now all reviews have been reverted to their pre-discussion state. someone remind me why i spent a week+ preparing rebuttals and reading author comments?
ICLR2026 is truly the gift that keeps on giving
When it comes to drama, ICLR 2026 is truly the gift that keeps on giving.
a meme with Scylla and Charybdis and Ulysses' ship trying to sail through, making fun of the peer review process at ICLR
made by a friend of a colleague. iykyk
seems like i was the only ICLR reviewer that bothered to declare LLM usage
This also shows that @typst.app is already a viable alternative to LaTeX for top conferences in ML!
Our paper on XAI + Automatic Differentiation got accepted at #NeurIPS2025. Can't wait to share the camera-ready soon! 🥳
Orbits for a periodic 3-body system, showing the stability of an ML long-time integrator
If you are excited about 30x longer time steps in molecular dynamics using FlashMD, but are worried about it not being symplectic, Filippo has something new cooking that should make you even more excited. Head to the #arxiv for a preview arxiv.org/html/2508.01...
for the jax kids, i just made my personal training infra repo public: github.com/sirmarcel/ma... -- happy for some feedback/ideas/takes!
I benefited massively from www.ipam.ucla.edu/programs/lon.... I got into ML for science through that program. Now IPAM may be gone mathstodon.xyz/@tao/1149568...
This, if true, is incredible. A long program in 2011 at IPAM that I attended practically wrote the book on machine learning in materials science, which is now AI. Not funding IPAM would be deeply destructive to US leadership in critical future technologies! Hard to fathom.
Absolute insanity. IPAM is where I first heard about equivariant NNs! What a loss.
this is work at @labcosmo.bsky.social with Egor, Tulga, Philip & @micheleceriotti.bsky.social!
... overall, it seems that we need more challenging benchmarks. or maybe LR behaviour is simply "not that complicated". we will see. the manuscript is here: arxiv.org/abs/2507.19382 & code & data are coming out soon-ish, once i run a few more ablations. looking forward to feedback!
... and show that it works pretty well on a few long-range benchmark datasets. but we also find that many benchmark tasks are easily solved even by short-range message passing, so we run a few more targeted experiments to see where it fails & confirm that LOREM works for these ...
✨preprint alert: "learning long-range representations w/ equivariant messages" in which we get into the fray of long-range MLIPs and propose to use equivariant charges w/ classical electrostatics as long-range MP mechanism. we design LOREM, an equivariant MLIP, around this ...
very proud of filippo
Very proud to send Filippo Bigi to Vancouver to give an oral presentation at @icmlconf.bsky.social about our investigation of the use of "dark-side forces" in atomistic simulations. The final version is here openreview.net/forum?id=OEl... and it's worth a read even if you already read the #preprint
So kudos to Filippo and @marceldotsci.bsky.social bsky.app/profile/labc..., thanks to funders @erc.europa.eu @snf-fns.ch @cscsch.bsky.social and @nccr-marvel.bsky.social, and go to Filippo's talk if you are at #icml25, to see this nice mix of #compchem and #machinelearning!
obvious in hindsight but news to me: accessing np.load-ed arrays by index is orders of magnitude slower than forcing them to RAM. notes.marcel.science/2025/numpy-l...
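the effect in the note linked above can be sketched roughly like this (file name, array size, and index count are made up for illustration; the timing gap shows up when the array is memory-mapped rather than loaded into RAM):

```python
import time
import numpy as np

# hypothetical setup: save a large array to disk (name and size are illustrative)
np.save("big.npy", np.random.rand(1_000_000))

mm = np.load("big.npy", mmap_mode="r")  # memory-mapped: data stays on disk
ram = np.asarray(mm)                    # force the whole array into RAM

# fancy-index both versions with the same random indices
idx = np.random.randint(0, len(ram), size=100_000)

t0 = time.perf_counter()
a = mm[idx]                             # each access may go through the memmap machinery
t_mm = time.perf_counter() - t0

t0 = time.perf_counter()
b = ram[idx]                            # plain in-memory gather
t_ram = time.perf_counter() - t0
```

both return the same values; the difference is purely where the reads are served from.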
ok then thanks
very annoying when papers don't include raw data for plots