EEML'25, our yearly machine learning summer school event, will be organised next summer in the beautiful city of Sarajevo - the place where East meets West 🇧🇦.
More details coming soon, please see the link in the thread!
Catch my poster tomorrow at the NeurIPS MLSB Workshop! We present a simple (yet effective!) multimodal Transformer for molecules, supporting multiple 3D conformations & showing promise for transfer learning.
Interested in molecular representation learning? Let's chat!
* in the afternoon session
Happening today, East Exhibit Hall A-C, poster #3110. Come say "Hi!"
6/6 Interested in learning more? Check out our preprint here: arxiv.org/pdf/2405.17311.
If you'd like to discuss, I'd be very happy to chat during the poster session in Vancouver! :)
5/6 How it works: Probabilistic sampling connects original nodes to virtual ones, enhancing connectivity without explicit pairwise computations. The result is a framework that achieves both higher WL expressiveness and efficiency in graph-based learning.
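The mechanism in 5/6 can be sketched in a few lines. This is a toy NumPy illustration under my own assumptions, not the paper's implementation: the assignment logits would be learned end-to-end, here they are random, and the aggregation is plain mean pooling.

```python
import numpy as np

rng = np.random.default_rng(0)

n_nodes, n_virtual, dim = 6, 2, 4
x = rng.normal(size=(n_nodes, dim))             # original node features
logits = rng.normal(size=(n_nodes, n_virtual))  # assignment scores (learned in practice)

# Sample, for each original node, one virtual node from a categorical
# distribution -- no explicit pairwise node-node computation is needed.
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
assign = np.array([rng.choice(n_virtual, p=p) for p in probs])

# Each virtual node aggregates messages from its assigned nodes...
virtual = np.zeros((n_virtual, dim))
for v in range(n_virtual):
    members = x[assign == v]
    if len(members):
        virtual[v] = members.mean(axis=0)

# ...and broadcasts back, so any two nodes sharing a virtual node are
# two hops apart: long-range information flows without dense attention.
x_out = x + virtual[assign]
print(x_out.shape)  # (6, 4)
```

Because every node touches only its sampled virtual node, the extra cost is linear in the number of nodes, which is where the efficiency claim comes from.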
4/6 We demonstrate SOTA results on various benchmarks, effectively addressing over-squashing and under-reaching. IPR-MPNNs also surpass standard MPNNs in expressiveness, distinguishing complex graph structures - all while being faster and more memory-efficient than graph transformers (GTs).
3/6 Enter IPR-MPNNs: Our approach learns to rewire graphs probabilistically by adding virtual nodes. This eliminates the need for heuristics, making the method more flexible and task-adaptive, while maintaining computational efficiency.
2/6 Standard MPNNs struggle with long-range interactions, making them less effective for large, complex graphs. Transformers help but come with quadratic complexity, which is computationally expensive. Rewiring heuristics? Often brittle and task-specific.
1/6 We're excited to share our #NeurIPS2024 paper: Probabilistic Graph Rewiring via Virtual Nodes! It addresses key challenges in GNNs, such as over-squashing and under-reaching, while reducing reliance on heuristic rewiring. w/ Chendi Qian, @christophermorris.bsky.social @mniepert.bsky.social 🧵
Genuine question - why do captions go above tables? I've always assumed it's to make tables visually distinct from figures, but it seems to be just convention.