Haha thanks a lot!! :)
🙋‍♂️
Some limitations remain (mainly the 10K rows × 500 cols limit and inference speed), but this new in-context learning paradigm will only get faster & better from here! I've joined this great team at @prior_labs as founding researcher 🚀 Join us! www.notion.so/priorlabs/Pr...
Groundbreaking work, congrats to the team!! 🎉 When I started my PhD 3 years ago, our tabular benchmark showed tree-based models miles ahead of neural networks. On the same benchmark, TabPFN v2 now reaches in 10s what CatBoost achieves in 4h of tuning 🤯
[Images: paper screenshot and Figure 1(c), cumulative ablations for the components of RealMLP-TD]
Can deep learning finally compete with boosted trees on tabular data? 🌲
In our NeurIPS 2024 paper, we introduce RealMLP, an NN with improvements in all areas and meta-learned default parameters.
Some insights about RealMLP and other models on large benchmarks (>200 datasets): 🧵