1/ Happy to share my first accepted paper as a PhD student at Meta and ENS Paris, which I will present at @iclr-conf.bsky.social:
Our work proposes difFOCI, a novel rank-based objective for better feature learning.
In collaboration with David Lopez-Paz, @giuliobiroli.bsky.social and Levent Sagun!

2/ difFOCI is a plug-and-play differentiable relaxation of Chatterjee's correlation coefficient, a popular, recently proposed rank-based estimator. We show that difFOCI improves numerous ML applications, including domain shift, spurious correlations, and fairness.

3/ On standard classical feature-selection tasks, difFOCI outperforms standard methods, selecting only a few informative yet diverse features.

4/ On more complex machine-learning tasks, difFOCI performs favorably in terms of worst-group accuracy across several benchmarks.

5/ The work is now available at:
arXiv: arxiv.org/abs/2502.09445 #ICLR2025
GitHub: github.com/facebookrese...
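For readers unfamiliar with the estimator the thread builds on: Chatterjee's coefficient sorts the pairs by x, takes the ranks r_i of the corresponding y values, and computes ξ = 1 − 3 Σ|r_{i+1} − r_i| / (n² − 1) (no-ties case). This is a minimal NumPy sketch of that original, non-differentiable estimator only — it is not difFOCI itself (difFOCI relaxes this quantity; see the linked repo), and the function name is mine:

```python
import numpy as np

def chatterjee_xi(x, y):
    """Chatterjee's rank correlation coefficient (no-ties formula).

    Sort the pairs by x, take the ranks r_i of the corresponding y
    values, and return 1 - 3 * sum|r_{i+1} - r_i| / (n^2 - 1).
    """
    x, y = np.asarray(x), np.asarray(y)
    n = len(x)
    order = np.argsort(x)                      # sort pairs by x
    y_sorted = y[order]
    r = np.argsort(np.argsort(y_sorted)) + 1   # 1-based ranks of y
    return 1.0 - 3.0 * np.abs(np.diff(r)).sum() / (n**2 - 1)

rng = np.random.default_rng(0)
x = rng.normal(size=1000)
print(chatterjee_xi(x, x**2))                   # near 1: y is a function of x
print(chatterjee_xi(x, rng.normal(size=1000)))  # near 0: independent noise
```

Note the estimator is deliberately asymmetric (it measures how much y is a function of x, not the reverse), and the ranking step is what difFOCI replaces with a differentiable surrogate so the objective can be trained end to end.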