New Method Breaks Simplicity Bias in Recurrent Neural Networks
Researchers introduced Iterative Neural Similarity Deflation (INSD), a regularization that penalizes the linear predictability of RNN activity, encouraging richer dynamics and improving out‑of‑distribution generalization. getnews.me/new-method-breaks-simpli... #insd #rnn
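To make "linear predictability of RNN activity" concrete, here is a minimal NumPy sketch of one plausible reading: the penalty is the fraction of next-step hidden-state variance explained by the best linear map from the current state. The function name and this formulation are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def linear_predictability(H):
    """Fraction of next-step activity variance explained by the best
    least-squares linear map H[t] -> H[t+1]. Higher values mean more
    linearly predictable dynamics. Hypothetical penalty, illustrative only."""
    X, Y = H[:-1], H[1:]
    W, *_ = np.linalg.lstsq(X, Y, rcond=None)
    resid = Y - X @ W
    return 1.0 - resid.var() / Y.var()

rng = np.random.default_rng(0)

# A purely linear RNN trajectory is almost perfectly predictable ...
A = rng.standard_normal((8, 8)) * 0.3
H_lin = np.zeros((100, 8))
H_lin[0] = rng.standard_normal(8)
for t in range(99):
    H_lin[t + 1] = H_lin[t] @ A

# ... while i.i.d. noise activity is not.
H_noise = rng.standard_normal((100, 8))

print(linear_predictability(H_lin))    # near 1
print(linear_predictability(H_noise))  # much smaller
```

A regularizer in this spirit would add the predictability score to the training loss, so gradient descent pushes the network toward hidden dynamics that a linear model cannot capture.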