Fig. 1: Drosophila-inspired two-layer ReSU neural network trained in the self-supervised setting on translating natural images …
Cool to see a concrete alternative to #ReLU + #backprop models that is both interpretable and biologically grounded:
Qin et al. introduce #ReSU, Rectified Spectral Units, as a replacement for ReLU.
📄 https://arxiv.org/abs/2512.23146
#CompNeuro #NeuroAI #neuroscience 🧪