Sutapora's first hall concert ends in an emotional finale #すたぽら #VOISING #Relu
The 2.5-dimensional idol group Sutapora held its first hall concert at Pacifico Yokohama. Marking Relu's graduation, the moving performance captured the audience's hearts.
Latest posts tagged with #ReLU on Bluesky
2.5-dimensional idol group Sutapora pulls off its first hall concert and marks a graduation amid tears of emotion #すたぽら #StarLight_PolaRis #Relu
The 2.5-dimensional idol group Sutapora held its first hall concert at Pacifico Yokohama, which also featured a moving graduation ceremony. A full report on a day that drew about 10,000 attendees and capped five years of the group's history.
If we apply the #ReLU (Rectified Linear Unit) function (a common item used in #ArtificialIntelligence, namely #NeuralNetworks) ...
Then it is clear #SamAltman has consistently increased #OpenAI's profit by a factor of 10 every year.
0 x 10¹⁰ = 00,000,000,000
Awesome consistent performance!
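The joke above hinges on what ReLU actually does: it passes positive inputs through unchanged and clips everything else to zero. A minimal sketch (the `relu` function name is ours, not from the post):

```python
def relu(x):
    # Rectified Linear Unit: identity for positive inputs, 0 otherwise
    return max(0.0, x)

# Applying ReLU to a "profit" of zero leaves it at zero,
# no matter how many factors of 10 you multiply by afterwards.
profit = relu(0.0) * 10**10
print(profit)  # 0.0
```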
Fig. 1: Drosophila-inspired two-layer ReSU neural network trained in the self-supervised setting on translating natural images …
Cool to see a concrete alternative to #ReLU #backprop models that remains both interpretable and biologically grounded:
Qin et al. introduce #ReSU, Rectified Spectral Units, as a replacement for ReLU.
📄 https://arxiv.org/abs/2512.23146
#CompNeuro #NeuroAI #neuroscience 🧪
Beyond ReLU to the latest GELU, SwiGLU, and TeLU. Everything you need to know about choosing activation functions to maximize deep learning model performance, with a clear guide covering everything from hidden layers to the most commonly botched choice: the output-layer function.
#LLM #ReLU #Sigmoid #SwiGLU #Swish #Tanh #TeLU #Transformer #DeepLearning #MachineLearning #ArtificialNeuralNetwork #ActivationFunction
doyouknow.kr/742/activati...
Activation Functions: The 'Secret Sauce' of Deep Learning
techlife.blog/posts/activa... #ActivationFunctions #DeepLearning #NeuralNetworks
#ReLU #GELU #SwiGLU #Transformers #MachineLearning
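The activation functions these two posts name have compact standard definitions; a rough NumPy sketch (the tanh approximation of GELU and a toy SwiGLU with caller-supplied weight matrices are our assumptions, not taken from either linked article):

```python
import numpy as np

def relu(x):
    # max(0, x), elementwise
    return np.maximum(0.0, x)

def gelu(x):
    # tanh approximation of the Gaussian Error Linear Unit
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

def swish(x):
    # x * sigmoid(x), also known as SiLU
    return x / (1.0 + np.exp(-x))

def swiglu(x, w_gate, w_value):
    # gated variant used in Transformer feed-forward blocks:
    # Swish(x @ W_gate) elementwise-times (x @ W_value)
    return swish(x @ w_gate) * (x @ w_value)

x = np.linspace(-3.0, 3.0, 7)
print(relu(x), gelu(x), swish(x), sep="\n")
```

Unlike ReLU, both GELU and Swish are smooth and slightly negative for small negative inputs, which is part of why they tend to be preferred in Transformer-style models.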
Sharp Lower Bounds for Linearized ReLU^k Approximation on Sphere
A saturation theorem shows ReLU^k networks on the sphere cannot beat an L2 error rate of n^{-(d+2k+1)/(2d)} when smoothness r > (d+2k+1)/2. 5 Oct 2025. Read more: getnews.me/sharp-lower-bounds-for-l... #relu #saturationtheorem #approximation
Gradient Descent Attains Optimal Rates for Deep ReLU Networks
Gradient descent achieves near‑minimax generalisation on deep ReLU networks with excess risk ~O(L⁴(1+γL²)/(nγ²)), a polynomial depth dependence. Accepted at NeurIPS 2025. Read more: getnews.me/gradient-descent-attains... #gradientdescent #relu
Sobolev Training Boosts Convergence of ReLU Neural Networks
Sobolev training adds a derivative term to the loss, and the study shows it speeds convergence for shallow ReLU networks with Gaussian inputs. Experiments show fewer epochs are needed to reach a given error. Read more: getnews.me/sobolev-training-boosts-... #sobolevtraining #relu
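The core idea of Sobolev training can be sketched in a few lines: the loss penalizes both value mismatch and derivative mismatch. This is our own toy illustration (the function name, weighting `lam`, and the x² target are assumptions), not the paper's code:

```python
import numpy as np

def sobolev_loss(f_pred, f_true, df_pred, df_true, lam=1.0):
    # standard L2 term on function values, plus a weighted L2 term
    # on first derivatives (the "Sobolev" part of the objective)
    value_term = np.mean((f_pred - f_true) ** 2)
    deriv_term = np.mean((df_pred - df_true) ** 2)
    return value_term + lam * deriv_term

# toy example: target f(x) = x^2, so f'(x) = 2x
x = np.linspace(-1.0, 1.0, 11)
f_pred, df_pred = x**2 + 0.1, 2 * x   # model off by a constant; derivative exact
loss = sobolev_loss(f_pred, x**2, df_pred, 2 * x)
print(loss)  # only the value term contributes here, ~0.01
```

Matching derivatives as well as values gives the optimizer extra information about the target's shape, which is the intuition behind the faster convergence the post reports.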
When Are Bias-Free ReLU Networks Effectively Linear Networks?
Yedi Zhang, Andrew M Saxe, Peter E. Latham
Action editor: Bruno Loureiro
https://openreview.net/forum?id=Ucpfdn66k2
#relu #bias #deep
Creative commons photo of a rising sun, an almost-fuchsia disc, small but well centred left-to-right, with the horizon line on the upper third, leaving room for the ocean, where one big, gentle wave draws a line across the middle. The clouds are golden, the horizon orange, and the grey-blue water reflects the colours of the sky.
The #Tubingen #soleil is wearing its shawl of clouds, which is why I'm picking a #photo from the Far East to show you how I feel after having #relu (reread) the final PDF of #Apesitter
The printing of my #roman (novel) is approaching and I'm #surf-ing on a wave of enthusiasm. An #aventure begins
#asie #littérature #jp
#loopzoop #alttext #LupinIII #luzeni what's the Rebecca x Lupin ship called? #Rebin? #ReLu? #LuBecca?
mix batch of random old ReLu doodles Part2.
🍎 #LuciExpos #RefiLuci #ReLu 🍏
I put together a mix batch of random old ReLu doodles
🍎 #LuciExpos #RefiLuci #ReLu 🍏
A pretty old doodle skit-
Their status: complicated
🍎 #LuciExpos ? #RefiLuci #Relu ?🍏
On Space Folds of ReLU Neural Networks
Michal Lewandowski, Hamid Eghbalzadeh, Bernhard Heinzl, Raphael Pisoni, Bernhard A. Moser
Action editor: Petar Veličković
https://openreview.net/forum?id=RfFqBXLDQk
#cantornet #similarity #relu
What’s the one formula you always need to look up? Comment below and follow for more!
#GradientDescent #NormalDistribution #MachineLearning #DataScience #AI #DeepLearning #NaiveBayes #LinearRegression #SVM #ReLU #Softmax
🍎 #LuciExpos #RefiLuci #ReLu 🍏
Repost (2024)
🍎 #LuciExpos #RefiLuci #ReLu 🍏
Repost (2024)
🍎 #LuciExpos #RefiLuci #ReLu 🍏
Repost (2024)
🍎 #LuciExpos #RefiLuci #ReLu 🍏
Repost (2024)
🍎 #LuciExpos #RefiLuci #ReLu 🍏
Repost (2023)
🍎 #LuciExpos #RefiLuci #ReLu 🍏
Repost (2023)
🔔 #JollyHec #LuciExpos #RefiLuci #ReLu 🍎🍏
Repost (2023)
🍎 #LuciExpos #RefiLuci #ReLu 🍏
Repost (2023)
🍎 #LuciExpos #RefiLuci #ReLu 🍏
Repost (2022)
🔔 #JollyHec
🍎 #LuciExpos #RefiLuci #ReLu 🍏
Repost (2022)
🍎 #LuciExpos #RefiLuci #ReLu 🍏