Mar 19, 2026

Why Adam May Be Hurting Your Neural Network’s Memory

This study shows that optimizer choice has a major impact on catastrophic forgetting in neural networks, with SGD consistently retaining prior-task performance better than Adam and RMSProp. While hyperparameters influence outcomes, the optimizer itself plays the larger role. The findings also reveal that commonly used metrics such as activation overlap may not reliably explain forgetting, highlighting the need for multi-metric evaluation, especially retention and relearning, in continual learning systems.
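To make the retention measurement concrete, here is a minimal sketch of the kind of experiment the study describes: train on one task, train on a second, and check how much accuracy on the first task survives under each optimizer. Everything here (the synthetic tasks, model size, learning rates, epoch counts) is an illustrative assumption, not the article's experimental setup.

```python
# Minimal sketch: catastrophic forgetting under SGD vs. Adam on two
# sequential synthetic tasks. All data and hyperparameters are
# illustrative assumptions, not the study's actual configuration.
import torch
import torch.nn as nn

torch.manual_seed(0)

def make_task(shift):
    # Synthetic binary classification; each task shifts the input mean.
    x = torch.randn(512, 20) + shift
    y = (x.sum(dim=1) > shift * 20).long()
    return x, y

def accuracy(model, x, y):
    with torch.no_grad():
        return (model(x).argmax(dim=1) == y).float().mean().item()

def train(model, opt, x, y, epochs=50):
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()

def measure_forgetting(opt_name):
    model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 2))
    if opt_name == "sgd":
        opt = torch.optim.SGD(model.parameters(), lr=0.05)
    else:
        opt = torch.optim.Adam(model.parameters(), lr=1e-3)

    task_a, task_b = make_task(0.0), make_task(2.0)
    train(model, opt, *task_a)
    acc_before = accuracy(model, *task_a)  # retention baseline on task A
    train(model, opt, *task_b)             # train sequentially on task B
    acc_after = accuracy(model, *task_a)   # retention on task A afterward
    return acc_before, acc_after

for name in ("sgd", "adam"):
    before, after = measure_forgetting(name)
    print(f"{name}: task-A accuracy {before:.2f} -> {after:.2f} "
          f"(forgetting {before - after:+.2f})")
```

The drop in task-A accuracy after training on task B is the retention metric the article emphasizes; a relearning metric would additionally re-train briefly on task A and record how quickly the original accuracy is recovered.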

Source: HackerNoon

