Humans and neural networks show similar patterns of transfer and interference
nature.com

Main

Continual learning is the ability to acquire multiple tasks in succession. Learning tasks in sequence is challenging because new task acquisition may cause existing knowledge to be overwritten, a phenomenon called catastrophic interference. Artificial neural networks (ANNs) trained with gradient descent are particularly prone to catastrophic interference[1](#ref-CR1 "McCloskey, M. & Cohen, N. J. in Psychology of Learning and Motivation Vol. 24 (ed. Bower, G. H.) 109–165 (Academic, 1989); https://doi.org/10.1016/S0079-7421(08)60536-8"),[2],[3](https://www.nature.com/articles/s41562-025-02318-y#ref-CR3 "French, R. M. C…")
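As a rough illustration of the phenomenon the excerpt describes (this is not code from the paper), the self-contained NumPy sketch below trains a small network with plain gradient descent on one binary task and then on a second, conflicting task; accuracy on the first task typically collapses. The task definitions, network size, and hyperparameters are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_task(shift):
    # Binary labels from a linear boundary; the two tasks use different shifts,
    # so their decision rules conflict on part of the input space.
    X = rng.normal(size=(500, 2))
    y = (X[:, 0] + X[:, 1] > shift).astype(float)
    return X, y

def init_params():
    return {
        "W1": rng.normal(scale=0.5, size=(2, 16)), "b1": np.zeros(16),
        "W2": rng.normal(scale=0.5, size=(16, 1)), "b2": np.zeros(1),
    }

def forward(p, X):
    h = np.tanh(X @ p["W1"] + p["b1"])
    logits = np.clip(h @ p["W2"] + p["b2"], -30, 30).squeeze(-1)
    return h, 1.0 / (1.0 + np.exp(-logits))

def train(p, X, y, lr=0.5, steps=3000):
    # Plain full-batch gradient descent on mean binary cross-entropy.
    for _ in range(steps):
        h, out = forward(p, X)
        err = (out - y) / len(y)                      # dLoss/dlogits
        dh = err[:, None] @ p["W2"].T * (1 - h ** 2)  # backprop through tanh layer
        p["W2"] -= lr * (h.T @ err[:, None])
        p["b2"] -= lr * err.sum()
        p["W1"] -= lr * (X.T @ dh)
        p["b1"] -= lr * dh.sum(axis=0)
    return p

def accuracy(p, X, y):
    return float(((forward(p, X)[1] > 0.5) == y).mean())

XA, yA = make_task(shift=-1.0)    # task A
XB, yB = make_task(shift=+1.5)    # task B, partly contradicting task A

p = train(init_params(), XA, yA)
print("task A accuracy after training on A:", accuracy(p, XA, yA))

p = train(p, XB, yB)              # sequential training: no replay, no regularizer
print("task A accuracy after training on B:", accuracy(p, XA, yA))  # typically drops sharply
```

Because the second task's decision rule contradicts the first's over part of the input space, gradient descent overwrites the weights that encoded task A; nothing in plain sequential training protects them.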
