Artificial neural networks have achieved superhuman performance in some respects, such as computational speed, but they fail in one key aspect: when they learn sequentially, new information overwrites previous information, a phenomenon called catastrophic forgetting.
Writing in the November 18, 2022 issue of PLOS Computational Biology, senior author Maxim Bazhenov and colleagues discuss how biological models may help mitigate the threat of catastrophic forgetting in artificial neural networks, boosting their utility across a spectrum of research interests. They found that when spiking networks were trained on a new task, but with occasional off-line periods that mimicked sleep, catastrophic forgetting was mitigated.

Memories are represented in the human brain by patterns of synaptic weight: the strength or amplitude of a connection between two neurons. "Synaptic plasticity, the capacity to be altered or molded, is still in place during sleep, and it can further enhance synaptic weight patterns that represent the memory, helping to prevent forgetting or to enable transfer of knowledge from old to new tasks."

When Bazhenov and colleagues applied this approach to artificial neural networks, they found that it helped the networks avoid catastrophic forgetting.

Original research: "Sleep prevents catastrophic forgetting in spiking neural networks by forming a joint synaptic weight representation" by Maxim Bazhenov et al., PLOS Computational Biology.
Abstract: Artificial neural networks overwrite previously learned tasks when trained sequentially, a phenomenon known as catastrophic forgetting. Here we used a spiking network to study the mechanisms behind catastrophic forgetting and the role of sleep in preventing it. Interleaving new task training with periods of off-line reactivation, mimicking biological sleep, mitigated catastrophic forgetting by constraining the network's synaptic weight state to the previously learned manifold, while allowing the weight configuration to converge towards the intersection of the manifolds representing the old and new tasks. The study reveals a possible strategy of synaptic weight dynamics that the brain applies during sleep to prevent forgetting and optimize learning.
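The paper's spiking-network code is not reproduced here, but the interleaving scheme the abstract describes can be sketched in a few lines: supervised "awake" training on a new task alternated with unsupervised "sleep" phases that replay noisy inputs from the old task, with no old-task labels available during sleep. Everything in the sketch below (the toy tasks, the one-layer logistic unit, the Hebbian-style replay rule in `sleep_phase`) is an illustrative assumption, not the authors' implementation.

```python
# Minimal rate-based sketch of sleep-interleaved sequential training.
# This stands in for the paper's spiking network; the replay rule and
# toy tasks are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(0)

def make_task(n_samples=200, n_in=20):
    """Hypothetical toy task: random inputs with a fixed linear labeling."""
    X = rng.normal(size=(n_samples, n_in))
    w_true = rng.normal(size=n_in)
    y = (X @ w_true > 0).astype(float)
    return X, y

def train_awake(W, X, y, lr=0.05, epochs=20):
    """Supervised 'awake' training: logistic-regression-style delta rule."""
    for _ in range(epochs):
        pred = 1.0 / (1.0 + np.exp(-(X @ W)))   # sigmoid output
        W += lr * X.T @ (y - pred) / len(X)     # gradient step on log-likelihood
    return W

def sleep_phase(W, X_old, lr=0.01, steps=50):
    """'Sleep': unsupervised replay of noisy old-task inputs.
    A Hebbian-style update reinforces the network's existing responses
    to old-task-like inputs; no labels are used. This rule is an
    assumption standing in for the paper's plasticity during sleep."""
    for _ in range(steps):
        x = X_old[rng.integers(len(X_old))] + rng.normal(scale=0.1, size=X_old.shape[1])
        out = 1.0 / (1.0 + np.exp(-(x @ W)))
        W += lr * (out - 0.5) * x               # co-activity-driven update
    return W

def accuracy(W, X, y):
    return np.mean(((X @ W) > 0).astype(float) == y)

n_in = 20
X_a, y_a = make_task(n_in=n_in)   # task A, learned first
X_b, y_b = make_task(n_in=n_in)   # task B, learned second

W = np.zeros(n_in)
W = train_awake(W, X_a, y_a)
print("task A after A-training:", accuracy(W, X_a, y_a))

# Sequential training on task B, interleaved with sleep phases that
# replay only task A inputs (never task A labels).
for _ in range(10):
    W = train_awake(W, X_b, y_b, epochs=2)
    W = sleep_phase(W, X_a)

print("task A after B-training + sleep:", accuracy(W, X_a, y_a))
print("task B after B-training + sleep:", accuracy(W, X_b, y_b))
```

The structural point the sketch is meant to echo is the one from the abstract: during the sleep phases, only reactivation of old input patterns shapes the weights, which is intended to keep the weight configuration near the old task's solution while awake training pulls it toward the new one.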