Artificial Neural Networks Learn Better When They Spend Time Not Learning at All - Neuroscience News

Nov 20, 2022

Summary: “Off-line” periods during AI training mitigated “catastrophic forgetting” in artificial neural networks, mimicking the learning benefits sleep provides in the human brain.

Artificial neural networks leverage the architecture of the human brain to improve numerous technologies and systems, from basic science and medicine to finance and social media.

In some ways, they have achieved superhuman performance, such as computational speed, but they fail in one key respect: when artificial neural networks learn sequentially, new information overwrites previous information, a phenomenon called catastrophic forgetting.
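The effect is easy to reproduce even outside spiking models. Below is a minimal sketch (my own toy construction, not the paper's architecture): a single linear layer trained with gradient descent on one task and then on a second, conflicting task, after which the first task is lost.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy illustration: one linear layer trained sequentially on two tasks.
def train(W, X, Y, lr=0.1, steps=300):
    # Full-batch gradient descent on mean squared error.
    for _ in range(steps):
        W = W - lr * (W @ X - Y) @ X.T / X.shape[1]
    return W

def mse(W, X, Y):
    return float(np.mean((W @ X - Y) ** 2))

X_a = rng.normal(size=(2, 50)); Y_a = X_a                # task A: identity map
X_b = rng.normal(size=(2, 50)); Y_b = X_b[::-1].copy()   # task B: swap the two inputs

W = train(np.zeros((2, 2)), X_a, Y_a)
err_a_before = mse(W, X_a, Y_a)    # task A learned: error near zero

W = train(W, X_b, Y_b)             # sequential training on task B...
err_a_after = mse(W, X_a, Y_a)     # ...overwrites task A: error jumps
print(err_a_before, err_a_after)
```

The second round of training reuses the same weights, so the gradient steps for task B pull them entirely toward task B's solution; nothing in plain gradient descent protects the old task.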

Writing in the November 18, 2022 issue of PLOS Computational Biology, senior author Maxim Bazhenov and colleagues discuss how biological models may help mitigate the threat of catastrophic forgetting in artificial neural networks, boosting their utility across a spectrum of research interests.

They found that when spiking neural networks were trained on a new task, but with occasional off-line periods that mimicked sleep, catastrophic forgetting was mitigated.

Memories are represented in the human brain by patterns of synaptic weight — the strength or amplitude of a connection between two neurons.

“Synaptic plasticity, the capacity to be altered or molded, is still in place during sleep and it can further enhance synaptic weight patterns that represent the memory, helping to prevent forgetting or to enable transfer of knowledge from old to new tasks.”

When Bazhenov and colleagues applied this approach to artificial neural networks, they found that it helped the networks avoid catastrophic forgetting.

“Sleep prevents catastrophic forgetting in spiking neural networks by forming a joint synaptic weight representation” by Maxim Bazhenov et al. PLOS Computational Biology


Artificial neural networks overwrite previously learned tasks when trained sequentially, a phenomenon known as catastrophic forgetting.

Here we used a spiking network to study the mechanisms behind catastrophic forgetting and the role of sleep in preventing it.

Interleaving new task training with periods of off-line reactivation, mimicking biological sleep, mitigated catastrophic forgetting by constraining the network synaptic weight state to the previously learned manifold, while allowing the weight configuration to converge towards the intersection of the manifolds representing old and new tasks.

The study reveals a possible strategy of synaptic weight dynamics that the brain applies during sleep to prevent forgetting and optimize learning.
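The interleaving idea can be sketched with a toy linear model. In the sketch below (my own simplification: the "sleep" phases are approximated by supervised replay of stored task-A examples, a stand-in for the unsupervised spiking reactivation the paper actually uses), the two tasks share a joint solution, and interleaved off-line replay lets the weights converge toward it instead of abandoning the old task.

```python
import numpy as np

rng = np.random.default_rng(1)
n, noise = 200, 0.5

# Two tasks that share a joint solution: each task's signal lives in a
# different half of a 4-d input; the other half carries only noise.
z_a = rng.normal(size=(2, n)); X_a = np.vstack([z_a, noise * rng.normal(size=(2, n))]); Y_a = z_a
z_b = rng.normal(size=(2, n)); X_b = np.vstack([noise * rng.normal(size=(2, n)), z_b]); Y_b = z_b

def step(W, X, Y, lr=0.1):
    # One full-batch gradient step on mean squared error.
    return W - lr * (W @ X - Y) @ X.T / X.shape[1]

def mse(W, X, Y):
    return float(np.mean((W @ X - Y) ** 2))

# Sequential training: task A, then task B -> task A is forgotten.
W = np.zeros((2, 4))
for _ in range(500): W = step(W, X_a, Y_a)
for _ in range(500): W = step(W, X_b, Y_b)
seq_err = mse(W, X_a, Y_a)

# Interleaved: each task-B step is followed by an "off-line" replay step
# on task A, loosely mimicking the sleep phases in the study.
W = np.zeros((2, 4))
for _ in range(500): W = step(W, X_a, Y_a)
for _ in range(500):
    W = step(W, X_b, Y_b)
    W = step(W, X_a, Y_a)   # sleep-like replay phase
print(seq_err, mse(W, X_a, Y_a))
```

Because a weight configuration satisfying both tasks exists, the interleaved updates settle near the intersection of the two tasks' solution manifolds, keeping task-A error low, whereas purely sequential training drifts off the task-A manifold entirely.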

