Artificial neural networks learn best when they spend time learning nothing

Summary: “Offline” periods during AI training mitigated “catastrophic forgetting” in artificial neural networks, mimicking the learning benefits sleep provides in the human brain.

Source: UCSD

Depending on age, humans need 7 to 13 hours of sleep every 24 hours. During this time, a lot happens: heart rate, breathing, and metabolism ebb and flow; hormone levels adjust; the body relaxes. Not so much in the brain.

“The brain is very busy when we sleep, replaying what we’ve learned throughout the day,” said Maxim Bazhenov, PhD, professor of medicine and sleep researcher at the University of California San Diego School of Medicine. “Sleep helps reorganize memories and presents them in the most efficient way.”

In previously published work, Bazhenov and colleagues reported how sleep builds relational memory, the ability to recall arbitrary or indirect associations between objects, people, or events, and protects against forgetting old memories.

Artificial neural networks take advantage of the architecture of the human brain to improve numerous technologies and systems, from basic science and medicine to finance and social media. In some ways, they have achieved superhuman performance, such as computational speed, but they fail in one key respect: when artificial neural networks learn sequentially, new information overwrites old information, a phenomenon called catastrophic forgetting.

“In contrast, the human brain continually learns and incorporates new data into existing knowledge,” Bazhenov said, “and typically learns best when new training is interspersed with periods of sleep for memory consolidation.”

Writing in the November 18, 2022 issue of PLOS Computational Biology, senior author Bazhenov and colleagues discuss how biological models can help mitigate the threat of catastrophic forgetting in artificial neural networks, increasing their utility across a spectrum of research interests.

The scientists used spiking neural networks that artificially mimic natural neural systems: instead of information being communicated continuously, it is transmitted as discrete events (spikes) at certain points in time.
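To make the idea of discrete spikes concrete, here is a minimal sketch of a leaky integrate-and-fire (LIF) neuron, a standard building block of spiking networks. The constants (threshold, time constant, input range) are illustrative placeholders, not parameters from the study.

```python
import numpy as np

# Minimal leaky integrate-and-fire (LIF) neuron. All constants are
# illustrative defaults, not parameters from the study.
def simulate_lif(input_current, dt=1.0, tau=20.0, v_rest=0.0,
                 v_thresh=1.0, v_reset=0.0):
    """Return a binary spike train for a given input-current trace."""
    v = v_rest
    spikes = np.zeros_like(input_current)
    for t, i_t in enumerate(input_current):
        # Membrane potential leaks toward rest and integrates input.
        v += (dt / tau) * (v_rest - v) + i_t
        if v >= v_thresh:        # threshold crossing -> discrete event
            spikes[t] = 1.0      # the neuron "spikes" at this time step
            v = v_reset          # and resets
    return spikes

current = np.random.uniform(0.0, 0.12, size=200)  # toy input trace
print(int(simulate_lif(current).sum()), "spikes in 200 steps")
```

Between spikes, nothing is transmitted at all, which is what distinguishes this scheme from the continuous activations of conventional artificial neural networks.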

They found that when the spiking networks were trained on a new task, but with occasional offline periods mimicking sleep, catastrophic forgetting was mitigated. As in the human brain, the study authors said, “sleep” allowed the networks to replay old memories without explicitly using old training data.
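The training schedule can be pictured as alternating “awake” and “sleep” phases. The toy loop below illustrates only that interleaved structure, using a made-up one-layer model and a generic Hebbian replay rule; it is not the paper’s algorithm, and every name and constant in it is a placeholder.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy one-layer model trained with a local rule. This sketch only
# illustrates the *schedule* of interleaving "awake" training on a new
# task with offline "sleep" phases; it is not the study's algorithm.
w = rng.normal(0.0, 0.1, size=16)

def wake_step(w, x, target, lr=0.05):
    """Supervised update on a labeled example (the awake phase)."""
    y = np.tanh(w @ x)
    return w + lr * (target - y) * x

def sleep_step(w, lr=0.01, noise=0.5):
    """Offline phase: no training data. Noise drives spontaneous
    activity, the network replays its own responses, and a Hebbian
    update reinforces the weights behind them (real models also
    bound or normalize weights)."""
    x = rng.normal(0.0, noise, size=w.size)  # spontaneous input
    y = np.tanh(w @ x)                        # self-generated "replay"
    return w + lr * y * x                     # Hebbian reinforcement

# Interleave: a few new-task examples, then a sleep-like phase.
for epoch in range(10):
    for _ in range(20):                       # "awake": new-task data
        x = rng.normal(size=16)
        w = wake_step(w, x, target=np.sign(x[0]))
    for _ in range(50):                       # "sleep": replay, no labels
        w = sleep_step(w)

print("final weight norm:", round(float(np.linalg.norm(w)), 3))
```

The key point of the schedule is visible in the loop: the sleep phase touches no stored examples from the old task, yet still updates the same weights the old task shaped.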

Memories are represented in the human brain by patterns of synaptic weight: the strength or amplitude of a connection between two neurons.

“When we learn new information,” Bazhenov said, “neurons fire in a specific order, and this strengthens the synapses between them. During sleep, the spike patterns learned during our waking state are spontaneously repeated. This is called reactivation or replay.


“Synaptic plasticity, the capacity to be altered or molded, is still present during sleep, and it can further enhance the synaptic weight patterns that represent memory, helping to prevent forgetting or to enable the transfer of knowledge from old tasks to new.”
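One common way to formalize the order-dependent strengthening Bazhenov describes is spike-timing-dependent plasticity (STDP): a synapse grows when the presynaptic neuron fires shortly before the postsynaptic one, and weakens for the reverse order. The sketch below uses generic textbook constants; the study’s actual plasticity rule may differ.

```python
import numpy as np

# Spike-timing-dependent plasticity (STDP), a common formalization of
# "neurons fire in a specific order, strengthening the synapses between
# them." Constants are generic textbook values, not the study's.
def stdp_dw(dt_ms, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Weight change for a pre-to-post spike-time difference in ms."""
    if dt_ms > 0:                              # pre fired before post
        return a_plus * np.exp(-dt_ms / tau)   # -> potentiation
    return -a_minus * np.exp(dt_ms / tau)      # post before pre -> depression

# Replaying a learned firing order during sleep repeats the same
# pre-before-post timings, nudging those synapses upward again.
for dt in (5.0, 15.0, -5.0):
    print(f"dt = {dt:+.0f} ms -> dw = {stdp_dw(dt):+.5f}")
```

Under such a rule, replaying a learned spike sequence during a sleep phase reproduces the same timing relationships, and thus reinforces the same weights, without any external input.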

When Bazhenov and his colleagues applied this approach to artificial neural networks, they found that it helped the networks avoid catastrophic forgetting.

“It meant that these networks could learn continuously, just like humans or animals. Understanding how the human brain processes information during sleep can help augment memory in human subjects. Augmenting sleep rhythms can lead to better memory.

“In other projects, we use computational models to develop optimal strategies for delivering stimulation during sleep, such as auditory tones, that enhance sleep rhythms and improve learning. This may be particularly important when memory is suboptimal, such as when memory declines with aging or in conditions like Alzheimer’s disease.”

Co-authors include: Ryan Golden and Jean Erik Delanois, both from UC San Diego; and Pavel Sanda, Institute of Computer Science of the Czech Academy of Sciences.

About this AI and learning research news

Author: Scott LaFee
Source: UCSD
Contact: Scott LaFee, UCSD
Image: The image is in the public domain.


Original Research: Open access.
“Sleep prevents catastrophic forgetting in spiking neural networks by forming a joint synaptic weight representation” by Maxim Bazhenov et al. PLOS Computational Biology


Summary

Sleep prevents catastrophic forgetting in spiking neural networks by forming a joint synaptic weight representation

Artificial neural networks overwrite previously learned tasks when trained sequentially, a phenomenon known as catastrophic forgetting. In contrast, the brain learns continuously, and typically learns best when new training is interleaved with periods of sleep for memory consolidation.

Here we used a spiking network to study the mechanisms behind catastrophic forgetting and the role of sleep in preventing it.

The network could be trained to learn a complex foraging task, but exhibited catastrophic forgetting when trained sequentially on different tasks. In synaptic weight space, training on a new task moved the synaptic weight configuration away from the manifold representing the previous task, leading to forgetting.

Interleaving new task training with periods of offline reactivation, mimicking biological sleep, mitigated catastrophic forgetting by constraining the network’s synaptic weight state to the previously learned manifold, while allowing the weight configuration to converge toward the intersection of the manifolds representing old and new tasks.

The study reveals a possible synaptic weight dynamics strategy that the brain applies during sleep to prevent forgetting and optimize learning.
