TIGP (BIO)—Information theory framework for one-shot synaptic weight update in a real brain
- 2024-11-28 (Thu.), 14:00
- Auditorium, B1F, Institute of Statistical Science. In-person seminar, no online stream available.
- Delivered in English. Speaker bio: please see the attachment below
- Dr. Ching-Lung Hsu
- Institute of Biomedical Sciences, Academia Sinica
Abstract
Maximizing mutual information transmission is a first principle for neural-network computations that aim to achieve appropriate input-output relationships. Today's brain-inspired learning theories rest on local correlations between presynaptic and postsynaptic activation (so-called Hebbian plasticity), and non-Hebbian mechanisms play no role in these formative frameworks.
However, I will argue, based on our recent observations of real brain circuits, that non-Hebbian plasticity may contribute substantially to the remarkable one-shot learning in a memory region called the hippocampus. We performed experiments using single-cell electrophysiology, brain slices, and mice fictively navigating in virtual reality. The findings show that the physiological mechanisms of this plasticity algorithm support a three-factor learning rule, combining complementary Hebbian and non-Hebbian components, which can be formally derived from first principles of information theory. We hope this result will help update the brain-inspired understanding of learning rules in biological and artificial neural networks. Biologically, hippocampal networks may carry out both supervised and unsupervised learning, and this new perspective provides a specific, plausible implementation for both scenarios.
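To make the distinction concrete, here is a minimal, purely illustrative sketch (not the speaker's model) contrasting a classic two-factor Hebbian weight update, which depends only on local pre- and postsynaptic activity, with a three-factor rule in which the same local correlation is gated by a third, non-Hebbian signal (for example, a global instructive signal). The array sizes, learning rate, and modulatory value are all arbitrary assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

pre = rng.random(5)    # presynaptic activity vector (hypothetical)
post = rng.random(3)   # postsynaptic activity vector (hypothetical)
eta = 0.1              # learning rate (arbitrary)

# Two-factor (Hebbian) update: driven solely by the local
# correlation of pre- and postsynaptic activity.
dw_hebbian = eta * np.outer(post, pre)

# Three-factor update: the same Hebbian correlation term is
# multiplicatively gated by a third, non-Hebbian factor m,
# standing in for an instructive signal.
m = 0.8                # hypothetical instructive signal strength
dw_three_factor = eta * m * np.outer(post, pre)

print(dw_hebbian.shape)  # weight-update matrix: (post, pre) = (3, 5)
```

In this toy form, setting m = 0 suppresses plasticity entirely regardless of local activity, which is the key qualitative difference a third factor introduces over a purely Hebbian rule.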