TIGP (BIO)—Information theory framework for one-shot synaptic weight update in a real brain
- 2024-11-28 (Thu.), 14:00
- B1 Auditorium, Institute of Statistical Science; in-person talk, no online streaming
- Talk in English | Speaker bio in the attachment below
- Dr. Ching-Lung Hsu (徐經倫, Assistant Research Fellow)
- Institute of Biomedical Sciences, Academia Sinica
Abstract
Maximizing mutual information transmission is a first principle useful for neural-network computations that aim to achieve appropriate input-output relationships. Today's brain-inspired learning theories are built on local correlations between presynaptic and postsynaptic activation (so-called Hebbian plasticity), and non-Hebbian mechanisms play no role in these formative frameworks.
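As a minimal illustration of the quantity invoked here, the sketch below computes mutual information I(X; Y) for a small discrete joint distribution. The numbers and the function name are purely illustrative, not taken from the talk.

```python
import math

def mutual_information(joint):
    # joint[i][j] = p(x_i, y_j); returns I(X; Y) in bits
    nx, ny = len(joint), len(joint[0])
    px = [sum(joint[i]) for i in range(nx)]                       # marginal p(x)
    py = [sum(joint[i][j] for i in range(nx)) for j in range(ny)] # marginal p(y)
    mi = 0.0
    for i in range(nx):
        for j in range(ny):
            p = joint[i][j]
            if p > 0:
                mi += p * math.log2(p / (px[i] * py[j]))
    return mi

# A noiseless binary channel transmits exactly 1 bit:
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))  # → 1.0
```

A network whose input-output mapping preserves more of this quantity transmits more information about its inputs, which is the sense in which the principle can guide learning rules.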
However, I will argue, based on our recent observations of real brain circuits, that non-Hebbian plasticity may contribute importantly to the remarkable one-shot learning seen in a memory region called the hippocampus. We performed experiments using single-cell electrophysiology in brain slices and in mice fictively navigating in virtual reality. The findings showed that the physiological mechanisms of the plasticity algorithm support a three-factor learning rule, with complementary Hebbian and non-Hebbian components, which can be formally derived from the principles of information theory. We hope these results will help update the brain-inspired understanding of learning rules in biological and artificial neural networks. Biologically, hippocampal networks may perform both supervised and unsupervised learning, and this new perspective provides a specific, plausible implementation for both scenarios.
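The general shape of a three-factor learning rule can be sketched as follows: the weight update combines presynaptic and postsynaptic activity (the Hebbian factors) with a third, non-Hebbian modulatory signal that gates when plasticity occurs. The function, parameter names, and dynamics below are hypothetical illustrations of this class of rule, not the speaker's actual model.

```python
def three_factor_update(w, pre, post, modulator, lr=0.1):
    # Hebbian correlation (pre * post) gated by a third, non-Hebbian
    # factor (e.g., a plateau-potential-like signal); illustrative only.
    return w + lr * pre * post * modulator

w = 0.5
# Without the third factor, coincident pre/post activity changes nothing:
w_no_mod = three_factor_update(w, pre=1.0, post=1.0, modulator=0.0)
# With the gating signal present, a single pairing already shifts the
# weight, consistent with a one-shot update:
w_mod = three_factor_update(w, pre=1.0, post=1.0, modulator=1.0)
print(w_no_mod, w_mod)  # → 0.5 0.6
```

The gating structure is what lets such a rule support both supervised-like learning (when the third factor carries an instructive signal) and unsupervised, correlation-driven learning (when it is tonically present).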