
Postdoctoral Talk Announcement


Decomposing End-to-End Backpropagation into Small Components with Independent Objective Functions

  • 2021-04-28 (Wed.), 14:00
  • Venue: Conference Room 6005, Institute of Statistical Science, Academia Sinica (Environmental Changes Research Building A)
  • Tea reception: 15:00, Conference Room 6005, Institute of Statistical Science (Environmental Changes Research Building A)
  • Prof. Hung-Hsuan Chen (陳弘軒 教授)
  • Computer Science & Information Engineering, National Central University (中央大學 資訊工程學系)

Abstract

Backpropagation (BP) is the cornerstone of today's deep learning algorithms, but it is inefficient partly because of backward locking: updating the weights of one layer locks the weight updates in the other layers. Consequently, it is challenging to apply parallel computing or a pipeline structure to update the weights in different layers simultaneously. In this talk, I will introduce a learning structure called associated learning (AL), which modularizes the network into smaller components, each of which has a local objective. Because the objectives are mutually independent, AL can learn the parameters in different layers independently and simultaneously, so it is feasible to apply a pipeline structure to improve the training throughput. Surprisingly, even though most of the parameters in AL do not directly interact with the target variable, training deep models by this method yields accuracies comparable to those from models trained using typical BP methods, in which all parameters are used to predict the target variable. Our implementation is available at https://github.com/SamYWK/Associated_Learning.
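To make the core idea concrete, below is a minimal PyTorch sketch of training with local, per-component objectives. It is not the authors' actual AL architecture (see the GitHub repository above for that); the `LocalBlock` module and its auxiliary classification head are hypothetical illustrations. The key mechanism shown is that each block computes its own loss and detaches its output before passing it on, so no gradient crosses component boundaries and backward locking disappears.

```python
import torch
import torch.nn as nn

class LocalBlock(nn.Module):
    """Hypothetical component with its own local objective (auxiliary head)."""

    def __init__(self, in_dim, hidden_dim, num_classes):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(in_dim, hidden_dim), nn.ReLU())
        # Local head: gives this block a loss without end-to-end backprop.
        self.head = nn.Linear(hidden_dim, num_classes)

    def forward(self, x):
        h = self.body(x)
        return h, self.head(h)

# A small stack of independently trained components (dimensions are arbitrary).
blocks = [LocalBlock(784, 256, 10), LocalBlock(256, 128, 10), LocalBlock(128, 64, 10)]
optimizers = [torch.optim.SGD(b.parameters(), lr=0.1) for b in blocks]
criterion = nn.CrossEntropyLoss()

x = torch.randn(32, 784)             # dummy batch (e.g., flattened 28x28 images)
y = torch.randint(0, 10, (32,))      # dummy labels

for block, opt in zip(blocks, optimizers):
    h, logits = block(x)
    loss = criterion(logits, y)      # local loss: gradients stay inside this block
    opt.zero_grad()
    loss.backward()                  # no backward pass through earlier blocks
    opt.step()
    x = h.detach()                   # cut the graph: next block sees a constant input
```

Because each block's update depends only on its own local loss, block i can already process the next micro-batch while block i+1 is still training on the current one, which is what makes a pipelined training schedule feasible.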
