Institute of Statistical Science, Academia Sinica [Seminar Feed]
http://www.stat.sinica.edu.tw (feed: http://www.stat.sinica.edu.tw/statnewsite/seminar/rss/)
Wed, 14 Apr 2021 14:50:13 +0800

Modeling for natural history of hepatitis
http://www.stat.sinica.edu.tw/statnewsite/seminar/show/2504/

Abstract

The natural history of a human disease comprises several milestones of sequential events, each of which is time-to-event in nature. For example, from hepatitis C infection to death, patients may experience intermediate events such as liver cirrhosis and liver cancer. The events of hepatitis, cirrhosis, cancer, and death occur in a sequential order and are subject to right censoring; moreover, later events may mask earlier ones. By casting the natural history of human diseases in the framework of causal mediation modeling, we set up a mediation model with the intermediate events as mediators and a terminal event as the outcome. By introducing counterfactual counting processes, we define the intervention analogue of path-specific effects (iPSEs) as the effect of the exposure on the terminal event mediated by any combination of the intermediate events, including the path through none of them. We derive an expression for the counterfactual hazard, construct a composite nonparametric likelihood, and obtain a Nelson-Aalen-type estimator of the counterfactual hazard. We establish asymptotic unbiasedness, uniform consistency, and weak convergence for the proposed estimators. Numerical studies, including simulations and an application to a hepatitis study conducted in Taiwan, illustrate the finite-sample performance and utility of the proposed method.
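The Nelson-Aalen-type estimator mentioned above builds on the classical Nelson-Aalen estimator of a cumulative hazard, H(t) = Σ_{t_i ≤ t} d_i / n_i, where d_i is the number of events at time t_i and n_i the number at risk just before t_i. A minimal sketch of the classical (non-counterfactual) version follows; the function name `nelson_aalen` is chosen for illustration and is not from the talk:

```python
import numpy as np

def nelson_aalen(times, events):
    """Nelson-Aalen estimator of the cumulative hazard.

    times  : observed times (event or right-censoring)
    events : indicators (1 = event observed, 0 = right-censored)
    Returns the distinct event times and the cumulative hazard at each.
    """
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    order = np.argsort(times)
    times, events = times[order], events[order]

    event_times, hazards = [], []
    cum = 0.0
    n = len(times)
    i = 0
    while i < n:
        t = times[i]
        at_risk = n - i          # subjects at risk just before t
        d = 0                    # events occurring exactly at t
        j = i
        while j < n and times[j] == t:
            d += events[j]
            j += 1
        if d > 0:
            cum += d / at_risk   # hazard increment d_i / n_i
            event_times.append(t)
            hazards.append(cum)
        i = j
    return np.array(event_times), np.array(hazards)
```

The counterfactual version in the talk replaces the observed counting processes with counterfactual ones, but the jump structure of the estimator is of the same type.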

Thu, 8 Apr 2021 09:23:08 +0800
Decomposing End-to-End Backpropagation into Small Components with Independent Objective Functions http://www.stat.sinica.edu.tw/statnewsite/seminar/show/2495/

Abstract

Backpropagation (BP) is the cornerstone of today's deep learning algorithms, but it is inefficient partly because of backward locking: updating the weights of one layer locks the weight updates in the other layers. Consequently, it is difficult to apply parallel computing or a pipeline structure to update the weights in different layers simultaneously. In this talk, I will introduce a learning structure called associated learning (AL), which modularizes the network into smaller components, each with a local objective. Because the objectives are mutually independent, AL can learn the parameters in different layers independently and simultaneously, so a pipeline structure can be applied to improve the training throughput. Surprisingly, even though most of the parameters in AL do not directly interact with the target variable, training deep models by this method yields accuracies comparable to those of models trained with standard BP, in which all parameters are used to predict the target variable. Our implementation is available at https://github.com/SamYWK/Associated_Learning
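The key idea above is that each component has its own objective, so no gradient needs to travel backward across component boundaries. As a loose illustration of local-objective training (a simplified sketch, not the AL architecture from the talk), the code below stacks two blocks that each fit an auxiliary head to the target using only local gradients; `LocalModule`, the toy data, and all hyperparameters are invented for this example:

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

class LocalModule:
    """One network block with its own local objective.

    Trains a hidden layer W and an auxiliary head V so that
    V @ relu(W @ x) approximates the target. Gradients never flow
    to earlier modules: the input x is treated as fixed data.
    """
    def __init__(self, d_in, d_hidden, d_out, lr=0.02):
        self.W = rng.normal(0.0, 0.5, (d_hidden, d_in))
        self.V = rng.normal(0.0, 0.5, (d_out, d_hidden))
        self.lr = lr

    def forward(self, x):
        return relu(self.W @ x)

    def local_step(self, x, y):
        h = self.W @ x            # pre-activation
        a = relu(h)
        yhat = self.V @ a
        err = yhat - y            # gradient of 0.5 * MSE w.r.t. yhat
        # local gradients only; nothing propagates to previous modules
        gV = np.outer(err, a)
        gh = (self.V.T @ err) * (h > 0)
        gW = np.outer(gh, x)
        self.V -= self.lr * gV
        self.W -= self.lr * gW

def dataset_loss(m1, m2, xs, ys):
    total = 0.0
    for x, y in zip(xs, ys):
        h = m1.forward(np.array([x]))
        yhat = m2.V @ relu(m2.W @ h)
        total += 0.5 * float((yhat[0] - y) ** 2)
    return total / len(xs)

# toy regression: learn y = sin(3x) with two stacked local modules
xs = np.linspace(-1, 1, 64)
ys = np.sin(3 * xs)

m1 = LocalModule(1, 8, 1)
m2 = LocalModule(8, 8, 1)

loss_before = dataset_loss(m1, m2, xs, ys)
for _ in range(500):
    for x, y in zip(xs, ys):
        xv, yv = np.array([x]), np.array([y])
        m1.local_step(xv, yv)     # module 1 updates from its own loss
        h = m1.forward(xv)        # fixed input for module 2 (no grad flow)
        m2.local_step(h, yv)
loss_after = dataset_loss(m1, m2, xs, ys)
```

Because each `local_step` touches only its own module's weights, the two modules could in principle be updated on different devices in a pipeline, which is the efficiency argument made in the talk.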

Wed, 24 Feb 2021 18:00:39 +0800
TBD: http://www.stat.sinica.edu.tw/statnewsite/seminar/show/2502/ (Thu, 18 Mar 2021 09:18:45 +0800)
TBD: http://www.stat.sinica.edu.tw/statnewsite/seminar/show/2496/ (Wed, 24 Feb 2021 18:06:17 +0800)