
Seminars

Large-scale Distributed Machine Learning

  • 2019-01-14 (Mon.), 10:30 AM
  • R6005, Research Center for Environmental Changes Building
  • Mr. Ching-pei Lee
  • Department of Computer Sciences, University of Wisconsin-Madison, WI, USA.

Abstract

Machine learning frequently deals with data volumes exceeding the capacity of a single machine. To keep the training data in memory for faster access, multiple machines are often used to train machine learning models in a distributed manner. In this scenario, the per-machine computational burden is reduced, but the expensive inter-machine communication becomes the bottleneck to further accelerating training. In this talk, I will introduce state-of-the-art distributed training algorithms for various machine learning tasks broadly covered by the regularized empirical risk minimization problem, including but not limited to binary and multi-class classification, regression, outlier detection, and feature selection.
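
For reference, the regularized empirical risk minimization problem mentioned in the abstract is commonly written in the following generic form (a standard formulation, not notation taken from the talk itself; here \ell denotes the loss, g the regularizer, and \lambda the regularization weight):

\[
\min_{w \in \mathbb{R}^d} \; \frac{1}{n} \sum_{i=1}^{n} \ell(w; x_i, y_i) + \lambda\, g(w)
\]

Different choices of \ell and g recover the tasks listed above, e.g. logistic or hinge loss for classification, squared loss for regression, and an \ell_1 regularizer for feature selection. In the distributed setting, the n training examples are partitioned across machines, each machine computes its local portion of the sum, and the partial results are aggregated over the network, which is where the communication cost discussed in the talk arises.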
