Talk: Exponentially Faster Algorithms for Machine Learning (Yaron Singer, Harvard University)
- Starts: 11:00 am on Friday, November 9, 2018
- Ends: 1:00 pm on Friday, November 9, 2018
Many iterative methods in optimization are inherently sequential and consequently cannot be efficiently parallelized. In this talk I'll describe a novel approach called adaptive sampling that yields algorithms whose parallel running time is exponentially faster than that of any previously known algorithm for a broad range of machine learning applications. The algorithms are designed for submodular function maximization, which is the algorithmic engine behind machine learning applications such as ranking, speech and document summarization, recommendation systems, clustering, Bayesian inference, feature selection, network analysis, and many others. In the talk I'll introduce the concept of adaptivity and the adaptive sampling framework we recently developed, and present experimental results from various application domains.

Yaron Singer is an Assistant Professor of Computer Science at Harvard University. He was previously a postdoctoral researcher at Google Research and obtained his PhD from UC Berkeley. He is the recipient of the NSF CAREER award, a Sloan Fellowship, a Facebook Faculty Award, a Google Faculty Award, the 2012 Best Student Paper Award at the ACM Conference on Web Search and Data Mining, the 2010 Facebook Graduate Fellowship, and the 2009 Microsoft Research PhD Fellowship.
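To illustrate why such methods are sequential, here is a minimal sketch (not the speaker's adaptive sampling algorithm) of the classic greedy algorithm for monotone submodular maximization under a cardinality constraint, using maximum coverage as the objective; the example sets and function names are illustrative assumptions. Each iteration's choice depends on all previous choices, which is precisely the sequential bottleneck that adaptivity measures.

```python
def greedy_max_coverage(sets, k):
    """Greedily pick up to k sets to maximize coverage (a submodular objective)."""
    covered = set()
    chosen = []
    for _ in range(k):
        # The marginal gain of each candidate depends on everything covered
        # so far, so this iteration cannot start before the previous one ends.
        best = max(sets, key=lambda name: len(sets[name] - covered))
        if not sets[best] - covered:
            break  # no candidate adds new elements
        chosen.append(best)
        covered |= sets[best]
    return chosen, covered

# Illustrative input: three sets over a small universe.
example_sets = {
    "a": {1, 2, 3},
    "b": {3, 4},
    "c": {4, 5, 6},
}
picked, covered = greedy_max_coverage(example_sets, k=2)
# picked == ["a", "c"], covering all of {1, 2, 3, 4, 5, 6}
```

The greedy algorithm needs k sequential rounds; adaptive sampling, the subject of the talk, evaluates many candidates in parallel per round to reduce the number of such rounds exponentially.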
- Location: Hariri Seminar Room