Sample Complexity and Uniform Convergence
When: Thursday, December 13, 2018
Where: PGH 232
Time: 9:30 AM
Speaker: Prof. Eli Upfal, Brown University
Host: Dr. Gopal Pandurangan
Sampling is a powerful technique at the core of statistical data analysis and machine learning. Using a finite, often small, set of observations, we attempt to estimate properties of an entire sample space. How good are the estimates obtained from a sample? Any rigorous application of sampling requires an understanding of the sample complexity of the problem: the minimum sample size needed to obtain the required guarantees. In this tutorial we will cover some of the rich mathematical theory that has been developed in recent years to address this question, in particular in the context of statistical machine learning and rigorous data mining.
Main topics: Uniform convergence; VC-dimension and the ϵ-net and ϵ-sample theorems; Rademacher complexity; applications in machine learning and data mining.
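As a flavor of the kind of sample-complexity bound the tutorial discusses, the following minimal sketch (not from the talk itself) uses the Hoeffding inequality to compute how many independent samples of a [0, 1]-valued random variable suffice for the empirical mean to be within ϵ of the true mean with probability at least 1 − δ, then checks it with a small Monte Carlo estimate:

```python
import math
import random

def hoeffding_sample_size(eps: float, delta: float) -> int:
    """Number of i.i.d. samples of a [0, 1]-valued random variable needed
    so that the empirical mean is within eps of the true mean with
    probability at least 1 - delta, by the Hoeffding inequality:
    n >= ln(2 / delta) / (2 * eps^2)."""
    return math.ceil(math.log(2.0 / delta) / (2.0 * eps ** 2))

# Example: estimate the mean of a Bernoulli(0.3) variable (the true mean,
# 0.3, is an arbitrary illustrative choice).
eps, delta = 0.05, 0.01
n = hoeffding_sample_size(eps, delta)   # 1060 samples suffice here

random.seed(0)
estimate = sum(random.random() < 0.3 for _ in range(n)) / n
```

Note that this bound covers a single fixed estimate; uniform convergence over an infinite family of functions, where VC-dimension and Rademacher complexity enter, is exactly what makes the question harder.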
Eli Upfal is the Rush C. Hawkins Professor of Computer Science at Brown University, where he also served as department chair from 2002 to 2007. Prior to joining Brown in 1998, he was a researcher and project manager at the IBM Almaden Research Center in California, and a professor of Applied Mathematics and Computer Science at the Weizmann Institute of Science in Israel. Upfal's research focuses on the design and analysis of algorithms. In particular, he is interested in randomized algorithms, probabilistic analysis of algorithms, and computational statistics, with applications ranging from combinatorial and stochastic optimization to routing and communication networks, computational biology, and computational finance. He has published over 200 research papers in scientific journals and conferences. He is co-author of a popular textbook, Probability and Computing: Randomized Algorithms and Probabilistic Analysis (with M. Mitzenmacher, Cambridge University Press 2005, 2017), and holds 13 US patents. Professor Upfal is a Fellow of the ACM and the IEEE.