ECE Seminar: Vincent Y. F. Tan
- Starts: 3:00 pm on Tuesday, October 31, 2017
- Ends: 4:00 pm on Tuesday, October 31, 2017
Vincent Y. F. Tan
Assistant Professor
Dept. of Electrical and Computer Engineering and Dept. of Mathematics
National University of Singapore

Location: PHO 339. Light refreshments will be available outside of PHO 339 at 2:45 pm.

Title: Minimum Rates of Approximate Sufficient Statistics

Abstract: Given a sufficient statistic for a parametric family of distributions, one can estimate the parameter without access to the data. However, the memory or code size for storing the sufficient statistic may nonetheless still be prohibitive. Indeed, for n independent samples drawn from a k-nomial distribution with d = k - 1 degrees of freedom, the length of the code scales as d log n + O(1). In many applications, we may not have a useful notion of sufficient statistics (e.g., when the parametric family is not an exponential family), and we also may not need to reconstruct the generating distribution exactly. By adopting a Shannon-theoretic approach in which we allow a small error in estimating the generating distribution, we construct various approximate sufficient statistics and show that the code length can be reduced to (d/2) log n + O(1). We consider errors measured according to the relative entropy and variational distance criteria. For the code constructions, we leverage Rissanen's minimum description length principle, which yields a non-vanishing error measured according to the relative entropy. For the converse parts, we use Clarke and Barron's formula for the relative entropy between a parametrized distribution and the corresponding mixture distribution. However, this method yields only a weak converse for the variational distance. We develop new techniques to achieve vanishing errors, and we also prove strong converses. The latter means that even if the code is allowed to have a non-vanishing error, its length must still be at least (d/2) log n. This is joint work with Prof. Masahito Hayashi (Nagoya University and National University of Singapore). Details can be found at https://arxiv.org/abs/1612.02542.

Bio: Vincent Y. F. Tan was born in Singapore in 1981. He is an Assistant Professor in the Department of Electrical and Computer Engineering (ECE) and the Department of Mathematics at the National University of Singapore (NUS). He received the B.A. and M.Eng. degrees in Electrical and Information Sciences from Cambridge University in 2005, and the Ph.D. degree in Electrical Engineering and Computer Science (EECS) from the Massachusetts Institute of Technology in 2011. He was a postdoctoral researcher in the Department of ECE at the University of Wisconsin-Madison in 2011 and, following that, a scientist at the Institute for Infocomm Research (I2R), A*STAR, Singapore from 2012 to 2013. His research interests include information theory, machine learning, and statistical signal processing. Dr. Tan has received several awards, including the MIT EECS Jin-Au Kong Outstanding Doctoral Thesis Prize in 2011; the A*STAR Philip Yeo Prize for outstanding achievements in research in 2011; the NUS Young Investigator Award in 2014; and the Singapore National Research Foundation (NRF) Fellowship (Class of 2018). He was also placed on the NUS Faculty of Engineering Teaching Commendation List in 2015 and 2016. He has authored a research monograph titled "Asymptotic Estimates in Information Theory with Non-Vanishing Error Probabilities" in the Foundations and Trends® in Communications and Information Theory series (NOW Publishers). A Senior Member of the IEEE, he served as a member of the IEEE "Machine Learning for Signal Processing" Technical Committee within the IEEE Signal Processing Society. He is currently serving as an Associate Editor for the IEEE Transactions on Communications and the IEEE Transactions on Green Communications and Networking.
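The savings claimed in the abstract can be illustrated with a toy calculation. This sketch only compares the leading d log n and (d/2) log n terms and ignores the O(1) constants, so the function names and numbers below are illustrative, not part of the paper:

```python
import math

def code_length_exact(d: int, n: int) -> float:
    """Leading term (in nats) for storing an exact sufficient
    statistic: d * log n, ignoring the O(1) constant."""
    return d * math.log(n)

def code_length_approx(d: int, n: int) -> float:
    """Leading term (in nats) for an approximate sufficient
    statistic: (d / 2) * log n, ignoring the O(1) constant."""
    return (d / 2) * math.log(n)

# Example: a k-nomial source with k = 5 symbols, so d = k - 1 = 4
# degrees of freedom, observed over n = 10,000 samples.
d, n = 4, 10_000
print(code_length_exact(d, n))   # ~36.8 nats
print(code_length_approx(d, n))  # ~18.4 nats, half the leading term
```

Whatever d and n are, the approximate-statistic code length is half the exact one to leading order, which is the factor-of-two reduction the talk establishes (together with matching strong converses).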
- 8 Saint Mary's St., Boston, MA, Room 339