# Algorithms and Theory Seminar

Weekly Algorithms and Theory seminars are held in the Hariri seminar room on Mondays from 1 pm to 2 pm during the Spring 2020 semester, with sandwiches available at 12:45 pm. Below is a list of seminar dates with information on the speakers:

| Date | Speaker | Talk Title |
| --- | --- | --- |
| January 27 | Yannai A. Gonczarowski (Microsoft Research) | The Menu-Size of Approximately Optimal Auctions |
| February 3 | Jinshuo Dong (University of Pennsylvania) | How private are private algorithms? – A privacy CLT |
| February 10 | Maryam Aliakbarpour (MIT) | Distribution testing and its new extensions |
| February 24 | Marika Swanberg | A look at history independence: challenges, solutions, and applications |
| March 16 | Meghna Sengupta | |
| March 23 | Nir Weinberger (MIT) | |
| March 30 | Andrew Suh | |
| April 6 | Gavin Brown | |
| April 13 | | |
| April 22 (Wed) | | |
| April 27 | | |

### Abstracts

Speaker: Yannai A. Gonczarowski

Title: The Menu-Size of Approximately Optimal Auctions

Abstract: How large a menu must an auction offer to extract almost all of the optimal revenue when selling n goods, and how does that menu size depend on n and on the desired approximation ε? In this talk, we resolve both of these questions. We first show that for every n and every ε>0, there exists a menu-size bound C=C(n,ε) such that auctions of menu size at most C suffice for extracting a (1-ε) fraction of the optimal revenue from any valuation distributions, and we give an upper bound on C(n,ε), even when the valuation distributions are unbounded and nonidentical. We then give two lower bounds, which hold even for bounded i.i.d. distributions: one on the dependence on n when ε=1/n and n grows large, and the other on the dependence on ε when n is fixed and ε grows small. Finally, we apply these upper and lower bounds to derive results on the deterministic communication complexity required to run an auction that achieves such an approximation.

Based upon:
* The Menu-Size Complexity of Revenue Approximation, Moshe Babaioff, Y. A. G., and Noam Nisan, STOC 2017
* Bounding the Menu-Size of Approximately Optimal Auctions via Optimal-Transport Duality, Y. A. G., STOC 2018
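
The notion of menu size in the abstract above can be made concrete with a toy example. The sketch below is illustrative only (the two-entry menu and all numbers are made up, not taken from the papers): it models a menu as a list of (allocation probability, price) entries for a single item, with the buyer choosing the utility-maximizing entry.

```python
# A minimal sketch of a menu-based mechanism for one item and one buyer.
# A "menu" is a finite list of (allocation probability, price) entries;
# its length is the menu size. The buyer picks the entry maximizing
# expected utility q*value - price, with the option of buying nothing.

def buyer_choice(value, menu):
    """Return the menu entry (q, p) maximizing q*value - p, or None."""
    best, best_utility = None, 0.0   # buying nothing gives utility 0
    for q, p in menu:
        utility = q * value - p
        if utility > best_utility:
            best, best_utility = (q, p), utility
    return best

# A menu of size 2: a half-chance lottery and a deterministic sale.
menu = [(0.5, 1.0), (1.0, 3.0)]

for v in [1.5, 2.5, 4.5]:
    choice = buyer_choice(v, menu)
    revenue = choice[1] if choice else 0.0
    print(f"value={v}: picks {choice}, revenue={revenue}")
```

Low-value buyers walk away, mid-value buyers take the lottery, and high-value buyers buy outright; the menu-size question asks how many such entries are needed before revenue approaches the optimum.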

Speaker: Jinshuo Dong

Title: How private are private algorithms? – A privacy CLT

Abstract: It is important to understand the exact privacy guarantee provided by the private algorithms developed over the past prosperous decade of differential privacy. In particular, any underestimation of the actual privacy means unnecessary noise in the algorithm and loss of final accuracy. We observe a central limit behavior in iterative private algorithms, which demonstrates the limitations of the common $(\varepsilon,\delta)$ parametrization in most application scenarios. To the rescue comes a new notion called Gaussian Differential Privacy (GDP), for which we develop a complete toolkit. We carry out various experiments to show how much unnecessary loss of accuracy can be avoided in deep learning applications. Based on joint work with Aaron Roth, Weijie Su, Zhiqi Bu and Qi Long.
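
The central object here, Gaussian Differential Privacy, can be illustrated with the Gaussian mechanism. The sketch below is a minimal illustration, not the speaker's toolkit; it assumes the standard facts that adding $N(0,\sigma^2)$ noise to a query of sensitivity $\Delta$ gives $\mu$-GDP with $\mu = \Delta/\sigma$, and that $\mu$-GDP converts to a whole curve of $(\varepsilon,\delta(\varepsilon))$ guarantees via the standard normal CDF (as in the Dong-Roth-Su GDP paper).

```python
import math
import random

# Illustrative sketch of Gaussian Differential Privacy (GDP).
# The Gaussian mechanism releases a statistic with N(0, sigma^2) noise;
# for a query of L2-sensitivity `sens`, the mechanism is mu-GDP with
# mu = sens / sigma: attacking the output is no easier than telling
# N(0,1) apart from N(mu,1).

def gaussian_mechanism(true_answer, sigma):
    return true_answer + random.gauss(0.0, sigma)

def gdp_mu(sens, sigma):
    return sens / sigma

# mu-GDP implies (eps, delta(eps))-DP simultaneously for every eps, via
# the standard normal CDF Phi:
#   delta(eps) = Phi(-eps/mu + mu/2) - exp(eps) * Phi(-eps/mu - mu/2)
def delta_of_eps(eps, mu):
    Phi = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return Phi(-eps / mu + mu / 2) - math.exp(eps) * Phi(-eps / mu - mu / 2)

mu = gdp_mu(sens=1.0, sigma=2.0)   # a 0.5-GDP mechanism
print(mu, delta_of_eps(1.0, mu))
```

One number $\mu$ thus summarizes an entire family of $(\varepsilon,\delta)$ guarantees, which is the sense in which the single-parameter GDP curve is more informative than any one $(\varepsilon,\delta)$ pair.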

Speaker: Maryam Aliakbarpour

Title: Distribution testing and its new extensions

Abstract: One of the most fundamental problems in learning theory is to view input data as random samples from an unknown distribution and to make statistical inferences about the underlying distribution. In this talk, we focus on a notable example of such a statistical task: testing properties of distributions. The goal is to design an algorithm that uses as few samples as possible from a distribution and distinguishes whether the distribution has a given property or is $\epsilon$-far in $\ell_1$-distance from any distribution with that property. We explore several questions in the framework of distribution testing, such as: (i) Is the distribution uniform, or is it far from uniform? (ii) Is a pair of random variables independent or correlated? (iii) Is the distribution monotone? Moreover, we discuss extensions of the standard testing framework to more practical settings. For instance, we consider the case where the sensitivity of the input samples (e.g., patients' medical records) requires the design of statistical tests that ensure the privacy of individuals. We address this case by designing differentially private testing algorithms for several testing questions with (nearly) optimal sample complexities.
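
A classic instance of the uniformity question in (i) is the collision-based tester. The sketch below is illustrative (the threshold and sample sizes are chosen for the demo, not taken from the talk); it relies on the standard fact that samples from the uniform distribution over $[n]$ collide with rate about $1/n$, while any distribution $\epsilon$-far from uniform in $\ell_1$ has collision rate at least $(1+\epsilon^2)/n$.

```python
import random
from collections import Counter

# Illustrative collision-based uniformity tester. The empirical collision
# rate concentrates near 1/n for the uniform distribution over [n] and is
# at least (1 + eps^2)/n for any distribution eps-far from uniform, so a
# threshold between the two separates the cases.

def collision_rate(samples):
    m = len(samples)
    counts = Counter(samples)
    collisions = sum(c * (c - 1) // 2 for c in counts.values())
    return collisions / (m * (m - 1) // 2)

def looks_uniform(samples, n, eps):
    # Threshold midway between 1/n (uniform) and (1 + eps^2)/n (eps-far).
    return collision_rate(samples) <= (1 + eps**2 / 2) / n

random.seed(0)
n = 50
uniform_samples = [random.randrange(n) for _ in range(2000)]
skewed_samples = [random.randrange(n // 4) for _ in range(2000)]
print(looks_uniform(uniform_samples, n, eps=0.5))   # True
print(looks_uniform(skewed_samples, n, eps=0.5))    # False
```

The demo uses far more samples than necessary for clarity; the theoretical interest is that roughly $\sqrt{n}/\epsilon^2$ samples already suffice, far fewer than the $\Omega(n)$ needed to learn the distribution.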

Speaker: Marika Swanberg

Title: A look at history independence: challenges, solutions, and applications

Abstract: Implementations of data structures often unintentionally leak information that is not available via the standard interface: for example, the order in which elements were inserted, whether an element was deleted, and more. In this talk, I will introduce history independent data structures, which come with a formal guarantee limiting such leakage. I will then describe and analyze a specific history independent hashing scheme developed by Guy Blelloch and Daniel Golovin, including their interesting proof of history independence. Lastly, I will explain some of the core challenges of implementing efficient history independent data structures and the general solutions that I have observed in the literature. Time permitting, I will connect this to my larger research project of formalizing the EU data protection law known as “the right to be forgotten.”
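
The flavor of history independence can be seen in a toy example (illustrative only; this is not the Blelloch-Golovin hashing scheme from the talk): if a set always stores its elements in one canonical sorted layout, then its memory state is a function of its contents alone, and no insertion or deletion order can be recovered from it.

```python
import bisect

# Illustrative history independent set: elements are always kept in a
# single sorted list, so two sets holding the same elements have
# identical representations regardless of the sequence of inserts and
# deletes that produced them.

class CanonicalSet:
    def __init__(self):
        self._items = []          # always sorted: the canonical layout

    def insert(self, x):
        i = bisect.bisect_left(self._items, x)
        if i == len(self._items) or self._items[i] != x:
            self._items.insert(i, x)

    def delete(self, x):
        i = bisect.bisect_left(self._items, x)
        if i < len(self._items) and self._items[i] == x:
            self._items.pop(i)

    def state(self):
        return tuple(self._items)  # the observable memory representation

# Two different histories, same final contents -> identical states.
a = CanonicalSet()
for x in [3, 1, 2]:
    a.insert(x)

b = CanonicalSet()
for x in [2, 9, 1, 3]:
    b.insert(x)
b.delete(9)

print(a.state() == b.state())   # the deleted 9 leaves no trace
```

The hard part, and the subject of the talk, is achieving this canonical-layout property in efficient structures such as hash tables, where the natural implementations are highly order-dependent.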
