Hi all,
This week (Wed, Jan 28, at 12:00) we will meet for our theory seminar, for the last time this semester.
Location: Building 503, room 226.
Looking forward to seeing you all there,
Speaker: Eliad Tsfadia (BIU)
Title: Differentially Private Quasi-Concave Optimization: Bypassing the Lower Bound and Application to Geometric Problems.
Abstract: We study the sample complexity of differentially private optimization of quasi-concave functions. For a fixed input domain $X$, Cohen et al. (STOC 2023) proved that any generic private optimizer for low-sensitivity quasi-concave functions must have sample complexity $\Omega(2^{\log^* |X|})$.
We show that this lower bound can be bypassed for a family of "natural" problems.
We define a new class of approximated quasi-concave functions, and present a generic differentially private optimizer for this class with sample complexity $\tilde{O}(\log^* |X|)$.
As applications, we use our optimizer to privately select a center point of a set of points in $d$ dimensions and to PAC learn $d$-dimensional halfspaces. In previous work, Bun et al. (FOCS 2015) proved a lower bound of $\Omega(\log^* |X|)$ for both problems, while Beimel et al. (COLT 2019) and Kaplan et al. (NeurIPS 2020) gave upper bounds of $\tilde{O}(d^{2.5} \cdot 2^{\log^* |X|})$ for the two problems, respectively. We improve the dependency of the upper bounds on the cardinality of the domain by presenting a new upper bound of $\tilde{O}(d^{5.5} \log^* |X|)$ for both problems. To the best of our knowledge, this is the first work to reduce the sample complexity dependency on $|X|$ for these two problems from exponential in $\log^* |X|$ to linear in $\log^* |X|$.