Hi all,
Next week (Wednesday, Dec 10, at 12:00) we will meet for our theory seminar.
Location: Building 503, room 226.
Looking forward to seeing you all there,
Speaker: Shay Sapir (Weizmann)
Title: Dimension Reduction for Clustering: The Curious Case of Discrete Centers
Abstract: The Johnson-Lindenstrauss transform is a fundamental method for dimension reduction in Euclidean spaces: it can map any dataset of $n$ points into dimension $O(\log n)$ with low distortion of their pairwise distances. This dimension bound is tight in general, but one can bypass it for specific problems. Indeed, tremendous progress has been made for clustering problems, especially in the \emph{continuous} setting, where centers can be picked from the ambient space $\mathbb{R}^d$. Most notably, for $k$-median and $k$-means, the dimension bound was improved to $O(\log k)$ [Makarychev, Makarychev and Razenshteyn, STOC 2019].
We explore dimension reduction for clustering in the \emph{discrete} setting, where centers can only be picked from the dataset, and present two results, both parameterized by the doubling dimension of the dataset, denoted $\operatorname{ddim}$. Our first result shows that dimension $O_{\epsilon}(\operatorname{ddim} + \log k + \log\log n)$ suffices to guarantee that the cost of every set of centers is preserved within a factor of $1\pm\epsilon$, and that this bound is moreover tight. Our second result eliminates the $\log\log n$ term in the dimension by relaxing the guarantee to preserve the cost only of all approximately-optimal sets of centers; this relaxed guarantee remains useful for downstream applications.
Overall, we achieve strong dimension reduction in the discrete setting, and find that it differs from the continuous setting not only in the dimension bound, which depends on the doubling dimension, but also in the guarantees beyond preserving the optimal value, such as which clusterings are preserved.
Joint work with Shaofeng Jiang, Robert Krauthgamer, Sandeep Silwal, and Di Yue.
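
P.S. For anyone who would like a concrete picture of the Johnson-Lindenstrauss projection mentioned at the start of the abstract, here is a minimal numerical sketch in Python/NumPy. The target dimension k = 8 ln(n) / eps^2 and the toy parameters below are illustrative assumptions on my part, not taken from the talk.

import numpy as np

# Minimal sketch: project n points from R^d down to k = O(log n / eps^2)
# dimensions with a scaled Gaussian matrix; with high probability all
# pairwise distances are preserved within a factor of 1 +/- eps.
rng = np.random.default_rng(0)
n, d, eps = 1000, 10_000, 0.25
k = int(np.ceil(8 * np.log(n) / eps**2))   # constant 8 is illustrative

X = rng.standard_normal((n, d))            # toy dataset
G = rng.standard_normal((d, k)) / np.sqrt(k)
Y = X @ G                                  # projected points in R^k

# Spot-check the distortion on a few random pairs of distinct points.
i, j = rng.integers(0, n, 300), rng.integers(0, n, 300)
mask = i != j
orig = np.linalg.norm(X[i[mask]] - X[j[mask]], axis=1)
proj = np.linalg.norm(Y[i[mask]] - Y[j[mask]], axis=1)
print("distortion range:", (proj / orig).min(), (proj / orig).max())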