
Publication details

2024, Advances in Neural Information Processing Systems 36 (NeurIPS 2023)

On Generalization Bounds for Projective Clustering (04b Conference paper in proceedings)

Maria Sofia Bucarelli, Matilde Larsen, Chris Schwiegelshohn, Mads Toftrup

Given a set of points, clustering consists of partitioning the point set into k clusters so that each point is as close as possible to the center to which it is assigned. Most commonly, the centers are themselves points, which leads to the famous k-median and k-means objectives. One may also choose the centers to be j-dimensional subspaces, which gives rise to subspace clustering. In this paper, we consider learning bounds for these problems. That is, given a set of n samples P drawn independently from some unknown but fixed distribution D, how quickly does a solution computed on P converge to the optimal clustering of D? We give several near-optimal results. In particular:

1. For center-based objectives, we show a convergence rate of Õ(√(k/n)). This matches the known optimal bounds of [Fefferman, Mitter, and Narayanan, Journal of the American Mathematical Society 2016] and [Bartlett, Linder, and Lugosi, IEEE Trans. Inf. Theory 1998] for k-means and extends them to other important objectives such as k-median.

2. For subspace clustering with j-dimensional subspaces, we show a convergence rate of Õ(√(kj²/n)).

These are the first provable bounds for most of these problems. For the specific case of projective clustering, which generalizes k-means, we show that a convergence rate of Ω(√(kj/n)) is necessary, thereby proving that the bounds from [Fefferman, Mitter, and Narayanan, Journal of the American Mathematical Society 2016] are essentially optimal.
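To make the objectives precise, the following is one standard formulation (the notation here is ours, not quoted from the paper; it assumes Euclidean distances, with the exponent z = 1 giving k-median and z = 2 giving k-means):

% Center-based clustering: a solution is a set of k centers C = {c_1, ..., c_k}.
\operatorname{cost}_z(D, C) = \mathbb{E}_{x \sim D}\Big[ \min_{c \in C} \lVert x - c \rVert^{z} \Big]

% Subspace (projective) clustering: a solution is a set of k subspaces
% U_1, ..., U_k, each of dimension j; d(x, U_i) denotes the Euclidean
% distance from x to the subspace U_i.
\operatorname{cost}_z(D, \{U_i\}) = \mathbb{E}_{x \sim D}\Big[ \min_{1 \le i \le k} d(x, U_i)^{z} \Big]

% The learning question: with P = {x_1, ..., x_n} drawn i.i.d. from D, how
% fast does the empirical cost converge to the true cost, uniformly over
% all candidate solutions? The paper's first result bounds this gap:
\sup_{C} \left| \frac{1}{n} \sum_{p \in P} \min_{c \in C} \lVert p - c \rVert^{z} - \operatorname{cost}_z(D, C) \right| = \tilde{O}\big(\sqrt{k/n}\big)

Under this reading, the Õ(√(kj²/n)) rate for subspace clustering bounds the analogous gap with the subspaces {U_i} in place of the center set C.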
Research group: Theory of Deep Learning