Clustering points in metric spaces is a long-studied area of research. Clustering has seen a multitude of work both theoretically, in understanding the approximation guarantees achievable for objective functions such as k-median and k-means, and experimentally, in finding fast algorithms and seeding procedures for Lloyd's algorithm. The performance of a given clustering algorithm depends on the specific application at hand, which may not be known up front: a "typical instance" varies from one application to another, and different clustering heuristics perform differently depending on the instance.
In this paper, we define an infinite family of algorithms generalizing Lloyd's algorithm, with one parameter controlling the initialization procedure and another parameter controlling the local search procedure. This family includes the celebrated k-means++ algorithm as well as the classic farthest-first traversal algorithm. We design efficient learning algorithms that receive samples from an application-specific distribution over clustering instances and learn a near-optimal clustering algorithm from the class. We show that the best parameters vary significantly across datasets such as MNIST, CIFAR, and mixtures of Gaussians. Our learned algorithms never perform worse than k-means++, and on some datasets we see significant improvements.
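The initialization side of such a parameterized family can be sketched with a single exponent: draw each new center with probability proportional to d(x, C)^α, the distance from x to its nearest chosen center raised to the α. Setting α = 2 recovers k-means++ (D² sampling), while α → ∞ recovers farthest-first traversal. The function name and code below are an illustrative sketch of this interpolation, not the paper's implementation; the exact parameterization used in the paper may differ.

```python
import numpy as np

def alpha_seeding(X, k, alpha, rng=None):
    """Pick k seed centers from the rows of X.

    Each new center is sampled with probability proportional to
    d(x, C)**alpha, where d(x, C) is the distance from x to the
    nearest already-chosen center.

    alpha = 2      -> k-means++ (D^2) seeding
    alpha = np.inf -> farthest-first traversal (deterministic argmax)
    """
    rng = np.random.default_rng(rng)
    n = X.shape[0]
    centers = [X[rng.integers(n)]]  # first center uniformly at random
    d = np.linalg.norm(X - centers[0], axis=1)
    for _ in range(k - 1):
        if np.isinf(alpha):
            # limiting case: always take the farthest point
            idx = int(np.argmax(d))
        else:
            w = d ** alpha
            idx = rng.choice(n, p=w / w.sum())
        centers.append(X[idx])
        # update each point's distance to its nearest center
        d = np.minimum(d, np.linalg.norm(X - centers[-1], axis=1))
    return np.array(centers)
```

Intermediate values of α trade off between the two extremes: smaller α tolerates outliers better (it is less likely to seed on them), while larger α spreads the seeds more aggressively, which is one intuition for why the best setting is dataset-dependent.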
Author Information
Maria-Florina Balcan (Carnegie Mellon University)
Travis Dick (Carnegie Mellon University)
Colin White (Carnegie Mellon University)
Related Events (a corresponding poster, oral, or spotlight)
- 2018 Spotlight: Data-Driven Clustering via Parameterized Lloyd's Families »
  Tue Dec 4th 09:50 -- 09:55 PM, Room 517 CD
More from the Same Authors
- 2020 Poster: A Study on Encodings for Neural Architecture Search »
  Colin White · Willie Neiswanger · Sam Nolen · Yash Savani
- 2020 Spotlight: A Study on Encodings for Neural Architecture Search »
  Colin White · Willie Neiswanger · Sam Nolen · Yash Savani
- 2020 Poster: Intra-Processing Methods for Debiasing Neural Networks »
  Yash Savani · Colin White · Naveen Sundar Govindarajulu
- 2019 Poster: Envy-Free Classification »
  Maria-Florina Balcan · Travis Dick · Ritesh Noothigattu · Ariel Procaccia
- 2019 Poster: Adaptive Gradient-Based Meta-Learning Methods »
  Mikhail Khodak · Maria-Florina Balcan · Ameet Talwalkar
- 2017 Poster: Sample and Computationally Efficient Learning Algorithms under S-Concave Distributions »
  Maria-Florina Balcan · Hongyang Zhang
- 2016 Poster: Noise-Tolerant Life-Long Matrix Completion via Adaptive Sampling »
  Maria-Florina Balcan · Hongyang Zhang
- 2016 Poster: Sample Complexity of Automated Mechanism Design »
  Maria-Florina Balcan · Tuomas Sandholm · Ellen Vitercik