Poster
Replicability in Learning: Geometric Partitions and KKM-Sperner Lemma
Jason Vander Woude · Peter Dixon · A. Pavan · Jamie Radcliffe · N. V. Vinodchandran
West Ballroom A-D #5501
Abstract:
This paper studies replicability in machine learning tasks from a geometric viewpoint. Recent works have revealed the role of geometric partitions and Sperner's lemma (and its variations) in designing replicable learning algorithms and in establishing impossibility results. A partition $\mathcal{P}$ of $\mathbb{R}^d$ is called a $(k, \varepsilon)$-secluded partition if for every point $\vec{p} \in \mathbb{R}^d$, an $\varepsilon$-radius ball (with respect to the $\ell_\infty$ norm) centered at $\vec{p}$ intersects at most $k$ members of $\mathcal{P}$. In relation to replicable learning, the parameter $k$ is closely related to the list complexity, and the parameter $\varepsilon$ is related to the sample complexity of the replicable learner. Constructions of secluded partitions with better parameters (small $k$ and large $\varepsilon$) will lead to replicable learning algorithms with small list and sample complexities. Motivated by this connection, we undertake a comprehensive study of secluded partitions and establish near-optimal relationships between $k$ and $\varepsilon$.

1. We show that for any $(k, \varepsilon)$-secluded partition where each member has at most unit measure, it must be that $k \geq (1 + 2\varepsilon)^d$, and consequently, for the interesting regime $k \leq 2^d$, it must be that $\varepsilon = O\!\left(\frac{\ln k}{d}\right)$.

2. To complement this upper bound on $\varepsilon$, we show that for each $d \in \mathbb{N}$ and each viable $k$, there is a construction of a $(k, \varepsilon)$-secluded (unit cube) partition with $\varepsilon = \tilde{\Omega}\!\left(\frac{\ln k}{d}\right)$. This establishes the optimality of $\varepsilon$ within a logarithmic factor.

3. Finally, we adapt our proof techniques to obtain a new "neighborhood" variant of the cubical KKM lemma (or cubical Sperner's lemma): for any coloring of $[0,1]^d$ in which no color is used on opposing faces, it holds for each $\varepsilon \in (0, \tfrac{1}{2}]$ that there is a point where the open $\varepsilon$-radius $\ell_\infty$-ball intersects at least $2^{\Omega(\varepsilon d)}$ colors. While the classical Sperner/KKM lemma guarantees the existence of a point that is "adjacent" to $d+1$ points with distinct colors, the neighborhood version guarantees the existence of a small neighborhood containing points of exponentially many distinct colors.
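To make the definition concrete, here is a small illustrative Python sketch (an illustration added here, not taken from the paper): it counts how many cells of the trivial partition of $\mathbb{R}^d$ into axis-aligned unit grid cubes an open $\ell_\infty$ ball of radius $\varepsilon < \tfrac{1}{2}$ intersects. Near a lattice corner the ball meets $2^d$ cells, and no ball with $\varepsilon < \tfrac{1}{2}$ can meet more, so the integer grid is only a $(2^d, \varepsilon)$-secluded partition; the results above concern how much smaller $k$ can be made while keeping $\varepsilon$ large.

    import itertools, math

    def grid_cells_hit(point, eps):
        # Count the axis-aligned unit-grid cells [n_1, n_1+1) x ... x [n_d, n_d+1)
        # that the open l_inf ball of radius eps around `point` intersects.
        per_coord = [range(math.floor(x - eps), math.ceil(x + eps)) for x in point]
        return len(list(itertools.product(*per_coord)))

    d, eps = 4, 0.1
    print(grid_cells_hit((0.0,) * d, eps))  # 16 = 2**d: ball centered at a lattice corner (worst case)
    print(grid_cells_hit((0.5,) * d, eps))  # 1: ball deep inside a single cell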