We propose a general method to construct a centroid approximation for the distribution of the maximum points of a random function (a.k.a. the argmax distribution), which finds broad applications in machine learning. Our method optimizes a set of centroid points to compactly approximate the argmax distribution with a simple objective function, without explicitly drawing exact samples from the argmax distribution. Theoretically, the argmax centroid method can be shown to minimize a surrogate of the Wasserstein distance between the ground-truth argmax distribution and the centroid approximation under proper conditions. We demonstrate the applicability and effectiveness of our method on a variety of real-world multi-task learning applications, including few-shot image classification, personalized dialogue systems, and multi-target domain adaptation.
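To make the idea concrete, here is a toy sketch (not the paper's actual objective; the setup, step size, and update rule are illustrative assumptions). We take a random function f_xi(x) = -(x - xi)^2 whose argmax is xi itself, draw xi from a bimodal mixture so the argmax distribution has two modes, and update a small set of centroids by winner-take-all gradient ascent on E_xi[max_i f_xi(theta_i)], a K-means-like rule in which each sampled function pulls only its best-responding centroid:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_xi(n):
    # Bimodal mixture: argmax distribution has modes near -2 and +2.
    comp = rng.integers(0, 2, size=n)
    return np.where(comp == 0, rng.normal(-2.0, 0.3, n), rng.normal(2.0, 0.3, n))

def f(theta, xi):
    # f[i, j] = f_{xi_j}(theta_i) = -(theta_i - xi_j)^2
    return -(theta[:, None] - xi[None, :]) ** 2

theta = rng.normal(0.0, 0.1, size=4)  # centroids, initialized near zero
for _ in range(500):
    xi = sample_xi(64)
    vals = f(theta, xi)             # shape (num_centroids, batch)
    best = np.argmax(vals, axis=0)  # winning centroid for each sampled xi
    grad = np.zeros_like(theta)
    for i in range(len(theta)):
        mask = best == i
        if mask.any():
            # Gradient of -(theta_i - xi)^2 w.r.t. theta_i on the samples it wins.
            grad[i] = np.mean(2.0 * (xi[mask] - theta[i]))
    theta += 0.1 * grad

print(np.sort(theta))
```

After training, the extreme centroids settle near the two modes of the argmax distribution (about -2 and +2), illustrating how a few optimized points can compactly cover a multimodal argmax distribution without ever sampling the argmax exactly.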
Author Information
Chengyue Gong (Peking University)
Mao Ye (The University of Texas at Austin)
Qiang Liu (Dartmouth College)
More from the Same Authors
- 2021 Spotlight: Profiling Pareto Front With Multi-Objective Stein Variational Gradient Descent
  Xingchao Liu · Xin Tong · Qiang Liu
- 2022: BOME! Bilevel Optimization Made Easy: A Simple First-Order Approach
  Mao Ye · Bo Liu · Stephen Wright · Peter Stone · Qiang Liu
- 2022: Diffusion-based Molecule Generation with Informative Prior Bridges
  Chengyue Gong · Lemeng Wu · Xingchao Liu · Mao Ye · Qiang Liu
- 2022: HotProtein: A Novel Framework for Protein Thermostability Prediction and Editing
  Tianlong Chen · Chengyue Gong · Daniel Diaz · Xuxi Chen · Jordan Wells · Qiang Liu · Zhangyang Wang · Andrew Ellington · Alex Dimakis · Adam Klivans
- 2022: First hitting diffusion models
  Mao Ye · Lemeng Wu · Qiang Liu
- 2022: Neural Volumetric Mesh Generator
  Yan Zheng · Lemeng Wu · Xingchao Liu · Zhen Chen · Qiang Liu · Qixing Huang
- 2022: Flow Straight and Fast: Learning to Generate and Transfer Data with Rectified Flow
  Xingchao Liu · Chengyue Gong · Qiang Liu
- 2022: Let us Build Bridges: Understanding and Extending Diffusion Generative Models
  Xingchao Liu · Lemeng Wu · Mao Ye · Qiang Liu
- 2022 Poster: First Hitting Diffusion Models for Generating Manifold, Graph and Categorical Data
  Mao Ye · Lemeng Wu · Qiang Liu
- 2022 Poster: Sampling in Constrained Domains with Orthogonal-Space Variational Gradient Descent
  Ruqi Zhang · Qiang Liu · Xin Tong
- 2022 Poster: BOME! Bilevel Optimization Made Easy: A Simple First-Order Approach
  Bo Liu · Mao Ye · Stephen Wright · Peter Stone · Qiang Liu
- 2022 Poster: Diffusion-based Molecule Generation with Informative Prior Bridges
  Lemeng Wu · Chengyue Gong · Xingchao Liu · Mao Ye · Qiang Liu
- 2021 Poster: Conflict-Averse Gradient Descent for Multi-task Learning
  Bo Liu · Xingchao Liu · Xiaojie Jin · Peter Stone · Qiang Liu
- 2021 Poster: Sampling with Trustworthy Constraints: A Variational Gradient Framework
  Xingchao Liu · Xin Tong · Qiang Liu
- 2021 Poster: Automatic and Harmless Regularization with Constrained and Lexicographic Optimization: A Dynamic Barrier Approach
  Chengyue Gong · Xingchao Liu · Qiang Liu
- 2021 Poster: Profiling Pareto Front With Multi-Objective Stein Variational Gradient Descent
  Xingchao Liu · Xin Tong · Qiang Liu
- 2020 Poster: Implicit Regularization and Convergence for Weight Normalization
  Xiaoxia Wu · Edgar Dobriban · Tongzheng Ren · Shanshan Wu · Zhiyuan Li · Suriya Gunasekar · Rachel Ward · Qiang Liu
- 2020 Poster: Stein Self-Repulsive Dynamics: Benefits From Past Samples
  Mao Ye · Tongzheng Ren · Qiang Liu
- 2020 Poster: Black-Box Certification with Randomized Smoothing: A Functional Optimization Based Framework
  Dinghuai Zhang · Mao Ye · Chengyue Gong · Zhanxing Zhu · Qiang Liu
- 2020 Poster: Greedy Optimization Provably Wins the Lottery: Logarithmic Number of Winning Tickets is Enough
  Mao Ye · Lemeng Wu · Qiang Liu
- 2018 Poster: FRAGE: Frequency-Agnostic Word Representation
  Chengyue Gong · Di He · Xu Tan · Tao Qin · Liwei Wang · Tie-Yan Liu
- 2017 Poster: Deep Dynamic Poisson Factorization Model
  Chengyue Gong · Win-bin Huang