Diffusion models have recently shown remarkable progress, achieving state-of-the-art image generation quality. Like other high-fidelity generative models, diffusion models require a large amount of data and computing time for stable training, which hinders their application in limited-data settings. To overcome this issue, one can take a diffusion model pre-trained on a large-scale dataset and fine-tune it on a target dataset. Unfortunately, as we show empirically, this easily results in overfitting. In this paper, we propose a fine-tuning algorithm for diffusion models that trains efficiently and robustly in limited-data settings. We first show that fine-tuning only a small subset of the pre-trained parameters can learn the target dataset efficiently with much less overfitting. We then introduce a lightweight adapter module that can be attached to the pre-trained model with minimal overhead, and show that fine-tuning with our adapter module significantly improves image generation quality. We demonstrate the effectiveness of our method on various real-world image datasets.
Author Information
Taehong Moon (Korea Advanced Institute of Science and Technology)
Moonseok Choi (KAIST, Korea Advanced Institute of Science and Technology)
Gayoung Lee (NAVER)
Jung-Woo Ha (NAVER CLOVA AI Lab)

- Head, AI Innovation, NAVER Cloud
- Research Fellow, NAVER AI Lab
- Datasets and Benchmarks Co-Chair, NeurIPS 2023
- Socials Co-Chair, ICML 2023
- Socials Co-Chair, NeurIPS 2022
- BS, Seoul National University
- PhD, Seoul National University
Juho Lee (KAIST, AITRICS)
More from the Same Authors
- 2021: KLUE: Korean Language Understanding Evaluation
  Sungjoon Park · Jihyung Moon · Sungdong Kim · Won Ik Cho · Ji Yoon Han · Jangwon Park · Chisung Song · Junseong Kim · Youngsook Song · Taehwan Oh · Joohong Lee · Juhyun Oh · Sungwon Lyu · Younghoon Jeong · Inkwon Lee · Sangwoo Seo · Dongjun Lee · Hyunwoo Kim · Myeonghwa Lee · Seongbo Jang · Seungwon Do · Sunkyoung Kim · Kyungtae Lim · Jongwon Lee · Kyumin Park · Jamin Shin · Seonghyun Kim · Lucy Park · Alice Oh · Jung-Woo Ha · Kyunghyun Cho
- 2022 Poster: On Divergence Measures for Bayesian Pseudocoresets
  Balhae Kim · Jungwon Choi · Seanie Lee · Yoonho Lee · Jung-Woo Ha · Juho Lee
- 2022 Poster: Set-based Meta-Interpolation for Few-Task Meta-Learning
  Seanie Lee · Bruno Andreis · Kenji Kawaguchi · Juho Lee · Sung Ju Hwang
- 2021 Poster: Diversity Matters When Learning From Ensembles
  Giung Nam · Jongmin Yoon · Yoonho Lee · Juho Lee
- 2021 Poster: Metropolis-Hastings Data Augmentation for Graph Neural Networks
  Hyeonjin Park · Seunghun Lee · Sihyeon Kim · Jinyoung Park · Jisu Jeong · Kyung-Min Kim · Jung-Woo Ha · Hyunwoo Kim
- 2021 Poster: Mini-Batch Consistent Slot Set Encoder for Scalable Set Encoding
  Bruno Andreis · Jeffrey Willette · Juho Lee · Sung Ju Hwang
- 2021 Social: ML in Korea
  Jung-Woo Ha
- 2020 Poster: Self-supervised Auxiliary Learning with Meta-paths for Heterogeneous Graphs
  Dasol Hwang · Jinyoung Park · Sunyoung Kwon · KyungMin Kim · Jung-Woo Ha · Hyunwoo Kim
- 2020 Poster: Bootstrapping neural processes
  Juho Lee · Yoonho Lee · Jungtaek Kim · Eunho Yang · Sung Ju Hwang · Yee Whye Teh
- 2020 Social: NeurIPS 2020 Social ML in Korea
  Jung-Woo Ha
- 2020 Poster: Neural Complexity Measures
  Yoonho Lee · Juho Lee · Sung Ju Hwang · Eunho Yang · Seungjin Choi
- 2019: Coffee Break & Poster Session 2
  Juho Lee · Yoonho Lee · Yee Whye Teh · Raymond A. Yeh · Yuan-Ting Hu · Alex Schwing · Sara Ahmadian · Alessandro Epasto · Marina Knittel · Ravi Kumar · Mohammad Mahdian · Christian Bueno · Aditya Sanghi · Pradeep Kumar Jayaraman · Ignacio Arroyo-Fernández · Andrew Hryniowski · Vinayak Mathur · Sanjay Singh · Shahrzad Haddadan · Vasco Portilheiro · Luna Zhang · Mert Yuksekgonul · Jhosimar Arias Figueroa · Deepak Maurya · Balaraman Ravindran · Frank NIELSEN · Philip Pham · Justin Payan · Andrew McCallum · Jinesh Mehta · Ke SUN
- 2019: Contributed Talk - Towards deep amortized clustering
  Juho Lee · Yoonho Lee · Yee Whye Teh
- 2018 Poster: Uncertainty-Aware Attention for Reliable Interpretation and Prediction
  Jay Heo · Hae Beom Lee · Saehoon Kim · Juho Lee · Kwang Joon Kim · Eunho Yang · Sung Ju Hwang
- 2018 Poster: DropMax: Adaptive Variational Softmax
  Hae Beom Lee · Juho Lee · Saehoon Kim · Eunho Yang · Sung Ju Hwang
- 2017: Posters and Coffee
  Jean-Baptiste Tristan · Yunseong Lee · Anna Veronika Dorogush · Shohei Hido · Michael Terry · Mennatullah Siam · Hidemoto Nakada · Cody Coleman · Jung-Woo Ha · Hao Zhang · Adam Stooke · Chen Meng · Christopher Kappler · Lane Schwartz · Christopher Olston · Sebastian Schelter · Minmin Sun · Daniel Kang · Waldemar Hummer · Jichan Chung · Tim Kraska · Kannan Ramchandran · Nick Hynes · Christoph Boden · Donghyun Kwak
- 2017 Poster: Overcoming Catastrophic Forgetting by Incremental Moment Matching
  Sang-Woo Lee · Jin-Hwa Kim · Jaehyun Jun · Jung-Woo Ha · Byoung-Tak Zhang
- 2017 Spotlight: Overcoming Catastrophic Forgetting by Incremental Moment Matching
  Sang-Woo Lee · Jin-Hwa Kim · Jaehyun Jun · Jung-Woo Ha · Byoung-Tak Zhang
- 2016 Poster: Finite-Dimensional BFRY Priors and Variational Bayesian Inference for Power Law Models
  Juho Lee · Lancelot F James · Seungjin Choi
- 2015 Poster: Tree-Guided MCMC Inference for Normalized Random Measure Mixture Models
  Juho Lee · Seungjin Choi