The ratio of two probability densities, called a density-ratio, is a vital quantity in machine learning. In particular, a relative density-ratio, which is a bounded extension of the density-ratio, has received much attention due to its stability and has been used in various applications such as outlier detection and dataset comparison. Existing methods for (relative) density-ratio estimation (DRE) require many instances from both densities. However, sufficient instances are often unavailable in practice. In this paper, we propose a meta-learning method for relative DRE, which estimates the relative density-ratio from a few instances by using knowledge from related datasets. Specifically, given two datasets that consist of a few instances, our model extracts the datasets' information by using neural networks and uses it to obtain instance embeddings appropriate for relative DRE. We model the relative density-ratio by a linear model on the embedded space, whose globally optimal solution can be obtained in closed form. The closed-form solution enables fast and effective adaptation to a few instances, and its differentiability allows us to train our model such that the expected test error for relative DRE is explicitly minimized after adapting to a few instances. We empirically demonstrate the effectiveness of the proposed method on three problems: relative DRE, dataset comparison, and outlier detection.
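To make the closed-form step concrete, below is a minimal sketch (not the authors' code) of a RuLSIF-style closed-form solution for the relative density-ratio r_alpha(x) = p(x) / (alpha p(x) + (1 - alpha) q(x)), computed as a linear model on pre-computed instance embeddings. The embedding function `embed`, the function name `fit_relative_dre`, and the values of `alpha` and `lam` are illustrative assumptions; the paper's method learns the embeddings with neural networks trained by meta-learning.

```python
# Sketch only: closed-form relative density-ratio estimation on embeddings.
import numpy as np

def fit_relative_dre(phi_p, phi_q, alpha=0.5, lam=1e-3):
    """Return weights w so that r(x) ~= w @ phi(x) approximates
    p(x) / (alpha * p(x) + (1 - alpha) * q(x)).

    phi_p : (n_p, d) embeddings of instances drawn from p
    phi_q : (n_q, d) embeddings of instances drawn from q
    """
    d = phi_p.shape[1]
    # H = alpha * E_p[phi phi^T] + (1 - alpha) * E_q[phi phi^T]
    H = (alpha * phi_p.T @ phi_p / len(phi_p)
         + (1 - alpha) * phi_q.T @ phi_q / len(phi_q))
    # h = E_p[phi]
    h = phi_p.mean(axis=0)
    # Ridge-regularized closed-form (and differentiable) solution:
    # w = (H + lam * I)^{-1} h
    return np.linalg.solve(H + lam * np.eye(d), h)

# Toy usage with a fixed random-feature embedding as a stand-in for the
# learned neural-network embedding.
rng = np.random.default_rng(0)
W = rng.normal(size=(2, 16))
embed = lambda X: np.cos(X @ W)           # hypothetical embedding function
Xp = rng.normal(0.0, 1.0, size=(20, 2))   # a few instances from p
Xq = rng.normal(0.5, 1.0, size=(20, 2))   # a few instances from q
w = fit_relative_dre(embed(Xp), embed(Xq), alpha=0.5)
ratios = embed(Xp) @ w                    # estimated relative density-ratios
```

Because the solution is a differentiable function of the embeddings, the squared error of the resulting ratio estimates can be backpropagated through this step when meta-training the embedding network.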
Author Information
Atsutoshi Kumagai (NTT)
Tomoharu Iwata (NTT)
Yasuhiro Fujiwara (NTT Software Innovation Center)
More from the Same Authors
- 2022 Poster: Symplectic Spectrum Gaussian Processes: Learning Hamiltonians from Noisy and Sparse Data
  Yusuke Tanaka · Tomoharu Iwata · Naonori Ueda
- 2022 Poster: Few-shot Learning for Feature Selection with Hilbert-Schmidt Independence Criterion
  Atsutoshi Kumagai · Tomoharu Iwata · Yasutoshi Ida · Yasuhiro Fujiwara
- 2022 Poster: Meta-ticket: Finding optimal subnetworks for few-shot learning within randomly initialized neural networks
  Daiki Chijiwa · Shin'ya Yamaguchi · Atsutoshi Kumagai · Yasutoshi Ida
- 2022 Poster: Sharing Knowledge for Meta-learning with Feature Descriptions
  Tomoharu Iwata · Atsutoshi Kumagai
- 2021 Poster: Permuton-induced Chinese Restaurant Process
  Masahiro Nakano · Yasuhiro Fujiwara · Akisato Kimura · Takeshi Yamada · Naonori Ueda
- 2021 Poster: Loss function based second-order Jensen inequality and its application to particle variational inference
  Futoshi Futami · Tomoharu Iwata · Naonori Ueda · Issei Sato · Masashi Sugiyama
- 2019 Poster: Transfer Anomaly Detection by Inferring Latent Domain Representations
  Atsutoshi Kumagai · Tomoharu Iwata · Yasuhiro Fujiwara
- 2019 Poster: Spatially Aggregated Gaussian Processes with Multivariate Areal Outputs
  Yusuke Tanaka · Toshiyuki Tanaka · Tomoharu Iwata · Takeshi Kurashima · Maya Okawa · Yasunori Akagi · Hiroyuki Toda
- 2018 Poster: Sigsoftmax: Reanalysis of the Softmax Bottleneck
  Sekitoshi Kanai · Yasuhiro Fujiwara · Yuki Yamanaka · Shuichi Adachi
- 2017 Poster: Preventing Gradient Explosions in Gated Recurrent Units
  Sekitoshi Kanai · Yasuhiro Fujiwara · Sotetsu Iwamura
- 2016 Poster: Multi-view Anomaly Detection via Robust Probabilistic Latent Variable Models
  Tomoharu Iwata · Makoto Yamada
- 2015 Poster: Cross-Domain Matching for Bag-of-Words Data via Kernel Embeddings of Latent Distributions
  Yuya Yoshikawa · Tomoharu Iwata · Hiroshi Sawada · Takeshi Yamada
- 2014 Poster: Latent Support Measure Machines for Bag-of-Words Data Classification
  Yuya Yoshikawa · Tomoharu Iwata · Hiroshi Sawada