We propose a few-shot learning method for feature selection that selects relevant features from only a small number of labeled instances. Existing methods require many labeled instances for accurate feature selection, but sufficient instances are often unavailable. We use labeled instances from multiple related tasks to alleviate the lack of labeled instances in a target task. To measure the dependency between each feature and the label, we use the Hilbert-Schmidt Independence Criterion (HSIC), a kernel-based independence measure. By modeling the kernel functions with neural networks that take a few labeled instances of a task as input, we encode task-specific information into the kernels so that they are appropriate for the task. Feature selection with such kernels is performed by an iterative optimization method in which each update step is obtained in closed form. This formulation enables us to directly and efficiently minimize the expected test error on features selected using only a small number of labeled instances. We experimentally demonstrate that the proposed method outperforms existing feature selection methods.
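The dependency criterion at the heart of the method can be illustrated with a short sketch. The code below is a minimal, generic implementation of the standard biased empirical HSIC estimator with fixed Gaussian kernels, used to rank individual features against the label; it is not the authors' meta-learned, neural-network-parameterized kernel model. The names rbf_kernel, hsic, and hsic_feature_scores and the bandwidth gamma are illustrative assumptions, not part of the paper.

import numpy as np

def rbf_kernel(x, gamma=1.0):
    # Gaussian kernel matrix; x is an (n,) or (n, d) array of samples.
    x = np.asarray(x, dtype=float)
    if x.ndim == 1:
        x = x[:, None]
    sq_dists = np.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)
    return np.exp(-gamma * sq_dists)

def hsic(K, L):
    # Biased empirical HSIC estimator: tr(K H L H) / (n - 1)^2,
    # where H = I - (1/n) 11^T is the centering matrix.
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

def hsic_feature_scores(X, y, gamma=1.0):
    # Dependency score between each feature column of X and the label y.
    L = rbf_kernel(y, gamma)
    return np.array([hsic(rbf_kernel(X[:, d], gamma), L)
                     for d in range(X.shape[1])])

# Usage: rank the features of a small labeled set by their HSIC score.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 5))
y = X[:, 2] + 0.1 * rng.normal(size=20)             # label depends mainly on feature 2
print(np.argsort(hsic_feature_scores(X, y))[::-1])  # feature 2 should rank first

In the paper's setting, the kernels themselves are produced by neural networks conditioned on the few labeled instances of a task, and the selection is trained to minimize the expected test error across related tasks; the sketch above only shows the underlying dependency criterion.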
Author Information
Atsutoshi Kumagai (NTT)
Tomoharu Iwata (NTT)
Yasutoshi Ida (NTT)
Yasuhiro Fujiwara (NTT Communication Science Laboratories)
More from the Same Authors
- 2022 Poster: Symplectic Spectrum Gaussian Processes: Learning Hamiltonians from Noisy and Sparse Data
  Yusuke Tanaka · Tomoharu Iwata · Naonori Ueda
- 2022 Poster: Meta-ticket: Finding optimal subnetworks for few-shot learning within randomly initialized neural networks
  Daiki Chijiwa · Shin'ya Yamaguchi · Atsutoshi Kumagai · Yasutoshi Ida
- 2022 Poster: Sharing Knowledge for Meta-learning with Feature Descriptions
  Tomoharu Iwata · Atsutoshi Kumagai
- 2021 Poster: Meta-Learning for Relative Density-Ratio Estimation
  Atsutoshi Kumagai · Tomoharu Iwata · Yasuhiro Fujiwara
- 2021 Poster: Loss function based second-order Jensen inequality and its application to particle variational inference
  Futoshi Futami · Tomoharu Iwata · Naonori Ueda · Issei Sato · Masashi Sugiyama
- 2019 Poster: Fast Sparse Group Lasso
  Yasutoshi Ida · Yasuhiro Fujiwara · Hisashi Kashima
- 2019 Poster: Transfer Anomaly Detection by Inferring Latent Domain Representations
  Atsutoshi Kumagai · Tomoharu Iwata · Yasuhiro Fujiwara
- 2019 Poster: Spatially Aggregated Gaussian Processes with Multivariate Areal Outputs
  Yusuke Tanaka · Toshiyuki Tanaka · Tomoharu Iwata · Takeshi Kurashima · Maya Okawa · Yasunori Akagi · Hiroyuki Toda
- 2016 Poster: Multi-view Anomaly Detection via Robust Probabilistic Latent Variable Models
  Tomoharu Iwata · Makoto Yamada
- 2015 Poster: Cross-Domain Matching for Bag-of-Words Data via Kernel Embeddings of Latent Distributions
  Yuya Yoshikawa · Tomoharu Iwata · Hiroshi Sawada · Takeshi Yamada
- 2014 Poster: Latent Support Measure Machines for Bag-of-Words Data Classification
  Yuya Yoshikawa · Tomoharu Iwata · Hiroshi Sawada