Poster
Learning with Target Prior
Zuoguan Wang · Siwei Lyu · Gerwin Schalk · Qiang Ji
Harrah’s Special Events Center 2nd Floor
Abstract:
In conventional approaches to supervised parametric learning, the relation between data and target variables is provided through a training set consisting of pairs of corresponded data and target variables. In this work, we describe a new scheme for parametric learning in which the target variables $\mathbf{y}$ are modeled with a prior model $p(\mathbf{y})$, and the relation between data and target variables is estimated from $p(\mathbf{y})$ and a set of uncorresponded data $\mathbf{x}$ during training. We term this method learning with target priors (LTP). Specifically, LTP seeks the parameter $\theta$ that maximizes the log likelihood of $f_\theta(\mathbf{x})$ on an uncorresponded training set with respect to $p(\mathbf{y})$. Compared to conventional (semi-)supervised learning approaches, LTP makes efficient use of prior knowledge of the target variables in the form of probabilistic distributions, and thus removes or reduces the reliance on training data in learning. Compared to the Bayesian approach, the learned parametric regressor in LTP can be more efficiently implemented and deployed in tasks where running efficiency is critical, such as online BCI signal decoding. We demonstrate the effectiveness of the proposed approach on parametric regression tasks for BCI signal decoding and pose estimation from video.
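The following is a minimal NumPy sketch of the LTP objective described above, not the authors' implementation: it assumes a linear regressor $f_\theta(\mathbf{x}) = \theta^\top \mathbf{x}$ and a univariate Gaussian target prior $p(y)$, and ascends the mean log prior of the predictions on an uncorresponded input set by gradient ascent. The names `mu`, `sigma`, `lr`, and the data themselves are illustrative assumptions.

```python
import numpy as np

# Sketch of learning with target priors (LTP) for a linear regressor
# f_theta(x) = theta^T x with a Gaussian target prior p(y) = N(mu, sigma^2).
# All concrete values (mu, sigma, lr, data) are illustrative, not from the paper.

rng = np.random.default_rng(0)

# Uncorresponded inputs x: no paired targets y are observed.
X = rng.normal(size=(200, 5))

# Assumed-known target prior p(y).
mu, sigma = 1.0, 0.5

def log_prior(y):
    """Log density of the Gaussian target prior evaluated at predictions y."""
    return -0.5 * ((y - mu) / sigma) ** 2 - np.log(sigma * np.sqrt(2 * np.pi))

theta = np.zeros(X.shape[1])
lr = 0.05

for _ in range(500):
    y_hat = X @ theta                          # f_theta(x) on the uncorresponded set
    # Gradient of the mean log prior w.r.t. theta, via the chain rule through f_theta.
    grad = X.T @ (-(y_hat - mu) / sigma**2) / len(X)
    theta += lr * grad                         # ascend the LTP objective

print("learned theta:", theta)
print("mean log-prior of predictions:", log_prior(X @ theta).mean())
```

In this toy setting the objective simply pulls the predicted targets toward regions where $p(y)$ places high density; with richer priors (e.g., dynamical or structured models of $\mathbf{y}$) the same maximum-log-likelihood criterion constrains $\theta$ without any corresponded training pairs.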