A common assumption in many domains is that high dimensional data are a smooth nonlinear function of a small number of independent factors. When is it possible to recover the factors from unlabeled data? In the context of deep models this problem is called “disentanglement” and was recently shown to be impossible without additional strong assumptions [17, 19]. In this paper, we show that the assumption of local isometry, together with non-Gaussianity of the factors, is sufficient to provably recover disentangled representations from data. We leverage recent advances in deep generative models to construct manifolds of highly realistic images for which the ground truth latent representation is known, and test whether modern and classical methods succeed in recovering the latent factors. For many different manifolds, we find that a spectral method that explicitly optimizes local isometry and non-Gaussianity consistently finds the correct latent factors, while baseline deep autoencoders do not. We propose how to encourage deep autoencoders to find encodings that satisfy local isometry and show that this helps them discover disentangled representations. Overall, our results suggest that in some realistic settings, unsupervised disentanglement is provably possible, without any domain-specific assumptions.
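The abstract mentions encouraging deep autoencoders toward locally isometric encodings. As an illustration only, the following is a minimal PyTorch sketch of one way such a penalty could look: preserving pairwise distances between inputs and their latent codes within a mini-batch as a crude proxy for local isometry. The network sizes, the `isometry_penalty` function, and the weight `lam` are assumptions made for this sketch and are not the paper's actual formulation.

```python
# Hypothetical sketch (not the paper's method): an autoencoder trained with
# an extra penalty that encourages local isometry, i.e. pairwise distances
# between inputs are approximately preserved in latent space.
import torch
import torch.nn as nn

class AutoEncoder(nn.Module):
    def __init__(self, x_dim=784, z_dim=2, hidden=256):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(x_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, z_dim))
        self.dec = nn.Sequential(nn.Linear(z_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, x_dim))

    def forward(self, x):
        z = self.enc(x)
        return z, self.dec(z)

def isometry_penalty(x, z):
    # Mismatch between pairwise input distances and pairwise latent
    # distances within the mini-batch (an illustrative proxy only).
    dx = torch.cdist(x, x)
    dz = torch.cdist(z, z)
    return ((dx - dz) ** 2).mean()

model = AutoEncoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
lam = 0.1  # regularization weight, chosen arbitrarily for the sketch

x = torch.randn(64, 784)  # stand-in mini-batch of flattened images
z, x_hat = model(x)
loss = ((x - x_hat) ** 2).mean() + lam * isometry_penalty(x, z)
opt.zero_grad()
loss.backward()
opt.step()
```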
Author Information
Daniella Horan (Hebrew University of Jerusalem)
Eitan Richardson (The Hebrew University of Jerusalem)
Yair Weiss (Hebrew University)
Yair Weiss is an Associate Professor at the Hebrew University School of Computer Science and Engineering. He received his Ph.D. from MIT working with Ted Adelson on motion analysis and did postdoctoral work at UC Berkeley. Since 2005 he has been a fellow of the Canadian Institute for Advanced Research. With his students and colleagues he has co-authored award-winning papers in NIPS (2002), ECCV (2006), UAI (2008) and CVPR (2009).
More from the Same Authors
- 2018 Poster: On GANs and GMMs » Eitan Richardson · Yair Weiss
- 2018 Spotlight: On GANs and GMMs » Eitan Richardson · Yair Weiss
- 2015 Poster: The Return of the Gating Network: Combining Generative Models and Discriminative Training in Natural Image Priors » Dan Rosenbaum · Yair Weiss
- 2015 Spotlight: The Return of the Gating Network: Combining Generative Models and Discriminative Training in Natural Image Priors » Dan Rosenbaum · Yair Weiss
- 2013 Poster: Learning the Local Statistics of Optical Flow » Dan Rosenbaum · Daniel Zoran · Yair Weiss
- 2012 Poster: Natural Images, Gaussian Mixtures and Dead Leaves » Daniel Zoran · Yair Weiss
- 2012 Poster: Learning about Canonical Views from Internet Image Collections » Elad Mezuman · Yair Weiss
- 2009 Invited Talk: Learning and Inference in Low-Level Vision » Yair Weiss
- 2009 Poster: Semi-Supervised Learning in Gigantic Image Collections » Rob Fergus · Yair Weiss · Antonio Torralba
- 2009 Oral: Semi-Supervised Learning in Gigantic Image Collections » Rob Fergus · Yair Weiss · Antonio Torralba
- 2009 Poster: The "tree-dependent components" of natural scenes are edge filters » Daniel Zoran · Yair Weiss
- 2008 Poster: Spectral Hashing » Yair Weiss · Antonio Torralba · Rob Fergus