Poster
Learning sparse inverse covariance matrices in the presence of confounders
Oliver Stegle · Christoph Lippert · Joris M Mooij · Neil D Lawrence · Karsten Borgwardt

Wed Dec 14 08:45 AM -- 02:59 PM (PST)

Inference in matrix-variate Gaussian models has major applications for multi-output prediction and joint learning of row and column covariances from matrix-variate data. Here, we discuss an approach for efficient inference in such models that explicitly accounts for i.i.d. observation noise. Computational tractability can be retained by exploiting the Kronecker product between row and column covariance matrices. Using this framework, we show how to generalize the Graphical Lasso to learn a sparse inverse covariance between features while accounting for a low-rank confounding covariance between samples. We show practical utility on applications to biology, where we model covariances with more than 100,000 dimensions. We find greater accuracy in recovering biological network structures and are able to better reconstruct the confounders.
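
The abstract combines two ingredients: a low-rank confounding covariance across samples and a sparse inverse covariance across features, learned jointly under a Kronecker-structured matrix-variate Gaussian with i.i.d. noise. The sketch below is not the authors' joint inference scheme; it is a simplified two-step stand-in (PCA-based confounder removal followed by the standard Graphical Lasso from scikit-learn) that only illustrates these two ingredients. The simulated data, the number of confounders, and the regularization strength `alpha` are illustrative assumptions, not parameters from the paper.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)
n_samples, n_features, n_confounders = 200, 50, 3

# Simulate matrix-variate data with a low-rank confounding signal across samples.
confounders = rng.normal(size=(n_samples, n_confounders))
loadings = rng.normal(size=(n_confounders, n_features))
Y = confounders @ loadings + rng.normal(scale=0.5, size=(n_samples, n_features))

# Step 1: estimate the dominant low-rank (confounder-like) structure and remove it.
pca = PCA(n_components=n_confounders)
scores = pca.fit_transform(Y)
Y_resid = Y - pca.inverse_transform(scores)

# Step 2: learn a sparse inverse covariance between features on the residuals.
model = GraphicalLasso(alpha=0.1, max_iter=200).fit(Y_resid)
precision = model.precision_
off_diag = precision - np.diag(np.diag(precision))
print("nonzero off-diagonal entries:", np.count_nonzero(off_diag))
```

In the paper's model the row (sample) and column (feature) covariances are instead estimated jointly, and the Kronecker product structure is what keeps that joint inference tractable at the reported scale.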

Author Information

Oliver Stegle (German Cancer Research Center)
Christoph Lippert (Human Longevity, Inc.)
Joris M Mooij (University of Amsterdam)
Neil D Lawrence (University of Cambridge)
Karsten Borgwardt (ETH Zurich)

Karsten Borgwardt is Professor of Data Mining at ETH Zürich, at the Department of Biosystems located in Basel. His work has won several awards, including the NIPS 2009 Outstanding Paper Award, the 2013 Krupp Award for Young Professors, and a 2014 Starting Grant from the ERC backup scheme of the Swiss National Science Foundation. Since 2013, he has headed the Marie Curie Initial Training Network "Machine Learning for Personalized Medicine", with 12 partner labs in 8 countries (http://www.mlpm.eu). The business magazine "Capital" listed him as one of the "Top 40 under 40" in science in and from Germany in 2014, 2015, and 2016. For more information, visit: https://www.bsse.ethz.ch/mlcb
