Workshop
NIPS Highlights (MLTrain): Learn How to Code a Paper with State-of-the-Art Frameworks
Alexandros Dimakis · Nikolaos Vasiloglou · Guy Van den Broeck · Alexander Ihler · Assaf Araki

Sat Dec 09 08:00 AM -- 06:30 PM (PST) @ 202
Event URL: https://mltrain.cc

Every year hundreds of papers are published at NIPS. Although the authors provide sound, scientific descriptions and proofs of their ideas, there is no space to explain all the tricks and details that make an implementation of the paper actually work. The goal of this workshop is to help authors evangelize their papers to industry and to expose participants to the Machine Learning/Artificial Intelligence know-how that cannot be found in the papers themselves. The effect and importance of tuning parameters, for example, is rarely discussed, also for lack of space.
Submissions
We encourage you to prepare a poster of your favorite paper that explains, graphically and at a high level, the concepts and ideas discussed in it. You should also submit a Jupyter notebook that explains in detail how the equations in the paper translate to code; a minimal sketch of what that might look like follows below. You are welcome to use any of the popular platforms such as TensorFlow, Keras, MXNet, CNTK, etc.
For more information, visit https://www.mltrain.cc/
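As a purely illustrative sketch of the "equations translate to code" requirement (this is a hypothetical notebook cell, not an actual submission; plain NumPy is used here rather than any of the frameworks named above), a notebook might pair the softmax definition softmax(z)_i = exp(z_i) / Σ_j exp(z_j) with its direct translation to code, plus the kind of numerical trick papers rarely have space to mention:

```python
import numpy as np

def softmax(z):
    """Direct translation of softmax(z)_i = exp(z_i) / sum_j exp(z_j).

    Subtracting max(z) first is exactly the kind of implementation
    detail this workshop is about: it prevents overflow in exp()
    without changing the result, but rarely appears in the paper.
    """
    shifted = z - np.max(z)   # stability trick, absent from the equation
    exps = np.exp(shifted)
    return exps / np.sum(exps)

print(softmax(np.array([1.0, 2.0, 3.0])))  # sums to 1.0
```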

Sat 9:00 a.m. - 9:45 a.m.
Lessons learned from designing Edward (Keynote)
Dustin Tran
Sat 9:45 a.m. - 10:05 a.m.
Tips and tricks of coding papers on PyTorch (Demonstration)
Soumith Chintala
Sat 10:05 a.m. - 10:25 a.m.
Differentiable Learning of Logical Rules for Knowledge Base Reasoning (Demonstration)
William Cohen, Fan Yang
Sat 10:45 a.m. - 11:15 a.m.
Coding Reinforcement Learning Papers (Keynote)
Shangtong Zhang
Sat 11:15 a.m. - 11:35 a.m.
A Linear-Time Kernel Goodness-of-Fit Test (NIPS best paper) (Demonstration)
Wittawat Jitkrittum
Sat 11:35 a.m. - 11:55 a.m.
Imagination-Augmented Agents for Deep Reinforcement Learning (Demonstration)
Seb Racanière
Sat 11:55 a.m. - 12:15 p.m.
Inductive Representation Learning on Large Graphs (Demonstration)
Will Hamilton
Sat 12:15 p.m. - 12:35 p.m.
Probabilistic Programming with Pyro (Demonstration)
Noah Goodman
Sat 12:35 p.m. - 2:00 p.m.
Poster Session (Lunch Break)
Sat 2:00 p.m. - 2:45 p.m.

In this talk I will discuss how to easily and efficiently develop neural network models for complicated problems, such as natural language processing, using dynamic neural networks. First, I will briefly explain the different paradigms in neural network frameworks: static (e.g. TensorFlow), dynamic and eager (e.g. PyTorch), and dynamic and lazy (e.g. DyNet). I will then discuss how to implement models efficiently within dynamic frameworks, including minimizing the number of computations and mini-batching. Finally, I will introduce our recently proposed method for automatic batching in dynamic networks, which makes it much easier to implement complicated networks efficiently. Code examples for the implementation will be provided.

Graham Neubig
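A minimal sketch of the dynamic-graph style this abstract contrasts with static graphs (assuming PyTorch, which the abstract cites as a dynamic, eager framework; the model and sizes below are invented for illustration and are not from the talk): the computation graph is rebuilt per example, so control flow can depend on the data.

```python
import torch
import torch.nn as nn

# Illustrative sizes, not from the talk.
cell = nn.RNNCell(input_size=8, hidden_size=16)

def encode(sequence):
    """Dynamic graph: the loop length depends on the input, so each
    example builds its own computation graph (eager execution)."""
    h = torch.zeros(1, 16)
    for x in sequence:               # data-dependent control flow
        h = cell(x.view(1, -1), h)
    return h

# Two sequences of different lengths. A static graph would need
# padding and masking to batch these; automatic batching in a
# dynamic, lazy framework (as in DyNet) instead batches compatible
# operations behind the scenes.
short = torch.randn(3, 8)
long = torch.randn(7, 8)
print(encode(short).shape, encode(long).shape)
```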
Sat 2:45 p.m. - 3:05 p.m.
Learning Texture Manifolds with the Periodic Spatial GAN by Nikolay Jetchev, Zalando (Demonstration)
Roland Vollgraf
Sat 3:05 p.m. - 3:40 p.m.
MLPACK, A case study: implementing ID3 decision trees to be as fast as possible (Keynote)
Ryan Curtin
Sat 3:40 p.m. - 4:00 p.m.
Self-Normalizing Neural Networks (Demonstration)
Tom Unterthiner
Sat 4:00 p.m. - 4:25 p.m.
Best of Both Worlds: Transferring Knowledge from Discriminative Learning to a Generative Visual Dialog Model (Demonstration)
Jiasen Lu
Sat 4:25 p.m. - 4:55 p.m.
Break
Sat 5:00 p.m. - 6:00 p.m.

Ben Athiwaratkun: Bayesian GAN in PyTorch

Dhyani Dushyanta: A Convolutional Encoder Model for Neural Machine Translation

Forough Arabshahi: Combining Symbolic Expressions and Black-box Function Evaluations in Neural Programs

Jean Kossaifi: Tensor Regression Networks with TensorLy and MXNet

Joseph Paul Cohen: ShortScience.org - Reproducing Intuition

Kamyar Azizzadenesheli: Efficient Exploration through Bayesian Deep Q-Networks

Ashish Khetan: Learning from noisy, single-labeled data

Rose Yu: Long-Term Forecasting using Tensor-Train RNNs

Shayenne da Lu Moura: Melody Transcription System

Tschannen Michael: Fast Linear Algebra in Stacked Strassen Networks

Yang Shi: Multimodal Compact Bilinear Pooling for Visual Question Answering

Yu-Chia Chen: Improved Graph Laplacian via Geometric Consistency

Author Information

Alex Dimakis (University of Texas, Austin)
Nikolaos Vasiloglou (RelationalAI)
Guy Van den Broeck (UCLA)

I am an Assistant Professor and Samueli Fellow at UCLA, in the Computer Science Department, where I direct the Statistical and Relational Artificial Intelligence (StarAI) lab. My research interests are in Machine Learning (Statistical Relational Learning, Tractable Learning), Knowledge Representation and Reasoning (Graphical Models, Lifted Probabilistic Inference, Knowledge Compilation), Applications of Probabilistic Reasoning and Learning (Probabilistic Programming, Probabilistic Databases), and Artificial Intelligence in general.

Alexander Ihler (UC Irvine)
Assaf Araki (Intel)
