
On the Expressive Power of Restricted Boltzmann Machines
James Martens · Arkadev Chattopadhyay · Toni Pitassi · Richard Zemel

Fri Dec 06 07:00 PM -- 11:59 PM (PST) @ Harrah's Special Events Center, 2nd Floor

This paper examines the question: What kinds of distributions can be efficiently represented by Restricted Boltzmann Machines (RBMs)? We characterize the RBM's unnormalized log-likelihood function as a type of neural network (called an RBM network), and through a series of simulation results relate these networks to types that are better understood. We show the surprising result that RBM networks can efficiently compute any function that depends only on the number of 1's in the input, such as parity. We also provide the first known example of a particular type of distribution that provably cannot be efficiently represented by an RBM (or, equivalently, cannot be efficiently computed by an RBM network), assuming a realistic exponential upper bound on the size of the weights. By formally demonstrating that a relatively simple distribution cannot be represented efficiently by an RBM, our results provide a new rigorous justification for the use of potentially more expressive generative models, such as deeper ones.

Author Information

James Martens (University of Toronto)
Arkadev Chattopadhyay (Tata Institute of Fundamental Research)
Toni Pitassi (University of Toronto)
Richard Zemel (Vector Institute/University of Toronto)
