Spotlight
Weighted Sums of Random Kitchen Sinks: Replacing minimization with randomization in learning
Ali Rahimi · Benjamin Recht

Tue Dec 09 05:24 PM -- 05:25 PM (PST) @ None

Randomized neural networks are immortalized in this AI koan: In the days when Sussman was a novice, Minsky once came to him as he sat hacking at the PDP-6. "What are you doing?" asked Minsky. "I am training a randomly wired neural net to play tic-tac-toe," Sussman replied. "Why is the net wired randomly?" asked Minsky. Sussman replied, "I do not want it to have any preconceptions of how to play." Minsky then shut his eyes. "Why do you close your eyes?" Sussman asked his teacher. "So that the room will be empty," replied Minsky. At that moment, Sussman was enlightened.

We analyze shallow random networks with the help of concentration-of-measure inequalities. Specifically, we consider architectures that compute a weighted sum of their inputs after passing them through a bank of arbitrary randomized nonlinearities. We identify conditions under which these networks exhibit good classification performance, and bound their test error in terms of the size of the dataset and the number of random nonlinearities.
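The architecture described above can be sketched in a few lines: draw the nonlinearities' parameters at random, never train them, and fit only the linear output weights. This is a minimal illustrative sketch, not the authors' code; the choice of random cosine features, the ridge penalty, and the toy two-blob dataset are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary classification data: two well-separated noisy blobs (assumed for illustration).
n, d = 200, 2
X = np.vstack([rng.normal(-1.0, 0.5, (n // 2, d)),
               rng.normal(+1.0, 0.5, (n // 2, d))])
y = np.hstack([-np.ones(n // 2), np.ones(n // 2)])

# Random kitchen sinks: draw K random nonlinearities up front and freeze them.
K = 100
W = rng.normal(size=(d, K))        # random input weights (never trained)
b = rng.uniform(0, 2 * np.pi, K)   # random phases
Z = np.cos(X @ W + b)              # bank of randomized nonlinearities

# Fit only the output weights -- a convex problem -- here by ridge regression.
lam = 1e-3
alpha = np.linalg.solve(Z.T @ Z + lam * np.eye(K), Z.T @ y)

preds = np.sign(Z @ alpha)
accuracy = (preds == y).mean()
```

Because the randomized features are fixed, all the "learning" reduces to solving a single regularized least-squares problem in the output weights, which is the minimization-to-randomization trade the title refers to.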

Author Information

Ali Rahimi (Intel)
Benjamin Recht (California Institute of Technology)
