Poster
Feedforward Learning of Mixture Models
Matthew Lawlor · Steven W Zucker

Tue Dec 9th 07:00 -- 11:59 PM @ Level 2, room 210D

We develop a biologically plausible learning rule that provably converges to the class means of general mixture models. This rule generalizes the classical BCM neural rule within a tensor framework, substantially increasing the generality of the learning problem it solves. It achieves this by incorporating triplets of samples from the mixtures, which provides a novel information-processing interpretation of spike-timing-dependent plasticity (STDP). We provide both proofs of convergence and a close fit to experimental data on STDP.
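For intuition, the sketch below shows the classical BCM rule (not the paper's generalized rule) applied to data drawn from a simple two-component mixture: the weight vector tends to become selective for one component. The paper's contribution, a triplet-based, tensor-framework generalization with convergence guarantees, is not reproduced here; the mixture parameters, learning rates, and helper names in the snippet are illustrative assumptions only.

```python
# Minimal sketch, assuming a two-component Gaussian mixture in R^2.
# Classical BCM update: dw ~ y (y - theta) x, with a sliding threshold
# theta tracking E[y^2]. This is NOT the triplet/tensor rule of the paper.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical, well-separated class means (illustrative only).
means = np.array([[3.0, 0.0], [0.0, 3.0]])


def sample(n):
    """Draw n points from an equal-weight mixture around `means`."""
    labels = rng.integers(0, 2, size=n)
    return means[labels] + 0.1 * rng.standard_normal((n, 2))


def bcm_train(steps=20000, lr=1e-3, tau=0.01):
    """Run the classical BCM update on streaming mixture samples."""
    w = 0.1 * rng.standard_normal(2)
    theta = 1.0
    for x in sample(steps):
        y = w @ x                      # feedforward (linear) response
        w += lr * y * (y - theta) * x  # BCM synaptic update
        theta += tau * (y**2 - theta)  # running estimate of E[y^2]
    return w


w = bcm_train()
# The learned direction tends to align with one of the class means,
# i.e., the neuron becomes selective for a single mixture component.
print(w / np.linalg.norm(w))
```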

Author Information

Matthew Lawlor (Yale University)
Steven W Zucker (Yale University)