Poster
Over-complete representations on recurrent neural networks can support persistent percepts
Shaul Druckmann · Dmitri B Chklovskii

Tue Dec 07 12:00 AM -- 12:00 AM (PST)

A striking aspect of cortical neural networks is the divergence of a relatively small number of input channels from the peripheral sensory apparatus into a large number of cortical neurons, an over-complete representation strategy. Cortical neurons are then connected by a sparse network of lateral synapses. Here we propose that such an architecture may increase the persistence of the representation of an incoming stimulus, or a percept. We demonstrate that for a family of networks in which the receptive field of each neuron is re-expressed by its outgoing connections, a represented percept can remain constant despite changing activity. We term networks with this choice of connectivity REceptive FIeld REcombination (REFIRE) networks. The sparse REFIRE network may serve as a high-dimensional integrator and a biologically plausible model of the local cortical circuit.
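
To make the persistence mechanism concrete, the following is a minimal numerical sketch of the underlying linear-algebra point rather than the paper's specific REFIRE construction: when the readout (receptive-field) matrix is over-complete and the lateral dynamics move activity only within its null space, the represented percept D a stays fixed while the activity pattern a drifts. All dimensions, variable names, and the random choice of lateral interactions below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: M input channels, N > M cortical neurons
# (an over-complete representation).
M, N = 20, 100

# D maps neural activity a (length N) to the represented percept D @ a;
# its columns play the role of receptive fields.
D = rng.standard_normal((M, N))

# Projector onto the null space of D: any activity change lying in this
# subspace leaves the represented percept unchanged.
P_null = np.eye(N) - np.linalg.pinv(D) @ D

# Arbitrary lateral interactions, restricted to the null space of D,
# so that D @ W_lat = 0 up to round-off.
K = 0.1 * rng.standard_normal((N, N))
W_lat = P_null @ K

# Encode an incoming stimulus as one of the many activity patterns
# representing it, then let the lateral dynamics run.
x = rng.standard_normal(M)
a = np.linalg.pinv(D) @ x
a0 = a.copy()

dt, steps = 0.01, 500
for _ in range(steps):
    a = a + dt * (W_lat @ a)   # activity keeps changing within the null space

print("activity drift:", np.linalg.norm(a - a0))          # clearly nonzero
print("percept drift :", np.linalg.norm(D @ a - D @ a0))  # near machine precision
```

Running this sketch shows a substantial change in the activity vector while the readout D a is constant to numerical precision, illustrating how an over-complete representation with suitably constrained lateral connectivity can keep a percept persistent despite ongoing changes in neural activity.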

Author Information

Shaul Druckmann (Janelia Farm Research Campus)
Dmitri B Chklovskii (HHMI)
