

Beyond BackPropagation: Novel Ideas for Training Neural Architectures

Mateusz Malinowski · Grzegorz Swirszcz · Viorica Patraucean · Marco Gori · Yanping Huang · Sindy Löwe · Anna Choromanska

Is backpropagation the ultimate tool on the path to synthetic intelligence, as its success and widespread adoption would suggest?

The biological plausibility of backpropagation as a learning mechanism has been questioned since its introduction; the weight transport and timing problems are the most frequently raised objections. The same properties also have practical consequences: backpropagation is a global, coupled procedure, which limits the available parallelism and yields high latency during training.
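One line of work motivated by the weight transport problem replaces the transposed forward weights in the backward pass with fixed random feedback weights, known as feedback alignment. A minimal NumPy sketch of the idea, where the two-layer network, the toy regression task, and all shapes are illustrative assumptions rather than anything prescribed by the workshop:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(64, 4))                 # toy inputs (assumed)
Y = np.tanh(X @ rng.normal(size=(4, 1)))     # toy targets (assumed)

W1 = rng.normal(scale=0.5, size=(4, 8))      # forward weights, layer 1
W2 = rng.normal(scale=0.5, size=(8, 1))      # forward weights, layer 2
B = rng.normal(scale=0.5, size=(1, 8))       # fixed random feedback matrix

lr, losses = 0.05, []
for _ in range(500):
    h = np.tanh(X @ W1)                      # forward pass
    y = h @ W2
    losses.append(float(np.mean((y - Y) ** 2)))
    e = 2 * (y - Y) / len(X)                 # gradient of the MSE at the output
    dW2 = h.T @ e                            # exact gradient for W2
    dh = (e @ B) * (1 - h ** 2)              # feedback alignment: B, not W2.T
    dW1 = X.T @ dh
    W1 -= lr * dW1
    W2 -= lr * dW2

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

Because the error signal for layer 1 no longer depends on transporting W2 backwards, each layer's update rule uses only locally available quantities plus a fixed random projection of the output error; empirically the forward weights tend to align with B over training, which is why learning still progresses.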

These limitations motivate the search for alternative training methods. In this workshop, we aim to promote such discussions by bringing together researchers from various related disciplines to explore possible solutions from engineering, machine learning, and neuroscience perspectives.
