Each Oral includes its own Q&A
Spotlights share joint Q&A sessions
[6:00]
Hogwild!: A Lock-Free Approach to Parallelizing Stochastic Gradient Descent
[6:15]
Entropic Optimal Transport between Unbalanced Gaussian Measures has a Closed Form
[6:30]
Acceleration with a Ball Optimization Oracle
[6:45]
Convex optimization based on global lower second-order models
[7:00]
Adam with Bandit Sampling for Deep Learning
[7:10]
Explore Aggressively, Update Conservatively: Stochastic Extragradient Methods with Variable Stepsize Scaling
[7:20]
IDEAL: Inexact DEcentralized Accelerated Augmented Lagrangian Method
[7:30]
Revisiting Frank-Wolfe for Polytopes: Strict Complementarity and Sparsity
[7:40]
Joint Q&A for Preceding Spotlights
[7:50]
Minibatch Stochastic Approximate Proximal Point Methods
[8:00]
Finding Second-Order Stationary Points Efficiently in Smooth Nonconvex Linearly Constrained Optimization Problems
[8:10]
Least Squares Regression with Markovian Data: Fundamental Limits and Algorithms
[8:20]
Linearly Converging Error Compensated SGD
[8:30]
Learning Augmented Energy Minimization via Speed Scaling
[8:40]
Joint Q&A for Preceding Spotlights