Iterated Vector Fields and Conservatism, with Applications to Federated Learning
Zachary Charles · John Rush

We study when iterated vector fields (vector fields composed with themselves) are conservative. We give explicit examples of vector fields for which this self-composition preserves conservatism. Notably, this includes gradient vector fields of loss functions associated to some generalized linear models (including non-convex functions). As we show, characterizing the set of smooth vector fields satisfying this condition yields non-trivial geometric questions. In the context of federated learning, we show that when clients have loss functions whose gradient satisfies this condition, federated averaging is equivalent to gradient descent on a surrogate loss function. We leverage this to derive novel convergence results for federated learning. By contrast, we demonstrate that when the client losses violate this property, federated averaging can yield behavior which is fundamentally distinct from centralized optimization. Finally, we discuss theoretical and practical questions our analytical framework raises for federated learning.
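The equivalence described above can be illustrated in the simplest conservative case. The sketch below is an assumption-laden toy example, not the paper's construction: it takes quadratic client losses f_i(x) = ½‖x − c_i‖², whose gradient fields are affine (so their self-compositions remain conservative), and checks numerically that one round of federated averaging matches a single gradient step on a surrogate quadratic centered at the mean of the c_i. All function names and the specific setup are illustrative.

```python
import numpy as np

def fedavg_round(x0, centers, lr, local_steps):
    """One FedAvg round: each client runs local gradient descent
    on its own loss 0.5*||x - c_i||^2, then the server averages."""
    client_models = []
    for c in centers:
        x = x0.copy()
        for _ in range(local_steps):
            x = x - lr * (x - c)  # gradient of 0.5*||x - c||^2 is (x - c)
        client_models.append(x)
    return np.mean(client_models, axis=0)

def surrogate_step(x0, centers, lr, local_steps):
    """Single gradient step on the surrogate loss 0.5*||x - mean(c_i)||^2.
    The effective step size 1 - (1 - lr)**local_steps folds the K local
    steps into one; this closed form is specific to the quadratic case."""
    eff = 1.0 - (1.0 - lr) ** local_steps
    c_bar = np.mean(centers, axis=0)
    return x0 - eff * (x0 - c_bar)

rng = np.random.default_rng(0)
x0 = rng.normal(size=3)
centers = [rng.normal(size=3) for _ in range(5)]

a = fedavg_round(x0, centers, lr=0.1, local_steps=10)
b = surrogate_step(x0, centers, lr=0.1, local_steps=10)
assert np.allclose(a, b)  # FedAvg round == one surrogate gradient step
```

For quadratics the local update is the affine map x ↦ (1 − lr)·x + lr·c_i, so K local steps and the server average collapse into the single closed-form step above; the paper's contribution is characterizing when such a reduction to a surrogate loss holds for more general (including non-convex) client losses.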

Author Information

Zachary Charles (Google Research)
John Rush (Google)

I come from a pure mathematics background, formerly as a harmonic analyst and mathematical physicist. I moved to machine learning on the software side after grad school and joined Google in 2018, where I work on federated learning. I am one of the main authors of TensorFlow Federated; ask me about it!
