Fixed-Order Lexicographic Optimization via the $\lambda$-ladder Exponential Loss
Diego Calanzone · Arielle GazzĂ© · Pierre-Luc Bacon
Abstract
We revisit fixed-order lexicographic optimization from a constrained-optimization perspective and show how to eliminate the stagewise constraints altogether. The textbook implementation is a cascade of constrained problems (the $\varepsilon$-constraint method) that preserves higher-priority optimality via feasibility tolerances. We propose a simple, differentiable *constraint-elimination reduction*: the $\lambda$-ladder exponential loss. With a user-specified order, a sharpness $c$, and a target gap $\delta$, our construction replaces the constrained cascade by a *single unconstrained objective* whose minimizers approach lexicographic solutions as $c$ grows. We provide a leakage-to-parameter rule linking $(c,\delta)$ to an upper bound on lower-priority influence, relate the construction to classical hard-gap single-objective reductions for linear objectives, and give numerically stable log-domain evaluations. The result is a constraint-free pipeline that reproduces the intent of lexicographic constraints without solving sequential constrained subproblems.
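The abstract does not spell out the ladder itself, but a minimal sketch is easy to state under one common assumption: take geometric weights $\lambda_i = \exp(-c \cdot i)$ over the priority ranks $i = 0, 1, \dots$, so the highest-priority loss carries unit weight and each lower level is exponentially suppressed, with minimizers approaching lexicographic solutions as $c$ grows. The names `lambda_ladder_loss`, `log_lambda_ladder_loss`, and `min_sharpness` below are illustrative, not from the paper; the log-domain variant uses `scipy.special.logsumexp` for the kind of numerically stable evaluation the abstract mentions.

```python
import numpy as np
from scipy.special import logsumexp

def lambda_ladder_loss(losses, c):
    """Single unconstrained surrogate for fixed-order lexicographic optimization.

    Sketch assuming ladder weights lambda_i = exp(-c * i), where i is the
    priority rank (0 = highest). As the sharpness c grows, lower-priority
    losses are exponentially suppressed.

    losses: 1-D array of per-objective losses, highest priority first.
    """
    ranks = np.arange(len(losses))
    return float(np.sum(np.exp(-c * ranks) * np.asarray(losses)))

def log_lambda_ladder_loss(log_losses, c):
    """Log-domain evaluation of the same surrogate for strictly positive losses.

    Given log f_i, returns log sum_i exp(-c * i) * f_i via logsumexp,
    which avoids underflow of the weights exp(-c * i) when c is large.
    """
    ranks = np.arange(len(log_losses))
    return logsumexp(np.asarray(log_losses) - c * ranks)

def min_sharpness(B, delta):
    """One plausible leakage-to-parameter rule under the assumed weights.

    With lambda_i = exp(-c * i), the total weight below the top level is
    1 / (exp(c) - 1); if every lower-priority loss is bounded by B, the
    perturbation to the top-level objective is at most B / (exp(c) - 1).
    Keeping this leakage below the target gap delta requires
    c >= log(1 + B / delta).
    """
    return np.log1p(B / delta)
```

As a toy illustration of how the two parameters interact: with three ordered losses and lower-priority values bounded by $B = 12$, a target gap $\delta = 10^{-3}$ gives `min_sharpness(12, 1e-3)` $\approx 9.4$, at which point the second-priority objective enters `lambda_ladder_loss` with weight $\exp(-9.4) \approx 8 \times 10^{-5}$. This is a sketch under the assumed geometric ladder, not the paper's exact construction or bound.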