Abstract:
Nonsmooth phenomena are ubiquitous in optimization, arising from constraints, solution maps, regularizers, and conditional branching in code. Classical tools in variational analysis often rely on qualification conditions that are hard to check and rarely met in computational settings.
This talk presents conservative calculus: a qualification-free framework aligned with algorithmic implementation and compatible with automatic differentiation libraries (PyTorch, JAX, TensorFlow). For functions given by analytic or polynomial formulas, including implicit definitions, subgradients can be composed via a chain rule to form a conservative Jacobian.
Mathematically, locally Lipschitz functions definable in an o-minimal structure (e.g., semi-algebraic functions) fit this notion; informally, so do many finite-dimensional locally Lipschitz functions built from standard operations (+, ×, exp, cos, min/max, argmin/argmax, …).
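To make the chain-rule mechanism concrete, here is a minimal forward-mode sketch (not from the talk; the `Dual` and `relu` names are illustrative). Each operation propagates a chosen element of a conservative derivative, exactly as an autodiff library would. The example is the classical identity relu(x) − relu(−x) = x: at x = 0 the composed conservative element is 0 while the true derivative is 1, illustrating that conservative Jacobians may disagree with the gradient only on a negligible set.

```python
class Dual:
    """A (value, derivative-element) pair propagated by forward-mode rules."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        return Dual(self.val + other.val, self.dot + other.dot)

    def __sub__(self, other):
        return Dual(self.val - other.val, self.dot - other.dot)

    def __neg__(self):
        return Dual(-self.val, -self.dot)


def relu(x):
    # At the kink x = 0, select the element 0 from the Clarke
    # subdifferential [0, 1]; any fixed selection yields a
    # conservative derivative when composed by the chain rule.
    if x.val > 0:
        return Dual(x.val, x.dot)
    return Dual(0.0, 0.0)


def f(x):
    # f(x) = relu(x) - relu(-x) equals x for every real x.
    return relu(x) - relu(-x)


# Away from 0 the conservative element agrees with the gradient;
# at 0 the chain rule returns 0, a valid conservative choice.
print(f(Dual(1.0, 1.0)).dot)   # derivative element at x = 1
print(f(Dual(0.0, 1.0)).dot)   # derivative element at x = 0
```

This is the same selection behavior exhibited by PyTorch, JAX, and TensorFlow, whose ReLU backward rule returns 0 at the origin.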
Applications include training deep networks (explicit and implicit), differentiating solution maps, and differentiating algorithms.