Presented By: Department of Mathematics
Variational Analysis and Optimization Seminar
The Fast Augmented Lagrangian Method in continuous and discrete time
In this talk, we address the minimization of a continuously differentiable convex function under linear equality constraints.
First, we consider a second-order dynamical system with an asymptotically vanishing damping term, formulated in terms of the Augmented Lagrangian associated with the minimization problem. We show fast convergence of the primal-dual gap, the feasibility measure, and the objective function value along the generated trajectories, as well as asymptotic weak convergence of the primal-dual trajectory to a primal-dual optimal solution of the underlying minimization problem.
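The abstract does not write out the dynamical system itself. For orientation only, the classical unconstrained prototype of a second-order system with asymptotically vanishing damping is (this is an illustrative assumption, not the talk's actual primal-dual Augmented Lagrangian system):

```latex
\ddot{x}(t) + \frac{\alpha}{t}\,\dot{x}(t) + \nabla f(x(t)) = 0, \qquad \alpha \geq 3,
```

where $f$ is the convex objective and the damping coefficient $\alpha/t$ vanishes as $t \to \infty$. The system of the talk couples primal and dual trajectories through the Augmented Lagrangian in an analogous way.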
By an appropriate time discretization of the dynamical system, we derive an inertial algorithm with a general rule for the inertial parameters that covers the three classical rules by Nesterov, Chambolle-Dossal and Attouch-Cabot used in the literature in the context of fast gradient methods. We show that, in the convex regime, the algorithm exhibits convergence rates of order $O(1/k^2)$ for the primal-dual gap, the feasibility measure, and the objective function value. In addition, for the Chambolle-Dossal and Attouch-Cabot rules, the generated sequence of primal-dual iterates converges weakly to a primal-dual optimal solution.
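As a rough illustration of the inertial scheme in the unconstrained case, the following sketch runs a Nesterov-type accelerated gradient iteration with the Chambolle-Dossal-style inertial parameter $\gamma_k = k/(k+\alpha)$. This is an assumed simplification for a quadratic test problem, not the talk's primal-dual algorithm; the function names and parameters are illustrative.

```python
import numpy as np

def accelerated_gradient(grad, x0, step, alpha=4.0, iters=500):
    """Inertial (accelerated) gradient sketch with a
    Chambolle-Dossal-type inertial parameter k/(k + alpha)."""
    x_prev = x0.copy()
    x = x0.copy()
    for k in range(1, iters + 1):
        gamma = k / (k + alpha)       # vanishing-damping inertial parameter
        y = x + gamma * (x - x_prev)  # extrapolation (inertial) step
        x_prev = x
        x = y - step * grad(y)        # gradient step at the extrapolated point
    return x

# Toy problem: minimize f(x) = 0.5 * ||A x - b||^2.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
grad = lambda x: A.T @ (A @ x - b)
L = np.linalg.norm(A.T @ A, 2)        # Lipschitz constant of grad f
x_star = np.linalg.solve(A, b)        # exact minimizer (A invertible)
x_hat = accelerated_gradient(grad, np.zeros(2), 1.0 / L)
```

Swapping the rule for `gamma` (e.g. Nesterov's $t_k$-sequence or the Attouch-Cabot rule) changes only one line, which is the point of the general inertial-parameter rule the talk covers.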
For the unconstrained minimization of a convex differentiable function, we recover all convergence statements obtained in the literature for Nesterov's accelerated gradient method.
Please see the slides in the attached file below.
Zoom Link:
https://umich.zoom.us/j/95524251106?pwd=TUN5SnlVQzB5bGRtejZRU2NhVXpJUT09
Meeting ID: 955 2425 1106
Passcode: 491904
Speaker(s): Radu Bot (University of Vienna, Austria)