Facchinei, F., Kungurtsev, V., Lampariello, L., & Scutari, G. (2021). Ghost penalties in nonconvex constrained optimization: Diminishing stepsizes and iteration complexity. Mathematics of Operations Research, 46(2), 595-627. https://doi.org/10.1287/moor.2020.1079
Ghost penalties in nonconvex constrained optimization: Diminishing stepsizes and iteration complexity
Lampariello, L.
2021-01-01
Abstract
We consider nonconvex constrained optimization problems and propose a new approach to the convergence analysis based on penalty functions. We use classical penalty functions in an unconventional way: they enter only the theoretical convergence analysis, while the algorithm itself remains penalty-free. Building on this idea, we establish several new results, including the first general analysis of diminishing-stepsize methods in nonconvex constrained optimization, showing convergence to generalized stationary points, and an iteration-complexity study for sequential quadratic programming-type algorithms.
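As an informal illustration of the classical penalty functions the abstract refers to (a standard textbook construction, not necessarily the ghost-penalty machinery developed in the paper), a constrained problem

\min_{x} f(x) \quad \text{s.t.} \quad g_i(x) \le 0, \; i = 1, \dots, m,

can be associated with the penalized objective

W(x; \varepsilon) = f(x) + \frac{1}{\varepsilon} \sum_{i=1}^{m} \max\{0, g_i(x)\},

where \varepsilon > 0 is a penalty parameter; for \varepsilon sufficiently small, (generalized) stationary points of W are related to those of the constrained problem. Likewise, a diminishing-stepsize scheme of the kind mentioned in the abstract typically uses stepsizes \gamma^k > 0 with \gamma^k \to 0 and \sum_k \gamma^k = +\infty.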