Nonlinear Rescaling as Interior Quadratic Prox Method in Convex Optimization

Author: Roman Polyak  

Publisher: Springer Publishing Company

ISSN: 0926-6003

Source: Computational Optimization and Applications, Vol. 35, Iss. 3, 2006-11, pp. 347–373



Abstract

A class Ψ of strictly concave, twice continuously differentiable functions ψ: ℝ → ℝ with particular properties is used for constraint transformation in the framework of a Nonlinear Rescaling (NR) method with “dynamic” scaling parameter updates. We show that the NR method is equivalent to the Interior Quadratic Prox method for the dual problem in a rescaled dual space. This equivalence is used to prove convergence and to estimate the rate of convergence of the NR method and its dual equivalent under very mild assumptions on the input data, for a wide class Ψ of constraint transformations. It is also used to estimate the rate of convergence under strict complementarity and under the standard second-order optimality condition. We prove that for any ψ ∈ Ψ that corresponds to a well-defined dual kernel Φ = −ψ*, the NR method applied to LP generates a quadratically convergent dual sequence if the dual LP has a unique solution.
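As a concrete illustration (not part of the abstract itself, but a standard instance of this framework): the modified logarithmic barrier transformation ψ(t) = ln(1 + t) is the classical member of such a class Ψ. For a primal problem min{f(x) : c_i(x) ≥ 0, i = 1, …, m}, a commonly used form of the NR Lagrangian and the corresponding dual kernel are sketched below; the exact normalization and the scaling-parameter update rule in the paper may differ.

```latex
% A standard member of \Psi: the modified log-barrier transformation
\psi(t) = \ln(1+t), \qquad \psi(0) = 0, \quad \psi'(0) = 1, \quad \psi''(t) < 0.

% NR Lagrangian with scaling parameter k > 0 (a common formulation;
% the paper's "dynamic" updates rescale k per constraint):
\mathcal{L}(x, \lambda, k)
  = f(x) - k^{-1} \sum_{i=1}^{m} \lambda_i \, \psi\bigl(k \, c_i(x)\bigr).

% The associated dual kernel \Phi = -\psi^* is the Kullback--Leibler-type kernel
\Phi(s) = -\psi^{*}(s) = s - \ln s - 1, \qquad s > 0,
```

so the dual sequence generated by the NR method coincides with a proximal sequence whose distance function is built from Φ, which is the equivalence the abstract refers to.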