In this paper, we deal with matrix-free preconditioners for nonlinear conjugate gradient (NCG) methods. In particular, we review proposals based on quasi-Newton updates which satisfy either the secant equation or a secant-like condition at some of the previous iterates. Conditions are given showing that, in some sense, the proposed preconditioners also approximate the inverse of the Hessian matrix. In particular, the structure of the preconditioners depends on both low-rank updates and some specific parameters, where the low-rank updates are obtained as a by-product of the NCG iterations. Moreover, we consider the possibility of embedding damped techniques within a class of preconditioners based on quasi-Newton updates. Damped methods have proved effective in enhancing the performance of quasi-Newton updates in those cases where the Wolfe linesearch conditions are hardly fulfilled. The purpose is to extend the idea behind damped methods to the improvement of NCG schemes as well, following a novel line of research in the literature. The results of an extended numerical experience on large-scale CUTEst problems are reported, showing that these approaches can considerably improve the performance of NCG methods.
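As a rough illustration of the two ingredients summarized above, the sketch below combines a Powell-style damping of the pair (s, y) with a memoryless BFGS approximation of the inverse Hessian, applied matrix-free as a preconditioning step. It is not the authors' construction: the function names, the simplification B_k ≈ I inside the damping test, and the parameter sigma = 0.2 are illustrative assumptions.

```python
# Minimal sketch (not the authors' scheme): a damped, memoryless BFGS-like
# preconditioner built from the latest NCG pair (s, y), applied matrix-free.
# Names (damped_pair, apply_preconditioner) and sigma=0.2 are illustrative.
import numpy as np

def damped_pair(s, y, sigma=0.2):
    """Powell-style damping: replace y by a convex combination of y and B_k s
    so that the curvature condition s^T y_hat >= sigma * s^T B_k s holds.
    Here B_k is simplified to the identity, i.e. B_k s ~ s (an assumption)."""
    sty = s @ y
    sts = s @ s
    if sty >= sigma * sts:
        return y                              # curvature already acceptable
    theta = (1.0 - sigma) * sts / (sts - sty)
    return theta * y + (1.0 - theta) * s      # damped difference y_hat

def apply_preconditioner(g, s, y):
    """Apply a rank-two (memoryless BFGS) approximation of the inverse Hessian
    to the gradient g, using only the pair (s, y):
    H = (I - rho s y^T)(I - rho y s^T) + rho s s^T,  rho = 1 / (y^T s)."""
    rho = 1.0 / (y @ s)
    q = g - rho * (s @ g) * y        # (I - rho y s^T) g
    Hq = q - rho * (y @ q) * s       # (I - rho s y^T) q
    return Hq + rho * (s @ g) * s    # add rho s s^T g

# Tiny usage example on a convex quadratic f(x) = 0.5 x^T A x - b^T x.
A = np.diag([1.0, 10.0, 100.0])
b = np.ones(3)
grad = lambda x: A @ x - b

x_old = np.zeros(3)
x_new = np.array([0.1, 0.05, 0.01])            # pretend this came from an NCG step
s = x_new - x_old
y_hat = damped_pair(s, grad(x_new) - grad(x_old))
d = -apply_preconditioner(grad(x_new), s, y_hat)   # preconditioned direction
print("descent check (should be negative):", grad(x_new) @ d)
```

Since the damped pair guarantees y_hat^T s > 0, the rank-two approximation is positive definite and the preconditioned direction is a descent direction, which is the property the damping is meant to preserve when the Wolfe conditions are not enforced.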
Publication details
2018, Numerical Analysis and Optimization, vol. 235, pp. 1-21
Quasi-Newton based preconditioning and damped quasi-Newton schemes for nonlinear conjugate gradient methods (04b conference paper in volume)
Al-Baali Mehiddin, Caliciotti Andrea, Fasano Giovanni, Roma Massimo
ISBN: 978-3-319-90025-4; 978-3-319-90026-1
Research group: Continuous Optimization