A Modified Conjugate Gradient Method for Unconstrained Optimization

Can Li

Abstract


Conjugate gradient methods are an important class of methods for solving unconstrained optimization problems, especially large-scale problems, and they have received considerable attention in recent years. In this paper, we further study the conjugate gradient method for unconstrained optimization, focusing on descent conjugate gradient methods. We present a modified conjugate gradient method whose search direction is always a descent direction for the objective function; moreover, this property is independent of the line search used. Under mild conditions, we prove that the modified conjugate gradient method with an Armijo-type line search is globally convergent. We also report numerical results that show the efficiency of the proposed method.
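To make the abstract's setting concrete, the sketch below shows a generic descent conjugate gradient iteration with an Armijo-type backtracking line search. The paper's specific modified update formula is not reproduced in the abstract, so the Polak-Ribière+ beta and the steepest-descent restart used here are stand-in assumptions for illustration only, not the authors' method; function names and parameter values are likewise hypothetical.

```python
import numpy as np

def armijo(f, x, d, g, sigma=1e-4, rho=0.5, alpha0=1.0, max_backtracks=50):
    """Armijo-type backtracking line search:
    accept alpha with f(x + alpha d) <= f(x) + sigma * alpha * g^T d."""
    fx, gtd, alpha = f(x), g @ d, alpha0
    for _ in range(max_backtracks):
        if f(x + alpha * d) <= fx + sigma * alpha * gtd:
            return alpha
        alpha *= rho
    return alpha

def descent_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Illustrative descent conjugate gradient method (not the paper's update)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        alpha = armijo(f, x, d, g)
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Polak-Ribiere+ beta as a placeholder for the modified formula
        beta = max(0.0, (g_new @ (g_new - g)) / (g @ g))
        d = -g_new + beta * d
        # Safeguard so the direction stays a descent direction
        # (the paper achieves this by construction, independent of the line search)
        if g_new @ d >= 0.0:
            d = -g_new
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    # Example: minimize the Rosenbrock function from a standard starting point
    f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    grad = lambda x: np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2),
    ])
    print(descent_cg(f, grad, np.array([-1.2, 1.0])))
```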

 

DOI: http://dx.doi.org/10.11591/telkomnika.v11i11.2894




This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.