Abstract

The conjugate gradient (CG) method is an efficient technique for solving nonlinear optimization problems, especially large-scale problems, owing to its simplicity, low memory requirement, and low computational cost. This paper presents a new modification of the PRP method, constructed as a convex combination of the PRP and DPRP methods, which satisfies the sufficient descent condition. The descent property of the new method is analyzed under an exact line search. Numerical experiments indicate that the method has good convergence performance and is promising.
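
As a minimal sketch of the structure described above (not the paper's exact formulas): a convex-combination CG parameter can be written with standard notation, where g_k is the gradient, d_k the search direction, and θ_k ∈ [0,1] a mixing parameter; g_k, d_k, and θ_k are assumed notation here, and the specific DPRP formula and the rule for choosing θ_k are not reproduced.

```latex
% Sketch of a convex-combination CG direction; theta_k, g_k, d_k are assumed notation,
% and beta_k^{DPRP} stands for the DPRP parameter, whose formula is not given here.
\[
  d_k \;=\; -\,g_k \;+\; \beta_k^{\mathrm{new}}\, d_{k-1},
  \qquad
  \beta_k^{\mathrm{new}} \;=\; \theta_k\,\beta_k^{\mathrm{DPRP}}
    \;+\; (1-\theta_k)\,\beta_k^{\mathrm{PRP}},
  \qquad \theta_k \in [0,1],
\]
% with the classical PRP parameter
\[
  \beta_k^{\mathrm{PRP}}
    \;=\; \frac{g_k^{\top}\,(g_k - g_{k-1})}{\lVert g_{k-1}\rVert^{2}}.
\]
```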
