Zbl 1007.90065
Dai, Y.H.; Yuan, Y.
An efficient hybrid conjugate gradient method for unconstrained optimization.
(English)
[J] Ann. Oper. Res. 103, 33-47 (2001). ISSN 0254-5330; ISSN 1572-9338/e

Summary: Recently, we proposed a nonlinear conjugate gradient method that produces a descent search direction at every iteration and converges globally provided that the line search satisfies the weak Wolfe conditions. In this paper, we study methods related to this new nonlinear conjugate gradient method. Specifically, if the size of the scalar $\beta_k$ relative to the one in the new method belongs to a certain interval, then the corresponding methods are proved to be globally convergent; otherwise, we construct a convex quadratic example showing that the methods need not converge. Numerical experiments are reported for two combinations of the new method and the Hestenes-Stiefel conjugate gradient method. The initial results show that one of the hybrid methods is especially efficient for the given test problems.
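The summary describes hybridizing the authors' conjugate gradient update with the Hestenes-Stiefel one by constraining the scalar $\beta_k$. The sketch below illustrates one commonly cited form of such a hybrid, $\beta_k = \max\{0, \min\{\beta_k^{HS}, \beta_k^{DY}\}\}$, on a convex quadratic; the exact combination tested in the paper and its line-search details may differ, and the exact line search here (valid for quadratics) stands in for the weak Wolfe search the paper assumes.

```python
import numpy as np

def hybrid_cg(A, b, x0, tol=1e-10, max_iter=200):
    """Minimize f(x) = 0.5 x^T A x - b^T x (A symmetric positive definite)
    with a hybrid HS/DY nonlinear conjugate gradient iteration (a sketch,
    not the paper's exact algorithm)."""
    x = x0.astype(float)
    g = A @ x - b                          # gradient of f at x
    d = -g                                 # initial steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Exact line search along d (possible because f is quadratic).
        alpha = -(g @ d) / (d @ A @ d)
        x_new = x + alpha * d
        g_new = A @ x_new - b
        y = g_new - g                      # gradient difference
        denom = d @ y
        beta_hs = (g_new @ y) / denom          # Hestenes-Stiefel scalar
        beta_dy = (g_new @ g_new) / denom      # Dai-Yuan scalar
        beta = max(0.0, min(beta_hs, beta_dy)) # assumed hybrid truncation
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = hybrid_cg(A, b, np.zeros(2))
print(np.linalg.norm(A @ x - b))
```

On a quadratic with exact line searches the HS and DY scalars coincide, so the truncation is inactive here; its role in the paper is to guarantee descent and global convergence for general nonlinear objectives under weak Wolfe line searches.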
MSC 2000:
*90C30 Nonlinear programming
49M37 Methods of nonlinear programming type
65K05 Mathematical programming (numerical methods)

Keywords: unconstrained optimization; conjugate gradient method; line search; descent property; global convergence