Zbl 0957.65061
Dai, Y.H.; Yuan, Y.
A nonlinear conjugate gradient method with a strong global convergence property.
(English)
[J] SIAM J. Optim. 10, No.1, 177-182 (1999). ISSN 1052-6234; ISSN 1095-7189/e

The paper presents a new version of the conjugate gradient method $x_{k+1}=x_k+\alpha_k d_k$, $d_{k+1}=-f'(x_{k+1})+\beta_k d_k$, $\beta_k=\|f'(x_{k+1})\|^2/d_k^T(f'(x_{k+1})-f'(x_k))$ for solving the unconstrained optimization problem $\min_{x\in\Bbb R^n} f(x)$. Unlike the classical stepsize rule $\alpha_k=\text{argmin}\{f(x_k+\alpha d_k):\alpha\geq 0\}$, the authors analyse a scheme in which $\alpha_k$ is chosen arbitrarily subject to the Wolfe conditions $f(x_k)-f(x_k+\alpha_k d_k)\geq -\delta\alpha_k f^{\prime T}(x_k) d_k$ and $f^{\prime T}(x_k+\alpha_k d_k) d_k\geq\sigma f^{\prime T}(x_k) d_k$, where $0<\delta<\sigma<1$. The convergence $f'(x_k)\to 0$ is established provided that $f$ is bounded below on $N=\{x\in\Bbb R^n: f(x)\leq f(x_1)\}$ and $f'$ is Lipschitz continuous on $N$. The boundedness of $N$ is not assumed.
[Mikhail Yu.Kokurin (Yoshkar-Ola)]
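The iteration described in the review can be sketched as follows. This is a minimal illustrative implementation, not the authors' code: the helper `wolfe_line_search`, the parameter values $\delta=10^{-4}$, $\sigma=0.9$, and the stopping tolerance are all assumptions chosen for the sketch; only the update formulas for $d_{k+1}$ and $\beta_k$ and the two Wolfe conditions come from the review.

```python
import numpy as np

def wolfe_line_search(f, grad, x, d, delta, sigma, alpha=1.0):
    """Bisection/doubling search for a stepsize satisfying the two
    Wolfe conditions from the review (illustrative, not from the paper)."""
    lo, hi = 0.0, np.inf
    fx, gd = f(x), grad(x) @ d          # gd = f'(x)^T d, must be negative
    for _ in range(60):
        if f(x + alpha * d) > fx + delta * alpha * gd:
            hi = alpha                   # sufficient-decrease condition fails
            alpha = 0.5 * (lo + hi)
        elif grad(x + alpha * d) @ d < sigma * gd:
            lo = alpha                   # curvature condition fails
            alpha = 2.0 * alpha if hi == np.inf else 0.5 * (lo + hi)
        else:
            return alpha                 # both Wolfe conditions hold
    return alpha

def dai_yuan_cg(f, grad, x0, tol=1e-8, max_iter=500):
    """Dai-Yuan conjugate gradient sketch:
    beta_k = ||f'(x_{k+1})||^2 / d_k^T (f'(x_{k+1}) - f'(x_k))."""
    delta, sigma = 1e-4, 0.9             # 0 < delta < sigma < 1
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = wolfe_line_search(f, grad, x, d, delta, sigma)
        x_new = x + alpha * d
        g_new = grad(x_new)
        denom = d @ (g_new - g)
        beta = (g_new @ g_new) / denom if denom != 0 else 0.0
        d = -g_new + beta * d            # new search direction
        x, g = x_new, g_new
    return x
```

On a strictly convex quadratic such as $f(x)=\tfrac12(x_1^2+10x_2^2)$ the iterates drive the gradient norm below the tolerance, consistent with the convergence result stated above.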
MSC 2000:
*65K05 Mathematical programming (numerical methods)
90C30 Nonlinear programming
90C53 Methods of quasi-Newton type

Keywords: unconstrained optimization; conjugate gradient method; Wolfe conditions; global convergence
