Published Online: 15 FEB 2011
Copyright © 2010 John Wiley & Sons, Inc. All rights reserved.
Wiley Encyclopedia of Operations Research and Management Science
How to Cite
Pajouh, F. M. and Balasundaram, B. 2011. Gradient-Type Methods. Wiley Encyclopedia of Operations Research and Management Science.
The gradient method, also called the method of steepest descent or the Cauchy method, is one of the most fundamental derivative-based procedures for the unconstrained minimization of a differentiable function. Its speed of convergence is poor in practice: the method tends to converge very slowly, especially as a stationary point is approached. However, it guarantees global convergence under reasonable conditions and admits a thorough mathematical analysis of its behavior. For these reasons, the gradient method has served as a starting point in the development of more sophisticated, globally convergent algorithms with better convergence properties for unconstrained minimization. This article presents a concise overview of this fundamental method and its convergence properties in various settings.
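The basic iteration described above can be sketched as follows. This is a minimal illustration, not the article's own implementation; the fixed step size, tolerance, and quadratic test function are assumptions chosen for demonstration.

```python
import numpy as np

def steepest_descent(grad, x0, step=0.05, tol=1e-8, max_iter=10000):
    """Gradient (steepest-descent) iteration: x_{k+1} = x_k - step * grad(x_k).

    A fixed step size is used here for simplicity; practical codes choose the
    step by a line search. Iteration stops when the gradient norm falls below
    `tol`, i.e., when an approximate stationary point is reached.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        x = x - step * g  # move in the direction of steepest descent
    return x

# Example: minimize f(x, y) = x^2 + 5*y^2, whose gradient is (2x, 10y)
# and whose unique minimizer is the origin.
x_star = steepest_descent(lambda x: np.array([2.0 * x[0], 10.0 * x[1]]),
                          x0=[3.0, 1.0])
```

Even on this well-behaved quadratic, the ill-conditioning between the two coordinates causes the slow, zig-zagging progress near the minimizer that the text alludes to.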
- unconstrained optimization;
- gradient method;
- method of steepest descent;
- Cauchy method;
- subgradient method