Standard Article

Gradient-Type Methods

  1. Foad Mahdavi Pajouh,
  2. Balabhaskar Balasundaram

Published Online: 15 FEB 2011

DOI: 10.1002/9780470400531.eorms0363

Wiley Encyclopedia of Operations Research and Management Science

How to Cite

Pajouh, F. M. and Balasundaram, B. 2011. Gradient-Type Methods. Wiley Encyclopedia of Operations Research and Management Science.

Author Information

  1. Oklahoma State University, School of Industrial Engineering and Management, Stillwater, Oklahoma

Publication History

  1. Published Online: 15 FEB 2011

Abstract

The gradient method, also called the method of steepest descent or the Cauchy method, is one of the most fundamental derivative-based procedures for unconstrained minimization of a differentiable function. Its speed of convergence is often poor, and it tends to converge very slowly as a stationary point is approached. However, it guarantees global convergence under reasonable conditions and admits a thorough mathematical analysis of its behavior. For these reasons, the gradient method has served as a starting point in the development of more sophisticated, globally convergent algorithms with better convergence properties for unconstrained minimization. This article presents a cogent overview of this fundamental method and its convergence properties under various settings.
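As a concrete illustration (not part of the encyclopedia entry itself), the basic iteration x_{k+1} = x_k − α∇f(x_k) can be sketched in a few lines of Python; the fixed step size, stopping tolerance, and test function below are illustrative choices, not prescriptions from the article:

```python
import math

def gradient_descent(grad, x0, step=0.1, tol=1e-8, max_iter=10000):
    # Steepest descent with a fixed step size: x_{k+1} = x_k - step * grad(x_k).
    # In practice the step is usually chosen by a line search rather than fixed.
    x = list(x0)
    for _ in range(max_iter):
        g = grad(x)
        if math.sqrt(sum(gi * gi for gi in g)) < tol:  # near-stationary: stop
            break
        x = [xi - step * gi for xi, gi in zip(x, g)]
    return x

# Illustrative problem: minimize f(x, y) = (x - 1)^2 + 4*(y + 2)^2,
# a convex quadratic whose unique minimizer is (1, -2).
grad_f = lambda x: [2 * (x[0] - 1), 8 * (x[1] + 2)]
x_star = gradient_descent(grad_f, [0.0, 0.0])
```

Even on this well-behaved quadratic, the iterates shrink the error only by a constant factor per step, which hints at the slow (linear) convergence rate discussed in the abstract.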

Keywords:

  • unconstrained optimization;
  • gradient method;
  • method of steepest descent;
  • Cauchy method;
  • subgradient method