Standard Article

Optimization and Nonlinear Equations

  1. Gordon K. Smyth

Published Online: 15 JUL 2005

DOI: 10.1002/0470011815.b2a14027

Encyclopedia of Biostatistics

How to Cite

Smyth, G. K. 2005. Optimization and Nonlinear Equations. Encyclopedia of Biostatistics. 6.

Author Information

  1. Walter and Eliza Hall Institute of Medical Research, Melbourne, Victoria, Australia


Optimization means finding the argument that minimizes or maximizes a given function, such as a likelihood function or a sum of squares. Many optimization algorithms are derived from algorithms for solving the nonlinear equations obtained by setting the derivative of the objective function equal to zero. This article discusses a number of methods for unconstrained optimization, including bisection and golden search in the univariate case and Newton's method and quasi-Newton algorithms in the multivariate case. Applications to maximum likelihood estimation, Fisher's method of scoring, nonlinear regression, and generalized linear models are described. Restricted step modifications are discussed for preventing divergence of Newton-type algorithms. A number of derivative-free methods are also discussed, including the Nelder–Mead simplex method, methods based on numeric derivatives, conjugate methods, and the EM algorithm.
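To illustrate the connection between optimization and nonlinear equations described above, the following is a minimal sketch (not from the article itself) of Newton's method applied to a univariate minimization problem: the stationary equation f'(x) = 0 is solved by iterating x ← x − f'(x)/f''(x). The function names, the example objective, and the tolerances are illustrative choices, not part of the source.

```python
def newton_minimize(grad, hess, x0, tol=1e-10, max_iter=100):
    """Newton's method for univariate optimization.

    Solves the nonlinear equation grad(x) = 0 (the stationary
    condition of the objective) via the update x <- x - grad(x)/hess(x).
    """
    x = x0
    for _ in range(max_iter):
        step = grad(x) / hess(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Example objective: f(x) = x^4 - 3x^3 + 2,
# so f'(x) = 4x^3 - 9x^2 and f''(x) = 12x^2 - 18x.
# The stationary point x = 9/4 = 2.25 has f''(2.25) > 0, a local minimum.
xmin = newton_minimize(lambda x: 4 * x**3 - 9 * x**2,
                       lambda x: 12 * x**2 - 18 * x,
                       x0=3.0)
```

As the abstract notes, an unmodified Newton iteration can diverge from a poor starting value (here, x0 = 0 would even divide by zero); restricted step modifications address exactly this.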


  • bisection;
  • golden search;
  • Newton's method;
  • quasi-Newton algorithm;
  • Gauss–Newton algorithm;
  • Fisher's method of scoring;
  • Nelder–Mead algorithm;
  • EM algorithm