Abstract

A maximum-likelihood fit of nonlinear, implicit, multiple-response models to data containing normally distributed random errors can be carried out by combining the Gauss-Newton generalized nonlinear least-squares algorithm, first described by Britt and Luecke in 1973, with a Fletcher-Reeves conjugate-gradient search for initial parameter estimates. The convergence of the algorithm is further improved by adding a step-limiting procedure that ensures a reduction in the objective function at each iteration. Multiple-equation regression methods appropriate to the solution of explicit fixed-regressor models are derived from this general treatment as special cases. These include weighted nonlinear least squares (where the covariance matrix of the responses is known) and uniformly weighted nonlinear least squares (where the responses are uncorrelated and characterized by a single common variance). Alternative methods for fixed-regressor fits of explicit multiple-equation models with an unknown covariance matrix of the responses are also considered. The moment-matrix determinant criterion appropriate in such situations is also efficiently minimized by the conjugate-gradient algorithm, which is considerably less sensitive to the accuracy of the initial parameter estimate than the more usual Gauss-Newton methods. The performance of the new algorithm for models defined by one, two, and three implicit functional constraints per point is illustrated by random-regressor fits of isothermal p−X and p−X−Y vapor–liquid equilibrium data, and ternary liquid–liquid equilibrium data, respectively.
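The two-stage strategy described above — a Fletcher-Reeves conjugate-gradient search to obtain a serviceable initial estimate, followed by Gauss-Newton refinement with a step-limiting (step-halving) safeguard that forces the objective to decrease at each iteration — can be illustrated with a minimal sketch. The toy model, data, starting point, and all function names below are illustrative assumptions, not taken from the paper; the example uses an explicit single-response model with uniform weighting rather than the implicit multiple-response formulation treated in the full algorithm.

```python
import numpy as np

# Illustrative explicit model y = a * exp(b * x) with noise-free synthetic data
# (parameters and data are assumptions for the sketch, not from the paper).
x = np.linspace(0.0, 2.0, 20)
true_theta = np.array([2.0, 0.5])
y = true_theta[0] * np.exp(true_theta[1] * x)

def residuals(theta):
    return y - theta[0] * np.exp(theta[1] * x)

def objective(theta):
    r = residuals(theta)
    return float(r @ r)            # uniformly weighted least-squares objective

def num_grad(f, theta, h=1e-6):
    """Central-difference gradient of a scalar function."""
    g = np.zeros_like(theta)
    for i in range(len(theta)):
        e = np.zeros_like(theta)
        e[i] = h
        g[i] = (f(theta + e) - f(theta - e)) / (2.0 * h)
    return g

def fletcher_reeves(f, theta, iters=30):
    """Fletcher-Reeves conjugate-gradient search with backtracking line search,
    used here only to produce a serviceable initial parameter estimate."""
    g = num_grad(f, theta)
    d = -g
    for _ in range(iters):
        alpha, f0 = 1.0, f(theta)
        while f(theta + alpha * d) > f0 and alpha > 1e-12:
            alpha *= 0.5           # backtrack until the objective decreases
        theta = theta + alpha * d
        g_new = num_grad(f, theta)
        beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves update
        d = -g_new + beta * d
        if d @ g_new >= 0.0:       # restart if the direction is not a descent one
            d = -g_new
        g = g_new
    return theta

def gauss_newton(theta, iters=20, h=1e-6):
    """Gauss-Newton refinement with step-halving, so every accepted step
    reduces the objective (the step-limiting safeguard)."""
    for _ in range(iters):
        r = residuals(theta)
        J = np.zeros((len(r), len(theta)))   # numerical Jacobian of residuals
        for i in range(len(theta)):
            e = np.zeros_like(theta)
            e[i] = h
            J[:, i] = (residuals(theta + e) - residuals(theta - e)) / (2.0 * h)
        step = np.linalg.lstsq(J, -r, rcond=None)[0]
        lam, f0 = 1.0, objective(theta)
        while objective(theta + lam * step) > f0 and lam > 1e-12:
            lam *= 0.5             # halve the step until the objective decreases
        theta = theta + lam * step
    return theta

theta0 = fletcher_reeves(objective, np.array([1.0, 0.0]))
theta_hat = gauss_newton(theta0)
```

The conjugate-gradient stage tolerates a poor starting point because it needs only gradient information, while the Gauss-Newton stage exploits the least-squares structure for rapid final convergence; the step-halving loop is what prevents a full Gauss-Newton step from overshooting and increasing the objective.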