See Also: Unconstrained Optimization, Nonlinear Least-Squares Problems
In some applications, it may be necessary to place the bound constraints \(l \leq x \leq u\) on the variables \(x\). The resulting problem can be solved with methods for bound-constrained problems, possibly modified to take advantage of the special Hessian approximations available for nonlinear least-squares problems. Active-set methods for handling the bounds form part of the capability of the DFNLP, IMSL, LANCELOT, NAG (FORTRAN), NAG (C), NLSSOL, PORT 3, and VE10 codes. An approach based on the gradient-projection method, which is better suited to large-scale applications, is used by the LANCELOT and VE10 codes.
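As a concrete illustration of the bound-constrained formulation (not of any particular code listed above), the sketch below solves a small problem with \(l \leq x \leq u\) using SciPy's least_squares routine, whose trust-region reflective method handles simple bounds; the residual function and the bounds themselves are illustrative assumptions.

    import numpy as np
    from scipy.optimize import least_squares

    # Illustrative residual vector f(x); any smooth f: R^n -> R^m will do.
    def residuals(x):
        return np.array([10.0 * (x[1] - x[0] ** 2), 1.0 - x[0]])

    lower = np.array([-0.5, -0.5])   # l
    upper = np.array([ 2.0,  2.0])   # u
    x0 = np.array([0.0, 0.0])        # starting point, inside the bounds

    # method="trf" (trust-region reflective) keeps every iterate in l <= x <= u.
    sol = least_squares(residuals, x0, bounds=(lower, upper), method="trf")
    print(sol.x, sol.cost)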
The PROC NLP code can be used to solve problems with general linear constraints. Its algorithms use active-set versions of the Levenberg-Marquardt method and of the hybrid strategy that combines the Gauss-Newton and BFGS quasi-Newton algorithms.
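For reference, a single Levenberg-Marquardt step has the form sketched below. The active-set versions mentioned above additionally restrict the step to the variables not fixed by the constraints, which is not shown here; the damping parameter lam is an illustrative input.

    import numpy as np

    def lm_step(J, f, lam):
        # Solve (J^T J + lam I) p = -J^T f for the Levenberg-Marquardt step p,
        # where J is the Jacobian f'(x) and f is the residual vector f(x).
        n = J.shape[1]
        return np.linalg.solve(J.T @ J + lam * np.eye(n), -J.T @ f)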
The DFNLP and NLSSOL codes can find minimizers of \(r\) subject to general nonlinear constraints. The NLSSOL code uses the same sequential quadratic programming strategy as the general nonlinear programming code NPSOL, but it makes use of the Jacobian matrix \(f^\prime(x)\) to compute a starting approximation to the Hessian of the Lagrangian for the constrained problem and to calculate the gradient \(\nabla r\). DFNLP also makes use of sequential quadratic programming techniques while exploiting the structure of \(r\) in its choice of approximate Hessian.
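A minimal sketch of the role the Jacobian plays, assuming \(r(x) = \tfrac{1}{2}\|f(x)\|^2\): the gradient is \(\nabla r(x) = f^\prime(x)^T f(x)\), and the Gauss-Newton matrix \(f^\prime(x)^T f^\prime(x)\) provides a natural starting Hessian approximation. (The Hessian of the Lagrangian used by NLSSOL would add constraint curvature terms not shown here.)

    import numpy as np

    def gradient_and_hessian_start(J, f):
        # J is the Jacobian f'(x); f is the residual vector f(x).
        grad_r = J.T @ f   # gradient of r(x) = (1/2) ||f(x)||^2
        B0 = J.T @ J       # Gauss-Newton starting Hessian approximation
        return grad_r, B0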