See Also: Unconstrained Optimization
The line search and trust-region techniques are suitable when the number of variables \(n\) is not too large, because the cost per iteration is of order \(n^3\). Practical algorithms for problems with a large number of variables instead use iterative techniques to obtain a direction \(d_k\) in a line-search method or a step \(s_k\) in a trust-region method. These techniques are usually called truncated Newton methods because the inner iteration is stopped (truncated) as soon as a termination criterion is satisfied.
For example, some algorithms use a line search method in which the direction \(d_k\) satisfies
\[\| \nabla^2 f(x_k)\, d_k + \nabla f(x_k) \| \leq \eta_k \| \nabla f(x_k) \|\]
for some forcing parameter \(\eta_k \in (0,1)\). The sequence \(\{\eta_k\}\) controls the accuracy of the inner solve; choosing \(\eta_k \to 0\) yields superlinear local convergence of the outer iteration.
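To make the truncation concrete, here is a minimal sketch (in Python with NumPy; the names truncated_newton_direction and hess_vec are illustrative, not from the source) of computing \(d_k\) by applying linear conjugate gradients to \(\nabla^2 f(x_k)\, d = -\nabla f(x_k)\) and stopping as soon as the test above holds. With a zero starting point, the CG residual equals \(\nabla^2 f(x_k) d + \nabla f(x_k)\), so the criterion is monitored at no extra cost; the inner loop is also truncated if negative curvature is detected.

```python
import numpy as np

def truncated_newton_direction(hess_vec, grad, eta=0.5, max_iter=None):
    """Approximately solve  H d = -g  by linear conjugate gradients,
    truncating once the residual test  ||H d + g|| <= eta * ||g||  holds.

    hess_vec -- callable v -> H @ v (Hessian-vector product; matrix-free)
    grad     -- gradient g = grad f(x_k)
    eta      -- forcing parameter eta_k in (0, 1)
    """
    n = grad.size
    max_iter = n if max_iter is None else max_iter
    d = np.zeros(n)      # inner iterate: the candidate direction
    r = -grad.copy()     # CG residual r = -g - H d, so ||r|| = ||H d + g||
    p = r.copy()         # CG search direction
    tol = eta * np.linalg.norm(grad)
    for _ in range(max_iter):
        Hp = hess_vec(p)
        pHp = p @ Hp
        if pHp <= 0.0:   # negative curvature detected: truncate here
            break
        alpha = (r @ r) / pHp
        d = d + alpha * p
        r_new = r - alpha * Hp
        if np.linalg.norm(r_new) <= tol:  # termination criterion met
            return d
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p
        r = r_new
    # If no CG step was accepted (e.g. immediate negative curvature),
    # fall back to the steepest-descent direction.
    return d if d.any() else -grad
```

In a matrix-free setting, hess_vec might be supplied, for example, by a finite-difference approximation of the Hessian-vector product, so the Hessian is never formed or factored explicitly.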
A similar idea can be used in the context of trust-region methods (see the sketch below). Conjugate gradient algorithms mesh well with truncated Newton methods because they require only Hessian-vector products, never the Hessian itself, and, when started from zero, every intermediate iterate is a descent direction. Preconditioning is usually needed to make the conjugate gradient iteration efficient and reliable; effective preconditioners include those based on the incomplete Cholesky factorization and symmetric successive overrelaxation.
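In the trust-region setting, the Steihaug-Toint variant of conjugate gradients is a standard way to realize the same truncation idea: the inner iteration is additionally stopped when the step reaches the trust-region boundary or when negative curvature is encountered, in which case the step is extended to the boundary. Below is a minimal unpreconditioned sketch (the names steihaug_cg_step and delta, the trust-region radius, are illustrative assumptions):

```python
import numpy as np

def steihaug_cg_step(hess_vec, grad, delta, eta=0.5, max_iter=None):
    """Approximately minimize the model m(s) = g^T s + 0.5 s^T H s
    subject to ||s|| <= delta, truncating the CG iteration at the
    boundary, at negative curvature, or when ||H s + g|| <= eta ||g||."""
    n = grad.size
    max_iter = n if max_iter is None else max_iter
    s = np.zeros(n)      # current step inside the trust region
    r = -grad.copy()     # residual r = -g - H s
    p = r.copy()         # CG search direction
    tol = eta * np.linalg.norm(grad)
    for _ in range(max_iter):
        Hp = hess_vec(p)
        pHp = p @ Hp
        if pHp <= 0.0:
            # Negative curvature: the model decreases without bound
            # along p, so move to the trust-region boundary.
            return s + _tau_to_boundary(s, p, delta) * p
        alpha = (r @ r) / pHp
        if np.linalg.norm(s + alpha * p) >= delta:
            # Full CG step leaves the region: stop on the boundary.
            return s + _tau_to_boundary(s, p, delta) * p
        s = s + alpha * p
        r_new = r - alpha * Hp
        if np.linalg.norm(r_new) <= tol:  # inexact-Newton residual test
            return s
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p
        r = r_new
    return s

def _tau_to_boundary(s, p, delta):
    """Positive root tau of ||s + tau * p|| = delta."""
    a = p @ p
    b = 2.0 * (s @ p)
    c = s @ s - delta ** 2
    return (-b + np.sqrt(b * b - 4.0 * a * c)) / (2.0 * a)
```

With preconditioning, the same scheme carries over, except that the trust-region constraint is then measured in the norm induced by the preconditioner.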