See Also: Unconstrained Optimization; Nonlinear Optimization
Tensor methods go a step beyond Newton’s method by augmenting the linear model with second-order information about \(f\): the quadratic term is a low-rank approximation constructed from function values at previous iterates, so no second derivatives need to be computed. For problems with a dense Jacobian matrix, the storage and the cost of the linear algebra operations therefore increase only marginally over Newton’s method. Tensor methods typically converge more rapidly than Newton’s method, particularly when the Jacobian is singular at the solution \(x^*\), where Newton’s method converges at best linearly.
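As a minimal illustration of the idea, the scalar case can be sketched in a few lines: the Newton model \(f(x) + f'(x)d\) is augmented with a quadratic term \(\tfrac{1}{2}ad^2\), where \(a\) is chosen so the model interpolates \(f\) at the previous iterate — the one-dimensional analogue of the low-rank tensor term of Schnabel and Frank. The function name and structure below are illustrative, not taken from any library.

```python
def tensor_solve_1d(f, fprime, x0, tol=1e-12, max_iter=50):
    """Scalar sketch of a tensor method (after Schnabel & Frank, 1984):
    each step solves the quadratic model
        m(d) = f(x) + f'(x)*d + 0.5*a*d**2,
    where `a` is fitted so that m interpolates f at the previous iterate.
    """
    x = x0
    # Bootstrap with one plain Newton step to obtain a previous iterate.
    x_prev = x
    x = x - f(x) / fprime(x)
    for _ in range(max_iter):
        fx, gx = f(x), fprime(x)
        if abs(fx) < tol:
            return x
        s = x_prev - x
        # Fit the second-order coefficient from the previous function
        # value: m(s) = f(x_prev)  =>  a = 2*(f(x_prev) - fx - gx*s)/s^2.
        a = 2.0 * (f(x_prev) - fx - gx * s) / s**2
        disc = gx**2 - 2.0 * a * fx
        if a != 0.0 and disc >= 0.0:
            # Take the root of m(d) = 0 of smallest magnitude.
            r = disc ** 0.5
            d = min((-gx + r) / a, (-gx - r) / a, key=abs)
        else:
            # No real root of the model: fall back to the Newton step.
            d = -fx / gx
        x_prev, x = x, x + d
    return x
```

On \(f(x) = x^2\), whose root \(x^* = 0\) has a singular derivative, a Newton iteration only halves the error at each step, while the quadratic model above matches \(f\) exactly and the tensor step lands on the root immediately after the bootstrap step.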
Schnabel, R. B. and Frank, P. D. 1984. Tensor methods for nonlinear equations. SIAM Journal on Numerical Analysis 21, 815–843.