# Optimizers

This chapter describes optimizers implemented in oneDAL.

# Newton-CG Optimizer

The Newton-CG optimizer iteratively minimizes a convex function using its gradient and a Hessian-product operator.

Computational methods: dense

## Mathematical Formulation

### Computing

The Newton-CG optimizer, also known as the Hessian-free optimizer, minimizes convex functions without explicitly computing the Hessian matrix; instead, it relies on a Hessian-product operator. In the Newton method, the descent direction is computed as $$d_k = -H_k^{-1} g_k$$, where $$g_k$$ and $$H_k$$ are the gradient and the Hessian matrix of the loss function at the $$k$$-th iteration. The Newton-CG method uses the Conjugate Gradient solver to find an approximate solution of the equation $$H_k d_k = -g_k$$. This solver solves a system of linear equations $$Ax = b$$ given the vector $$b$$ and a functor $$f(p) = Ap$$ as input, so the matrix $$A$$ itself never needs to be formed or stored.
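The scheme above can be sketched in plain Python. This is an illustrative toy, not the oneDAL implementation: the helper names (`conjugate_gradient`, `newton_cg`) and the unit step length are assumptions made for brevity. The key point it demonstrates is that the inner Conjugate Gradient loop solves $$H_k d_k = -g_k$$ using only the Hessian-product functor `hvp(p)`, never the Hessian matrix itself.

```python
def conjugate_gradient(hvp, b, tol=1e-10, max_iter=100):
    """Solve A x = b given only the operator hvp(p) = A p."""
    n = len(b)
    x = [0.0] * n
    r = b[:]                 # residual r = b - A x (x starts at zero)
    p = r[:]
    rs_old = sum(ri * ri for ri in r)
    for _ in range(max_iter):
        if rs_old < tol:     # residual small enough: done
            break
        ap = hvp(p)
        alpha = rs_old / sum(pi * api for pi, api in zip(p, ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, ap)]
        rs_new = sum(ri * ri for ri in r)
        p = [ri + (rs_new / rs_old) * pi for ri, pi in zip(r, p)]
        rs_old = rs_new
    return x

def newton_cg(grad, hvp, x0, n_iter=10):
    """Minimize a convex function given its gradient and a
    Hessian-product operator hvp(x, p) = H(x) p."""
    x = x0[:]
    for _ in range(n_iter):
        g = grad(x)
        # descent direction: approximate solution of H_k d = -g_k
        d = conjugate_gradient(lambda p: hvp(x, p), [-gi for gi in g])
        x = [xi + di for xi, di in zip(x, d)]  # unit step for simplicity
    return x

# Example: quadratic f(x) = 0.5 x^T A x - b^T x, so H = A and the
# minimizer is the solution of A x = b.
A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
grad = lambda x: [sum(A[i][j] * x[j] for j in range(2)) - b[i] for i in range(2)]
hvp = lambda x, p: [sum(A[i][j] * p[j] for j in range(2)) for i in range(2)]

x_min = newton_cg(grad, hvp, [0.0, 0.0])   # -> approximately [1/11, 7/11]
```

For this quadratic a single Newton step with an exact CG solve already reaches the minimizer; for a general convex loss the outer loop would also use a line search, which is omitted here to keep the sketch short.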

### Computation Method: dense

This method defines the dense Newton-CG optimizer used internally by other oneDAL algorithms for convex optimization; there is no separate computation mode for minimizing a function directly.

## Programming Interface

Refer to API Reference: Newton-CG optimizer.