Deep Neural Network Library (DNNL) 1.1.3
Performance library for Deep Learning
A primitive to perform batch normalization.

Classes

struct  dnnl::batch_normalization_forward
 Batch normalization forward propagation.
 
struct  dnnl::batch_normalization_backward
 Batch normalization backward propagation.
 

Detailed Description

A primitive to perform batch normalization.

Both the forward and backward passes support in-place operation; that is, src and dst can point to the same memory for the forward pass, and diff_dst and diff_src can point to the same memory for the backward pass.
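
For illustration, the following is a minimal sketch of an in-place forward call, assuming the DNNL 1.x C++ API from dnnl.hpp. It uses the inference flavor with user-provided statistics (use_global_stats); the tensor shape, epsilon value, and uninitialized memory buffers are illustrative only.

    #include "dnnl.hpp"
    using namespace dnnl;

    int main() {
        engine eng(engine::kind::cpu, 0);
        stream s(eng);

        // Illustrative 4D activation tensor in NCHW layout.
        memory::desc data_md({2, 16, 8, 8}, memory::data_type::f32,
                memory::format_tag::nchw);

        // Inference flavor: mean and variance are provided by the user.
        batch_normalization_forward::desc bnorm_d(prop_kind::forward_inference,
                data_md, /* epsilon = */ 1.e-5f,
                normalization_flags::use_global_stats);
        batch_normalization_forward::primitive_desc bnorm_pd(bnorm_d, eng);

        memory data_mem(data_md, eng); // would hold real activations
        memory mean_mem(bnorm_pd.mean_desc(), eng);
        memory variance_mem(bnorm_pd.variance_desc(), eng);

        // In-place operation: the same memory object is bound to both
        // DNNL_ARG_SRC and DNNL_ARG_DST.
        batch_normalization_forward(bnorm_pd).execute(s,
                {{DNNL_ARG_SRC, data_mem}, {DNNL_ARG_DST, data_mem},
                 {DNNL_ARG_MEAN, mean_mem}, {DNNL_ARG_VARIANCE, variance_mem}});
        s.wait();
        return 0;
    }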

Batch normalization supports several flavors, controlled by dnnl_batch_normalization_desc_t. For example, batch normalization can compute the mean and variance on its own or take them as inputs; it can either perform scaling and shifting using the gamma and beta parameters or skip them; and it can optionally perform a fused ReLU, which in the case of training also requires a workspace.
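
The sketch below, again assuming the DNNL 1.x C++ API from dnnl.hpp, sets up the training flavor: the primitive computes the mean and variance itself, applies scale and shift (gamma and beta), and fuses a ReLU, which in turn requires a workspace. The shape and epsilon are illustrative only.

    #include "dnnl.hpp"
    using namespace dnnl;

    int main() {
        engine eng(engine::kind::cpu, 0);
        stream s(eng);

        memory::desc data_md({2, 16, 8, 8}, memory::data_type::f32,
                memory::format_tag::nchw);

        // Training flavor: statistics are computed by the primitive,
        // gamma/beta are applied, and a ReLU is fused.
        auto flags = normalization_flags::use_scale_shift
                | normalization_flags::fuse_norm_relu;
        batch_normalization_forward::desc bnorm_d(
                prop_kind::forward_training, data_md,
                /* epsilon = */ 1.e-5f, flags);
        batch_normalization_forward::primitive_desc bnorm_pd(bnorm_d, eng);

        memory src_mem(data_md, eng);
        memory dst_mem(data_md, eng);
        memory scale_shift_mem(bnorm_pd.weights_desc(), eng); // gamma and beta
        memory mean_mem(bnorm_pd.mean_desc(), eng);           // output: batch mean
        memory variance_mem(bnorm_pd.variance_desc(), eng);   // output: batch variance
        memory ws_mem(bnorm_pd.workspace_desc(), eng);        // required by fuse_norm_relu

        batch_normalization_forward(bnorm_pd).execute(s,
                {{DNNL_ARG_SRC, src_mem}, {DNNL_ARG_DST, dst_mem},
                 {DNNL_ARG_SCALE_SHIFT, scale_shift_mem},
                 {DNNL_ARG_MEAN, mean_mem}, {DNNL_ARG_VARIANCE, variance_mem},
                 {DNNL_ARG_WORKSPACE, ws_mem}});
        s.wait();
        return 0;
    }

In a real application, the computed mean, variance, and workspace would then be passed on to dnnl::batch_normalization_backward; they are left untouched in this sketch.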

See also
Batch Normalization in the developer guide
Batch Normalization in the C API