Deep Neural Network Library (DNNL)  1.1.3
Performance library for Deep Learning

A primitive to perform layer normalization. More...

Classes

struct  dnnl::layer_normalization_forward
 Layer normalization forward propagation primitive. More...
 
struct  dnnl::layer_normalization_backward
 Layer normalization backward propagation primitive. More...
 

Detailed Description

A primitive to perform layer normalization.

Normalization is performed over the last logical axis of the data tensor.

Both forward and backward passes support in-place operation; that is, src and dst can point to the same memory for the forward pass, and diff_dst and diff_src can point to the same memory for the backward pass.

Layer normalization supports different flavors controlled by dnnl_layer_normalization_desc_t. For example, layer normalization can compute the mean and variance on its own or take them as inputs. It can either perform scaling and shifting using the gamma and beta parameters or skip that step. Optionally, it can also perform a fused ReLU, which in the case of training additionally requires a workspace.
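To make the semantics above concrete, here is a minimal reference sketch in Python of what the primitive computes: per-row mean and variance over the last axis, an optional gamma/beta scale-shift, and an optional fused ReLU. The function name and signature are hypothetical illustrations of the math only, not the DNNL API.

```python
import math

def layer_norm(x, gamma=None, beta=None, eps=1e-5, fuse_relu=False):
    """Reference layer normalization over the last axis of a 2-D list.

    Illustrative sketch only; mirrors the described semantics
    (per-row statistics, optional scale-shift, optional fused ReLU),
    not the DNNL primitive interface.
    """
    out = []
    for row in x:
        n = len(row)
        # Mean and variance computed over the last logical axis.
        mean = sum(row) / n
        var = sum((v - mean) ** 2 for v in row) / n
        inv_std = 1.0 / math.sqrt(var + eps)
        norm = [(v - mean) * inv_std for v in row]
        # Optional scaling (gamma) and shifting (beta).
        if gamma is not None and beta is not None:
            norm = [g * v + b for g, v, b in zip(gamma, norm, beta)]
        # Optional fused ReLU post-op.
        if fuse_relu:
            norm = [max(0.0, v) for v in norm]
        out.append(norm)
    return out
```

In the actual primitive, whether the statistics are computed internally or supplied as inputs, and whether gamma/beta are used, is selected via the flags in the operation descriptor.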

See also
Layer Normalization in developer guide
Layer Normalization in C API