.. index:: pair: group; Layer Normalization
.. _doxid-group__dnnl__api__layer__normalization:

Layer Normalization
===================

.. toctree::
    :hidden:

    struct_dnnl_layer_normalization_desc_t.rst
    struct_dnnl_layer_normalization_backward.rst
    struct_dnnl_layer_normalization_forward.rst

Overview
~~~~~~~~

A primitive to perform layer normalization. :ref:`More...<details-group__dnnl__api__layer__normalization>`

.. ref-code-block:: cpp
    :class: doxyrest-overview-code-block

    // structs

    struct :ref:`dnnl_layer_normalization_desc_t`;
    struct :ref:`dnnl::layer_normalization_backward`;
    struct :ref:`dnnl::layer_normalization_forward`;

    // global functions

    :ref:`dnnl_status_t` DNNL_API :ref:`dnnl_layer_normalization_forward_desc_init<doxid-group__dnnl__api__layer__normalization_1gafc38935e49742897454f5009ea9a10f1>`(
        :ref:`dnnl_layer_normalization_desc_t`* lnrm_desc,
        :ref:`dnnl_prop_kind_t` prop_kind,
        const :ref:`dnnl_memory_desc_t`* data_desc,
        const :ref:`dnnl_memory_desc_t`* stat_desc,
        float epsilon,
        unsigned flags
        );

    :ref:`dnnl_status_t` DNNL_API :ref:`dnnl_layer_normalization_backward_desc_init<doxid-group__dnnl__api__layer__normalization_1gaca22d67e529c86a61fbfa1571d19ca4f>`(
        :ref:`dnnl_layer_normalization_desc_t`* lnrm_desc,
        :ref:`dnnl_prop_kind_t` prop_kind,
        const :ref:`dnnl_memory_desc_t`* diff_data_desc,
        const :ref:`dnnl_memory_desc_t`* data_desc,
        const :ref:`dnnl_memory_desc_t`* stat_desc,
        float epsilon,
        unsigned flags
        );

.. _details-group__dnnl__api__layer__normalization:

Detailed Documentation
~~~~~~~~~~~~~~~~~~~~~~

A primitive to perform layer normalization. Normalization is performed within the last logical dimension of the data tensor.

Both forward and backward propagation primitives support in-place operation; that is, src and dst can refer to the same memory for forward propagation, and diff_dst and diff_src can refer to the same memory for backward propagation.

The layer normalization primitive's computations can be controlled by specifying different :ref:`dnnl::normalization_flags` values. For example, layer normalization forward propagation can be configured to either compute the mean and variance or take them as arguments. It can optionally perform scaling and shifting using the gamma and beta parameters. It can also perform a fused ReLU, which in the case of training additionally requires a workspace.

.. rubric:: See also:

:ref:`Layer Normalization` in developer guide

Global Functions
----------------

.. index:: pair: function; dnnl_layer_normalization_forward_desc_init
.. _doxid-group__dnnl__api__layer__normalization_1gafc38935e49742897454f5009ea9a10f1:

.. ref-code-block:: cpp
    :class: doxyrest-title-code-block

    :ref:`dnnl_status_t` DNNL_API dnnl_layer_normalization_forward_desc_init(
        :ref:`dnnl_layer_normalization_desc_t`* lnrm_desc,
        :ref:`dnnl_prop_kind_t` prop_kind,
        const :ref:`dnnl_memory_desc_t`* data_desc,
        const :ref:`dnnl_memory_desc_t`* stat_desc,
        float epsilon,
        unsigned flags
        )

Initializes a descriptor for a layer normalization forward propagation primitive.

.. note::

    In-place operation is supported: the dst can refer to the same memory as the src.

.. rubric:: Parameters:

.. list-table::
    :widths: 20 80

    * - lnrm_desc
      - Output descriptor for the layer normalization primitive.
    * - prop_kind
      - Propagation kind. Possible values are :ref:`dnnl_forward_training` and :ref:`dnnl_forward_inference`.
    * - data_desc
      - Source and destination memory descriptor.
    * - stat_desc
      - Memory descriptor for mean and variance. If this parameter is NULL, a zero memory descriptor, or a memory descriptor with format_kind set to :ref:`dnnl_format_kind_undef`, then the memory descriptor for stats is derived from ``data_desc`` by removing the last dimension.
    * - epsilon
      - Layer normalization epsilon parameter.
    * - flags
      - Layer normalization flags (:ref:`dnnl_normalization_flags_t`).

.. rubric:: Returns:

:ref:`dnnl_success` on success and a status describing the error otherwise.
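The snippet below is a minimal usage sketch, not part of the generated reference. The wrapper function name, the {T, N, C} shape of {128, 32, 256}, the epsilon of 1e-5f, and the choice of no extra flags are illustrative assumptions; ``dnnl_memory_desc_init_by_tag``, ``dnnl_f32``, ``dnnl_tnc``, ``dnnl_forward_training``, and ``dnnl_normalization_flags_none`` are declared elsewhere in ``dnnl.h``.

.. ref-code-block:: cpp

    #include "dnnl.h"
    #include <stddef.h>

    /* Sketch: initialize a forward layer normalization descriptor.
       Shape, epsilon, and flags are illustrative assumptions. */
    int init_lnorm_fwd_desc(dnnl_layer_normalization_desc_t *lnorm_desc) {
        dnnl_memory_desc_t data_md;
        dnnl_dims_t dims = {128, 32, 256}; /* {T, N, C}; normalization runs over C */

        if (dnnl_memory_desc_init_by_tag(
                    &data_md, 3, dims, dnnl_f32, dnnl_tnc)
                != dnnl_success)
            return 1;

        /* stat_desc == NULL: the mean/variance descriptor is derived from
           data_md by removing the last dimension, as documented above. */
        dnnl_status_t st = dnnl_layer_normalization_forward_desc_init(
                lnorm_desc, dnnl_forward_training, &data_md,
                /*stat_desc=*/NULL, /*epsilon=*/1e-5f,
                dnnl_normalization_flags_none);
        return st == dnnl_success ? 0 : 1;
    }

The resulting descriptor would typically be passed on to primitive descriptor creation; other :ref:`dnnl_normalization_flags_t` values can be OR-ed into the last argument to request global statistics, scale/shift, or a fused ReLU.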
.. index:: pair: function; dnnl_layer_normalization_backward_desc_init
.. _doxid-group__dnnl__api__layer__normalization_1gaca22d67e529c86a61fbfa1571d19ca4f:

.. ref-code-block:: cpp
    :class: doxyrest-title-code-block

    :ref:`dnnl_status_t` DNNL_API dnnl_layer_normalization_backward_desc_init(
        :ref:`dnnl_layer_normalization_desc_t`* lnrm_desc,
        :ref:`dnnl_prop_kind_t` prop_kind,
        const :ref:`dnnl_memory_desc_t`* diff_data_desc,
        const :ref:`dnnl_memory_desc_t`* data_desc,
        const :ref:`dnnl_memory_desc_t`* stat_desc,
        float epsilon,
        unsigned flags
        )

Initializes a descriptor for a layer normalization backward propagation primitive.

.. note::

    In-place operation is supported: the diff_dst can refer to the same memory as the diff_src.

.. rubric:: Parameters:

.. list-table::
    :widths: 20 80

    * - lnrm_desc
      - Output descriptor for the layer normalization primitive.
    * - prop_kind
      - Propagation kind. Possible values are :ref:`dnnl_backward_data` and :ref:`dnnl_backward` (diffs for all parameters are computed in this case).
    * - diff_data_desc
      - Diff source and diff destination memory descriptor.
    * - data_desc
      - Source memory descriptor.
    * - stat_desc
      - Memory descriptor for mean and variance. If this parameter is NULL, a zero memory descriptor, or a memory descriptor with format_kind set to :ref:`dnnl_format_kind_undef`, then the memory descriptor for stats is derived from ``data_desc`` by removing the last dimension.
    * - epsilon
      - Layer normalization epsilon parameter.
    * - flags
      - Layer normalization flags (:ref:`dnnl_normalization_flags_t`).

.. rubric:: Returns:

:ref:`dnnl_success` on success and a status describing the error otherwise.
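As with the forward case, the following is an illustrative sketch rather than part of the generated reference; it reuses the same hypothetical {128, 32, 256} ``dnnl_tnc`` layout for both the source and the diff tensors. ``dnnl_backward_data`` requests only diff_src; passing ``dnnl_backward`` instead also produces diffs for gamma and beta when the corresponding flags are set.

.. ref-code-block:: cpp

    #include "dnnl.h"
    #include <stddef.h>

    /* Sketch: initialize a backward layer normalization descriptor.
       Shape, epsilon, and flags are illustrative assumptions. */
    int init_lnorm_bwd_desc(dnnl_layer_normalization_desc_t *lnorm_desc) {
        dnnl_memory_desc_t data_md, diff_md;
        dnnl_dims_t dims = {128, 32, 256}; /* {T, N, C} */

        if (dnnl_memory_desc_init_by_tag(
                    &data_md, 3, dims, dnnl_f32, dnnl_tnc)
                != dnnl_success)
            return 1;
        if (dnnl_memory_desc_init_by_tag(
                    &diff_md, 3, dims, dnnl_f32, dnnl_tnc)
                != dnnl_success)
            return 1;

        /* stat_desc == NULL again derives the mean/variance descriptor
           from data_md by removing the last dimension. */
        dnnl_status_t st = dnnl_layer_normalization_backward_desc_init(
                lnorm_desc, dnnl_backward_data, &diff_md, &data_md,
                /*stat_desc=*/NULL, /*epsilon=*/1e-5f,
                dnnl_normalization_flags_none);
        return st == dnnl_success ? 0 : 1;
    }

When the backward primitive descriptor is created from this descriptor, it is typically given the forward primitive descriptor as a hint so that matching memory formats can be chosen.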