enum dnnl::normalization_flags
Overview
Flags for normalization primitives.
#include <dnnl.hpp>

enum normalization_flags
{
    none               = dnnl_normalization_flags_none,
    use_global_stats   = dnnl_use_global_stats,
    use_scale          = dnnl_use_scale,
    use_shift          = dnnl_use_shift,
    fuse_norm_relu     = dnnl_fuse_norm_relu,
    fuse_norm_add_relu = dnnl_fuse_norm_add_relu,
};
Detailed Documentation
Flags for normalization primitives.
Enum Values
none
Use no normalization flags.
When no flags are specified, the library computes mean and variance on forward propagation for training and inference, outputs them on forward propagation for training, and computes the respective derivatives on backward propagation.
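As an illustration, here is a minimal sketch of this default behavior, assuming the oneDNN v3.x C++ API and picking batch normalization as the normalization primitive (the flags apply to the other normalization primitives as well): with normalization_flags::none on forward training, mean and variance are outputs of the primitive.

#include <dnnl.hpp>
using namespace dnnl;

int main() {
    engine eng(engine::kind::cpu, 0);
    stream strm(eng);

    // A small NCHW tensor; all sizes here are arbitrary.
    auto data_md = memory::desc({2, 16, 8, 8},
            memory::data_type::f32, memory::format_tag::nchw);

    // No flags: the library computes the statistics itself.
    auto pd = batch_normalization_forward::primitive_desc(eng,
            prop_kind::forward_training, data_md, data_md,
            /*epsilon=*/1e-5f, normalization_flags::none);

    memory src(data_md, eng), dst(data_md, eng);
    memory mean(pd.mean_desc(), eng), var(pd.variance_desc(), eng);

    // On forward training, mean and variance are outputs.
    batch_normalization_forward(pd).execute(strm,
            {{DNNL_ARG_SRC, src}, {DNNL_ARG_DST, dst},
             {DNNL_ARG_MEAN, mean}, {DNNL_ARG_VARIANCE, var}});
    strm.wait();
    return 0;
}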
use_global_stats
Use global statistics.
If specified, the library uses mean and variance provided by the user as an input on forward propagation and does not compute their derivatives on backward propagation. Otherwise, the library computes mean and variance on forward propagation for training and inference, outputs them on forward propagation for training, and computes the respective derivatives on backward propagation.
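A sketch of the inference-style usage this flag enables, under the same assumptions as above (oneDNN v3.x C++ API, batch normalization): the user supplies precomputed mean and variance as inputs.

#include <dnnl.hpp>
using namespace dnnl;

int main() {
    engine eng(engine::kind::cpu, 0);
    stream strm(eng);

    auto data_md = memory::desc({2, 16, 8, 8},
            memory::data_type::f32, memory::format_tag::nchw);

    // use_global_stats: mean and variance become *inputs*.
    auto pd = batch_normalization_forward::primitive_desc(eng,
            prop_kind::forward_inference, data_md, data_md,
            /*epsilon=*/1e-5f, normalization_flags::use_global_stats);

    memory src(data_md, eng), dst(data_md, eng);
    memory mean(pd.mean_desc(), eng), var(pd.variance_desc(), eng);
    // ... fill mean and var with previously computed statistics ...

    batch_normalization_forward(pd).execute(strm,
            {{DNNL_ARG_SRC, src}, {DNNL_ARG_DST, dst},
             {DNNL_ARG_MEAN, mean}, {DNNL_ARG_VARIANCE, var}});
    strm.wait();
    return 0;
}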
use_scale
Use scale parameter.
If specified, the user is expected to pass the scale as an input on forward propagation. On backward propagation of type dnnl::prop_kind::backward, the library computes its derivative. (A combined scale/shift sketch follows the use_shift entry below.)
use_shift
Use shift parameter.
If specified, the user is expected to pass the shift as an input on forward propagation. On backward propagation of type dnnl::prop_kind::backward, the library computes its derivative.
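Since use_scale and use_shift are typically requested together, here is one combined sketch (again assuming the oneDNN v3.x C++ API and batch normalization). The flags are bitmasks, so they can be OR-ed.

#include <dnnl.hpp>
using namespace dnnl;

int main() {
    engine eng(engine::kind::cpu, 0);
    stream strm(eng);

    const memory::dim C = 16;
    auto data_md = memory::desc({2, C, 8, 8},
            memory::data_type::f32, memory::format_tag::nchw);
    // Scale and shift are 1D tensors with one element per channel.
    auto param_md = memory::desc({C},
            memory::data_type::f32, memory::format_tag::a);

    auto pd = batch_normalization_forward::primitive_desc(eng,
            prop_kind::forward_training, data_md, data_md,
            /*epsilon=*/1e-5f,
            normalization_flags::use_scale | normalization_flags::use_shift);

    memory src(data_md, eng), dst(data_md, eng);
    memory scale(param_md, eng), shift(param_md, eng);            // inputs
    memory mean(pd.mean_desc(), eng), var(pd.variance_desc(), eng); // outputs

    batch_normalization_forward(pd).execute(strm,
            {{DNNL_ARG_SRC, src}, {DNNL_ARG_DST, dst},
             {DNNL_ARG_SCALE, scale}, {DNNL_ARG_SHIFT, shift},
             {DNNL_ARG_MEAN, mean}, {DNNL_ARG_VARIANCE, var}});
    strm.wait();
    return 0;
}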
fuse_norm_relu
Fuse normalization with ReLU.
During training, the normalization primitive requires a workspace in order to implement backward propagation. During inference, the workspace is not required, and the behavior is the same as when normalization is fused with ReLU via the post-ops API.
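A training-mode sketch with the fused ReLU (same assumptions as the previous examples): the workspace that backward propagation needs is queried from the primitive descriptor and passed at execution time.

#include <dnnl.hpp>
using namespace dnnl;

int main() {
    engine eng(engine::kind::cpu, 0);
    stream strm(eng);

    auto data_md = memory::desc({2, 16, 8, 8},
            memory::data_type::f32, memory::format_tag::nchw);

    auto pd = batch_normalization_forward::primitive_desc(eng,
            prop_kind::forward_training, data_md, data_md,
            /*epsilon=*/1e-5f, normalization_flags::fuse_norm_relu);

    memory src(data_md, eng), dst(data_md, eng);
    memory mean(pd.mean_desc(), eng), var(pd.variance_desc(), eng);
    // Training requires a workspace; the same memory must later be
    // passed to the corresponding backward primitive.
    memory ws(pd.workspace_desc(), eng);

    batch_normalization_forward(pd).execute(strm,
            {{DNNL_ARG_SRC, src}, {DNNL_ARG_DST, dst},
             {DNNL_ARG_MEAN, mean}, {DNNL_ARG_VARIANCE, var},
             {DNNL_ARG_WORKSPACE, ws}});
    strm.wait();
    return 0;
}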
fuse_norm_add_relu
Fuse normalization with an elementwise binary Add and then with ReLU.
During training, the normalization primitive requires a workspace in order to implement backward propagation. During inference, the workspace is not required.
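An inference-mode sketch under the same assumptions (oneDNN v3.x C++ API, batch normalization); it additionally assumes that the second addend of the fused Add is bound as DNNL_ARG_SRC_1. No workspace is needed in this mode.

#include <dnnl.hpp>
using namespace dnnl;

int main() {
    engine eng(engine::kind::cpu, 0);
    stream strm(eng);

    auto data_md = memory::desc({2, 16, 8, 8},
            memory::data_type::f32, memory::format_tag::nchw);

    auto pd = batch_normalization_forward::primitive_desc(eng,
            prop_kind::forward_inference, data_md, data_md,
            /*epsilon=*/1e-5f, normalization_flags::fuse_norm_add_relu);

    memory src(data_md, eng), dst(data_md, eng);
    // Assumption: the tensor added before the ReLU is passed as SRC_1
    // and has the same shape as the destination.
    memory src_add(data_md, eng);

    batch_normalization_forward(pd).execute(strm,
            {{DNNL_ARG_SRC, src}, {DNNL_ARG_SRC_1, src_add},
             {DNNL_ARG_DST, dst}});
    strm.wait();
    return 0;
}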