enum dnnl::algorithm

Overview

Kinds of algorithms.

#include <dnnl.hpp>

enum algorithm
{
    undef                            = dnnl_alg_kind_undef,
    convolution_auto                 = dnnl_convolution_auto,
    convolution_direct               = dnnl_convolution_direct,
    convolution_winograd             = dnnl_convolution_winograd,
    deconvolution_direct             = dnnl_deconvolution_direct,
    deconvolution_winograd           = dnnl_deconvolution_winograd,
    eltwise_relu                     = dnnl_eltwise_relu,
    eltwise_tanh                     = dnnl_eltwise_tanh,
    eltwise_elu                      = dnnl_eltwise_elu,
    eltwise_square                   = dnnl_eltwise_square,
    eltwise_abs                      = dnnl_eltwise_abs,
    eltwise_sqrt                     = dnnl_eltwise_sqrt,
    eltwise_swish                    = dnnl_eltwise_swish,
    eltwise_linear                   = dnnl_eltwise_linear,
    eltwise_bounded_relu             = dnnl_eltwise_bounded_relu,
    eltwise_soft_relu                = dnnl_eltwise_soft_relu,
    eltwise_logsigmoid               = dnnl_eltwise_logsigmoid,
    eltwise_mish                     = dnnl_eltwise_mish,
    eltwise_logistic                 = dnnl_eltwise_logistic,
    eltwise_exp                      = dnnl_eltwise_exp,
    eltwise_gelu                     = dnnl_eltwise_gelu,
    eltwise_gelu_tanh                = dnnl_eltwise_gelu_tanh,
    eltwise_gelu_erf                 = dnnl_eltwise_gelu_erf,
    eltwise_log                      = dnnl_eltwise_log,
    eltwise_clip                     = dnnl_eltwise_clip,
    eltwise_clip_v2                  = dnnl_eltwise_clip_v2,
    eltwise_pow                      = dnnl_eltwise_pow,
    eltwise_round                    = dnnl_eltwise_round,
    eltwise_hardswish                = dnnl_eltwise_hardswish,
    eltwise_relu_use_dst_for_bwd     = dnnl_eltwise_relu_use_dst_for_bwd,
    eltwise_tanh_use_dst_for_bwd     = dnnl_eltwise_tanh_use_dst_for_bwd,
    eltwise_elu_use_dst_for_bwd      = dnnl_eltwise_elu_use_dst_for_bwd,
    eltwise_sqrt_use_dst_for_bwd     = dnnl_eltwise_sqrt_use_dst_for_bwd,
    eltwise_logistic_use_dst_for_bwd = dnnl_eltwise_logistic_use_dst_for_bwd,
    eltwise_exp_use_dst_for_bwd      = dnnl_eltwise_exp_use_dst_for_bwd,
    eltwise_clip_v2_use_dst_for_bwd  = dnnl_eltwise_clip_v2_use_dst_for_bwd,
    lrn_across_channels              = dnnl_lrn_across_channels,
    lrn_within_channel               = dnnl_lrn_within_channel,
    pooling_max                      = dnnl_pooling_max,
    pooling_avg                      = dnnl_pooling_avg,
    pooling_avg_include_padding      = dnnl_pooling_avg_include_padding,
    pooling_avg_exclude_padding      = dnnl_pooling_avg_exclude_padding,
    vanilla_rnn                      = dnnl_vanilla_rnn,
    vanilla_lstm                     = dnnl_vanilla_lstm,
    vanilla_gru                      = dnnl_vanilla_gru,
    lbr_gru                          = dnnl_lbr_gru,
    binary_add                       = dnnl_binary_add,
    binary_mul                       = dnnl_binary_mul,
    binary_max                       = dnnl_binary_max,
    binary_min                       = dnnl_binary_min,
    binary_div                       = dnnl_binary_div,
    binary_sub                       = dnnl_binary_sub,
    binary_ge                        = dnnl_binary_ge,
    binary_gt                        = dnnl_binary_gt,
    binary_le                        = dnnl_binary_le,
    binary_lt                        = dnnl_binary_lt,
    binary_eq                        = dnnl_binary_eq,
    binary_ne                        = dnnl_binary_ne,
    resampling_nearest               = dnnl_resampling_nearest,
    resampling_linear                = dnnl_resampling_linear,
    reduction_max                    = dnnl_reduction_max,
    reduction_min                    = dnnl_reduction_min,
    reduction_sum                    = dnnl_reduction_sum,
    reduction_mul                    = dnnl_reduction_mul,
    reduction_mean                   = dnnl_reduction_mean,
    reduction_norm_lp_max            = dnnl_reduction_norm_lp_max,
    reduction_norm_lp_sum            = dnnl_reduction_norm_lp_sum,
    reduction_norm_lp_power_p_max    = dnnl_reduction_norm_lp_power_p_max,
    reduction_norm_lp_power_p_sum    = dnnl_reduction_norm_lp_power_p_sum,
};

Detailed Documentation

Kinds of algorithms.

Enum Values

undef

Undefined algorithm.

convolution_auto

Convolution algorithm that is chosen to be either direct or Winograd automatically.
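
A minimal sketch of how an algorithm kind selects the implementation when a primitive descriptor is created, assuming the oneDNN 2.x C++ API (operation descriptor followed by primitive descriptor); the tensor shapes are hypothetical.

#include <dnnl.hpp>

int main() {
    dnnl::engine eng(dnnl::engine::kind::cpu, 0);

    // Hypothetical NCHW/OIHW shapes; format_tag::any lets the library pick layouts.
    dnnl::memory::desc src_md({1, 32, 28, 28}, dnnl::memory::data_type::f32,
            dnnl::memory::format_tag::any);
    dnnl::memory::desc wei_md({64, 32, 3, 3}, dnnl::memory::data_type::f32,
            dnnl::memory::format_tag::any);
    dnnl::memory::desc dst_md({1, 64, 28, 28}, dnnl::memory::data_type::f32,
            dnnl::memory::format_tag::any);

    // convolution_auto lets the implementation choose direct vs. Winograd.
    dnnl::convolution_forward::desc conv_d(dnnl::prop_kind::forward_inference,
            dnnl::algorithm::convolution_auto, src_md, wei_md, dst_md,
            /*strides=*/{1, 1}, /*padding_l=*/{1, 1}, /*padding_r=*/{1, 1});
    dnnl::convolution_forward::primitive_desc conv_pd(conv_d, eng);

    // conv_pd.impl_info_str() reports which implementation was selected.
    return 0;
}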

convolution_direct

Direct convolution.

convolution_winograd

Winograd convolution.

deconvolution_direct

Direct deconvolution.

deconvolution_winograd

Winograd deconvolution.

eltwise_relu

Elementwise: rectified linear unit (ReLU)
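
A sketch of an elementwise primitive using eltwise_relu, assuming the oneDNN 2.x eltwise_forward API; alpha is the negative slope (0 gives plain ReLU) and the shape is hypothetical.

#include <dnnl.hpp>

int main() {
    dnnl::engine eng(dnnl::engine::kind::cpu, 0);
    dnnl::stream s(eng);

    dnnl::memory::desc md({1, 16, 8, 8}, dnnl::memory::data_type::f32,
            dnnl::memory::format_tag::nchw);
    dnnl::memory src(md, eng), dst(md, eng);

    // alpha = negative slope for ReLU; beta is unused by this algorithm.
    dnnl::eltwise_forward::desc relu_d(dnnl::prop_kind::forward_inference,
            dnnl::algorithm::eltwise_relu, md, /*alpha=*/0.f, /*beta=*/0.f);
    dnnl::eltwise_forward::primitive_desc relu_pd(relu_d, eng);

    dnnl::eltwise_forward(relu_pd).execute(
            s, {{DNNL_ARG_SRC, src}, {DNNL_ARG_DST, dst}});
    s.wait();
    return 0;
}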

eltwise_tanh

Elementwise: hyperbolic tangent non-linearity (tanh)

eltwise_elu

Elementwise: exponential linear unit (ELU)

eltwise_square

Elementwise: square.

eltwise_abs

Elementwise: abs.

eltwise_sqrt

Elementwise: square root.

eltwise_swish

Elementwise: swish (\(x \cdot \mathrm{sigmoid}(\alpha \cdot x)\))

eltwise_linear

Elementwise: linear.

eltwise_bounded_relu

Elementwise: bounded_relu.

eltwise_soft_relu

Elementwise: soft_relu.

eltwise_logsigmoid

Elementwise: logsigmoid.

eltwise_mish

Elementwise: mish.

eltwise_logistic

Elementwise: logistic.

eltwise_exp

Elementwise: exponent.

eltwise_gelu

Elementwise: gelu (alias for dnnl::algorithm::eltwise_gelu_tanh).

eltwise_gelu_tanh

Elementwise: tanh-based gelu.

eltwise_gelu_erf

Elementwise: erf-based gelu.
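
As a hedged reference (these are the commonly used definitions, not text reproduced from this page), the two gelu variants compute:

\[ \mathrm{gelu\_tanh}(x) = 0.5\, x \left(1 + \tanh\!\left(\sqrt{2/\pi}\,(x + 0.044715\, x^3)\right)\right) \]
\[ \mathrm{gelu\_erf}(x) = 0.5\, x \left(1 + \operatorname{erf}\!\left(x / \sqrt{2}\right)\right) \]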

eltwise_log

Elementwise: natural logarithm.

eltwise_clip

Elementwise: clip.

eltwise_clip_v2

Elementwise: clip version 2.

eltwise_pow

Elementwise: pow.

eltwise_round

Elementwise: round.

eltwise_hardswish

Elementwise: hardswish.

eltwise_relu_use_dst_for_bwd

Elementwise: rectified linear unit (ReLU) (dst for backward)

eltwise_tanh_use_dst_for_bwd

Elementwise: hyperbolic tangent non-linearity (tanh) (dst for backward)

eltwise_elu_use_dst_for_bwd

Elementwise: exponential linear unit (ELU) (dst for backward)

eltwise_sqrt_use_dst_for_bwd

Elementwise: square root (dst for backward)

eltwise_logistic_use_dst_for_bwd

Elementwise: logistic (dst for backward)

eltwise_exp_use_dst_for_bwd

Elementwise: exponent (dst for backward)

eltwise_clip_v2_use_dst_for_bwd

Elementwise: clip version 2 (dst for backward)
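
A sketch of why the *_use_dst_for_bwd variants exist, assuming the oneDNN 2.x eltwise_backward API: the backward pass is computed from the forward destination rather than the source, so the forward src buffer does not have to be kept (for example when the forward pass runs in place). Shapes are hypothetical.

#include <dnnl.hpp>

int main() {
    dnnl::engine eng(dnnl::engine::kind::cpu, 0);
    dnnl::stream s(eng);

    dnnl::memory::desc md({1, 16, 8, 8}, dnnl::memory::data_type::f32,
            dnnl::memory::format_tag::nchw);

    // Forward pass with the use_dst_for_bwd flavor of ReLU.
    dnnl::eltwise_forward::desc fwd_d(dnnl::prop_kind::forward_training,
            dnnl::algorithm::eltwise_relu_use_dst_for_bwd, md, 0.f, 0.f);
    dnnl::eltwise_forward::primitive_desc fwd_pd(fwd_d, eng);

    // Backward pass consumes DNNL_ARG_DST (not DNNL_ARG_SRC) alongside diff_dst.
    dnnl::eltwise_backward::desc bwd_d(
            dnnl::algorithm::eltwise_relu_use_dst_for_bwd, md, md, 0.f, 0.f);
    dnnl::eltwise_backward::primitive_desc bwd_pd(bwd_d, eng, fwd_pd);

    dnnl::memory dst(md, eng), diff_dst(md, eng), diff_src(md, eng);
    dnnl::eltwise_backward(bwd_pd).execute(s,
            {{DNNL_ARG_DST, dst}, {DNNL_ARG_DIFF_DST, diff_dst},
                    {DNNL_ARG_DIFF_SRC, diff_src}});
    s.wait();
    return 0;
}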

lrn_across_channels

Local response normalization (LRN) across multiple channels.

lrn_within_channel

LRN within a single channel.
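
A sketch with lrn_across_channels, assuming the oneDNN 2.x lrn_forward API; local_size, alpha, beta, and k are hypothetical values.

#include <dnnl.hpp>

int main() {
    dnnl::engine eng(dnnl::engine::kind::cpu, 0);

    dnnl::memory::desc md({1, 64, 28, 28}, dnnl::memory::data_type::f32,
            dnnl::memory::format_tag::nchw);

    // Normalize each point over a window of 5 neighboring channels.
    dnnl::lrn_forward::desc lrn_d(dnnl::prop_kind::forward_inference,
            dnnl::algorithm::lrn_across_channels, md,
            /*local_size=*/5, /*alpha=*/1e-4f, /*beta=*/0.75f, /*k=*/1.f);
    dnnl::lrn_forward::primitive_desc lrn_pd(lrn_d, eng);
    return 0;
}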

pooling_max

Max pooling.

pooling_avg

Average pooling that excludes padding; alias for dnnl::algorithm::pooling_avg_exclude_padding.

pooling_avg_include_padding

Average pooling that includes padding.

pooling_avg_exclude_padding

Average pooling that excludes padding.
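
A sketch contrasting the pooling kinds, assuming the oneDNN 2.x pooling_forward API; shapes and window parameters are hypothetical.

#include <dnnl.hpp>

int main() {
    dnnl::engine eng(dnnl::engine::kind::cpu, 0);

    dnnl::memory::desc src_md({1, 16, 32, 32}, dnnl::memory::data_type::f32,
            dnnl::memory::format_tag::nchw);
    dnnl::memory::desc dst_md({1, 16, 16, 16}, dnnl::memory::data_type::f32,
            dnnl::memory::format_tag::nchw);

    // 2x2 max pooling with stride 2; swap in pooling_avg_include_padding or
    // pooling_avg_exclude_padding to change how padded border zeros are counted.
    dnnl::pooling_forward::desc pool_d(dnnl::prop_kind::forward_inference,
            dnnl::algorithm::pooling_max, src_md, dst_md,
            /*strides=*/{2, 2}, /*kernel=*/{2, 2},
            /*padding_l=*/{0, 0}, /*padding_r=*/{0, 0});
    dnnl::pooling_forward::primitive_desc pool_pd(pool_d, eng);
    return 0;
}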

vanilla_rnn

RNN cell.

vanilla_lstm

LSTM cell.

vanilla_gru

GRU cell.

lbr_gru

GRU cell with linear before reset.

Differs from the vanilla GRU in how the new memory gate is calculated: \(c_t = \tanh(W_c \cdot x_t + b_{c_x} + r_t \cdot (U_c \cdot h_{t-1} + b_{c_h}))\). LBR GRU expects four bias tensors on input: \([b_u, b_r, b_{c_x}, b_{c_h}]\).
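
For contrast, a hedged note based on the standard GRU definition rather than text on this page: the vanilla GRU applies the reset gate to \(h_{t-1}\) before the recurrent linear transform,

\[ c_t = \tanh(W_c \cdot x_t + U_c \cdot (r_t \cdot h_{t-1}) + b_c), \]

and therefore takes only three bias tensors \([b_u, b_r, b_c]\).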

binary_add

Binary add.

binary_mul

Binary mul.

binary_max

Binary max.

binary_min

Binary min.

binary_div

Binary div.

binary_sub

Binary sub.

binary_ge

Binary greater than or equal.

binary_gt

Binary greater than.

binary_le

Binary less than or equal.

binary_lt

Binary less than.

binary_eq

Binary equal.

binary_ne

Binary not equal.
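
A sketch of the binary primitive using binary_add, assuming the oneDNN 2.x binary API; the second operand here is a hypothetical per-channel tensor, and dimensions of size 1 are broadcast.

#include <dnnl.hpp>

int main() {
    dnnl::engine eng(dnnl::engine::kind::cpu, 0);
    dnnl::stream s(eng);

    dnnl::memory::desc src0_md({1, 16, 8, 8}, dnnl::memory::data_type::f32,
            dnnl::memory::format_tag::nchw);
    // Per-channel second operand; its H and W dimensions are broadcast.
    dnnl::memory::desc src1_md({1, 16, 1, 1}, dnnl::memory::data_type::f32,
            dnnl::memory::format_tag::nchw);

    dnnl::binary::desc bin_d(dnnl::algorithm::binary_add,
            src0_md, src1_md, /*dst=*/src0_md);
    dnnl::binary::primitive_desc bin_pd(bin_d, eng);

    dnnl::memory src0(src0_md, eng), src1(src1_md, eng), dst(src0_md, eng);
    dnnl::binary(bin_pd).execute(s, {{DNNL_ARG_SRC_0, src0},
            {DNNL_ARG_SRC_1, src1}, {DNNL_ARG_DST, dst}});
    s.wait();
    return 0;
}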

resampling_nearest

Nearest Neighbor resampling method.

resampling_linear

Linear (Bilinear, Trilinear) resampling method.
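
A sketch of 2x upsampling with resampling_linear, assuming the oneDNN 2.x resampling_forward API; the scale factor is implied by the hypothetical src/dst shapes.

#include <dnnl.hpp>

int main() {
    dnnl::engine eng(dnnl::engine::kind::cpu, 0);

    dnnl::memory::desc src_md({1, 8, 16, 16}, dnnl::memory::data_type::f32,
            dnnl::memory::format_tag::nchw);
    dnnl::memory::desc dst_md({1, 8, 32, 32}, dnnl::memory::data_type::f32,
            dnnl::memory::format_tag::nchw);

    // Bilinear upsampling; use resampling_nearest for nearest-neighbor.
    dnnl::resampling_forward::desc resample_d(dnnl::prop_kind::forward_inference,
            dnnl::algorithm::resampling_linear, src_md, dst_md);
    dnnl::resampling_forward::primitive_desc resample_pd(resample_d, eng);
    return 0;
}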

reduction_max

Reduction using max operation.

reduction_min

Reduction using min operation.

reduction_sum

Reduction using sum operation.

reduction_mul

Reduction using mul operation.

reduction_mean

Reduction using mean operation.

reduction_norm_lp_max

Reduction using norm_lp_max operation.

reduction_norm_lp_sum

Reduction using norm_lp_sum operation.

reduction_norm_lp_power_p_max

Reduction using norm_lp_power_p_max operation.

reduction_norm_lp_power_p_sum

Reduction using norm_lp_power_p_sum operation.
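
A sketch of the reduction primitive, assuming the oneDNN 2.x reduction API: the reduced axes are those whose destination size is 1; p applies to the norm_lp_* algorithms and is ignored by reduction_mean. Shapes are hypothetical.

#include <dnnl.hpp>

int main() {
    dnnl::engine eng(dnnl::engine::kind::cpu, 0);

    dnnl::memory::desc src_md({1, 16, 8, 8}, dnnl::memory::data_type::f32,
            dnnl::memory::format_tag::nchw);
    // Reduce over H and W (dst dims of size 1 mark the reduced axes).
    dnnl::memory::desc dst_md({1, 16, 1, 1}, dnnl::memory::data_type::f32,
            dnnl::memory::format_tag::nchw);

    dnnl::reduction::desc red_d(dnnl::algorithm::reduction_mean,
            src_md, dst_md, /*p=*/0.f, /*eps=*/0.f);
    dnnl::reduction::primitive_desc red_pd(red_d, eng);
    return 0;
}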