The logsoftmax primitive applies the softmax function along a particular axis of data with arbitrary dimensions and then takes the logarithm of the result. All other axes are treated as independent (batch).
In its general form, the operation is defined by the following formulas (the variable names follow the standard Naming Conventions). The second form is used in practice because it is more numerically stable:
\[ \dst(\overline{ou}, c, \overline{in}) = \ln\left({\frac { e^{\src(\overline{ou}, c, \overline{in}) - \nu(\overline{ou}, \overline{in})} } { \sum\limits_{ic} e^{\src(\overline{ou}, ic, \overline{in}) - \nu(\overline{ou}, \overline{in})} }}\right) = \left(\src(\overline{ou}, c, \overline{in}) - \nu(\overline{ou}, \overline{in})\right) - \ln\left( \sum\limits_{ic} e^{\src(\overline{ou}, ic, \overline{in}) - \nu(\overline{ou}, \overline{in})} \right), \]
where \(\nu\) is used to produce more accurate results and is defined as:
\[ \nu(\overline{ou}, \overline{in}) = \max\limits_{ic} \src(\overline{ou}, ic, \overline{in}) \]
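For illustration only, here is a minimal plain-C++ sketch of the second (numerically stable) form applied to a single 1D slice taken along the logsoftmax axis. It is not the primitive implementation, just a reference computation of the formula:

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <cstdio>
#include <vector>

// Reference computation for one slice along the logsoftmax axis:
//   dst[c] = (src[c] - nu) - ln(sum_ic exp(src[ic] - nu)),  nu = max_ic src[ic]
std::vector<float> logsoftmax_ref(const std::vector<float> &src) {
    // nu is the maximum over the axis; subtracting it keeps the exponents bounded.
    float nu = src[0];
    for (float v : src) nu = std::max(nu, v);

    // Accumulate sum_ic exp(src[ic] - nu).
    float sum = 0.f;
    for (float v : src) sum += std::exp(v - nu);

    // dst[c] = (src[c] - nu) - ln(sum).
    std::vector<float> dst(src.size());
    for (std::size_t c = 0; c < src.size(); ++c)
        dst[c] = (src[c] - nu) - std::log(sum);
    return dst;
}

int main() {
    // Inputs this large would overflow exp() without the nu subtraction.
    for (float v : logsoftmax_ref({1000.f, 1001.f, 1002.f}))
        std::printf("%f ", v); // approximately -2.408 -1.408 -0.408
    std::printf("\n");
    return 0;
}
```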
There is no difference between the dnnl_forward_training and dnnl_forward_inference propagation kinds.
The backward propagation computes \(\diffsrc(ou, c, in)\) based on \(\diffdst(ou, c, in)\) and \(\dst(ou, c, in)\).
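For reference, applying the chain rule to the forward formula above yields the following expression for the backward pass (shown here only as a mathematical illustration derived from the forward definition, not as the library's stated implementation):

\[ \diffsrc(\overline{ou}, c, \overline{in}) = \diffdst(\overline{ou}, c, \overline{in}) - e^{\dst(\overline{ou}, c, \overline{in})} \cdot \sum\limits_{ic} \diffdst(\overline{ou}, ic, \overline{in}), \]

where \(e^{\dst(\cdot)}\) is simply the softmax of the source tensor.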
When executed, the inputs and outputs should be mapped to an execution argument index as specified by the following table.
Primitive input/output | Execution argument index |
---|---|
\(\src\) | DNNL_ARG_SRC |
\(\dst\) | DNNL_ARG_DST |
\(\diffsrc\) | DNNL_ARG_DIFF_SRC |
\(\diffdst\) | DNNL_ARG_DIFF_DST |
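As a sketch, assuming the oneDNN C++ API (dnnl.hpp), where execution arguments are passed to primitive::execute() as a map, the table translates into argument maps along these lines. The primitive and memory objects here (logsoftmax_fwd, logsoftmax_bwd, src_mem, and so on) are hypothetical and assumed to have been created earlier:

```cpp
// Forward propagation: src is the input, dst is the output.
logsoftmax_fwd.execute(engine_stream, {
        {DNNL_ARG_SRC, src_mem},
        {DNNL_ARG_DST, dst_mem}});

// Backward propagation: dst and diff_dst are inputs, diff_src is the output.
logsoftmax_bwd.execute(engine_stream, {
        {DNNL_ARG_DST, dst_mem},
        {DNNL_ARG_DIFF_DST, diff_dst_mem},
        {DNNL_ARG_DIFF_SRC, diff_src_mem}});
```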
\(\src\) can be used as both input and output for forward propagation, and \(\diffdst\) can be used as both input and output for backward propagation. In the case of in-place operation, the original data will be overwritten.

The logsoftmax primitive doesn't support any post-ops or attributes.
The logsoftmax primitive supports the following combinations of data types:
Propagation | Source / Destination |
---|---|
forward / backward | f32 |
The logsoftmax primitive works with arbitrary data tensors. There is no special meaning associated with any logical dimensions. However, the logsoftmax axis is typically referred to as channels (hence in formulas we use \(c\)).
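Putting the pieces together, below is a minimal forward-only sketch, assuming the oneDNN 2.x C++ API (dnnl.hpp) in which the primitive is exposed as dnnl::logsoftmax_forward; constructor details may differ between library versions. It uses a plain 2D (batch x channels) tensor and sets the logsoftmax axis to the channel dimension, matching the use of \(c\) in the formulas above:

```cpp
#include "dnnl.hpp"

using namespace dnnl;

int main() {
    engine eng(engine::kind::cpu, 0);
    stream s(eng);

    // 2D data: N batch rows, C channels; logsoftmax is taken over the channels.
    const memory::dim N = 4, C = 10;
    memory::desc data_md({N, C}, memory::data_type::f32, memory::format_tag::nc);
    memory src_mem(data_md, eng), dst_mem(data_md, eng);

    // Fill src directly through its handle (valid for a CPU engine).
    float *src_ptr = static_cast<float *>(src_mem.get_data_handle());
    for (memory::dim i = 0; i < N * C; ++i)
        src_ptr[i] = static_cast<float>(i % C);

    // Create and execute the forward logsoftmax primitive along axis 1 (channels).
    const int axis = 1;
    logsoftmax_forward::desc ls_d(prop_kind::forward_inference, data_md, axis);
    logsoftmax_forward::primitive_desc ls_pd(ls_d, eng);
    logsoftmax_forward(ls_pd).execute(s, {
            {DNNL_ARG_SRC, src_mem},
            {DNNL_ARG_DST, dst_mem}});
    s.wait();
    return 0;
}
```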