# Softmax

## General

The softmax primitive performs a forward or backward softmax or logsoftmax operation along a particular axis on data with arbitrary dimensions. All other axes are treated as independent (batch).

### Forward

In general form, the operation is defined by the following formulas (the variable names follow the standard Naming Conventions).

Softmax:

$\dst(\overline{ou}, c, \overline{in}) = \frac {e^{\src(\overline{ou}, c, \overline{in}) - \nu(\overline{ou}, \overline{in})}} { \sum\limits_{ic} e^{\src(\overline{ou}, ic, \overline{in}) - \nu(\overline{ou}, \overline{in})} }$

Logsoftmax:

$\dst(\overline{ou}, c, \overline{in}) = \ln\left({\frac { e^{\src(\overline{ou}, c, \overline{in}) - \nu(\overline{ou}, \overline{in})} } { \sum\limits_{ic} e^{\src(\overline{ou}, ic, \overline{in}) - \nu(\overline{ou}, \overline{in})} }}\right) = \left(\src(\overline{ou}, c, \overline{in}) - \nu(\overline{ou}, \overline{in})\right) - \ln\left( \sum\limits_{ic} e^{\src(\overline{ou}, ic, \overline{in}) - \nu(\overline{ou}, \overline{in})} \right)$

where

• $$c$$ is the axis along which the operation is computed,

• $$\overline{ou}$$ is the outermost index (to the left of the axis),

• $$\overline{in}$$ is the innermost index (to the right of the axis), and

• $$\nu$$ is used to produce numerically stable results and is defined as:

$\nu(\overline{ou}, \overline{in}) = \max\limits_{ic} \src(\overline{ou}, ic, \overline{in})$

#### Difference Between Forward Training and Forward Inference

There is no difference between the dnnl_forward_training and dnnl_forward_inference propagation kinds.

### Backward

The backward propagation computes $$\diffsrc(ou, c, in)$$, based on $$\diffdst(ou, c, in)$$ and $$\dst(ou, c, in)$$.
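
Although the exact computation is an implementation detail, for reference the standard chain-rule gradient of softmax is:

$\diffsrc(ou, c, in) = \dst(ou, c, in) \cdot \left( \diffdst(ou, c, in) - \sum\limits_{ic} \diffdst(ou, ic, in) \cdot \dst(ou, ic, in) \right)$

and of logsoftmax (where $$\dst$$ holds log-probabilities):

$\diffsrc(ou, c, in) = \diffdst(ou, c, in) - e^{\dst(ou, c, in)} \cdot \sum\limits_{ic} \diffdst(ou, ic, in)$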

## Execution Arguments

When executed, the inputs and outputs should be mapped to an execution argument index as specified by the following table.

| Primitive input/output | Execution argument index |
| --- | --- |
| $$\src$$ | DNNL_ARG_SRC |
| $$\dst$$ | DNNL_ARG_DST |
| $$\diffsrc$$ | DNNL_ARG_DIFF_SRC |
| $$\diffdst$$ | DNNL_ARG_DIFF_DST |
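
For instance, with the C++ API these indices key the argument map passed to `primitive::execute()`. A minimal sketch of a forward call (the objects `softmax_prim`, `strm`, `src_mem`, and `dst_mem` are assumed to have been created earlier):

```cpp
// Hypothetical objects created earlier: softmax_prim (a softmax_forward
// primitive), strm (a dnnl::stream), src_mem and dst_mem (dnnl::memory).
// The execution argument indices from the table key the argument map.
softmax_prim.execute(strm, {
        {DNNL_ARG_SRC, src_mem},
        {DNNL_ARG_DST, dst_mem},
});
```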

## Implementation Details

### General Notes

1. Both forward and backward propagation support in-place operations, meaning that src can be used as both input and output for forward propagation, and diff_dst can be used as both input and output for backward propagation. In the case of an in-place operation, the original data is overwritten. This support is limited to cases where the data types of src / dst or diff_src / diff_dst are identical.

### Post-ops and Attributes

Attributes enable you to modify the behavior of the softmax primitive. The following attributes are supported by the softmax primitive:

| Propagation | Type | Operation | Description | Restrictions |
| --- | --- | --- | --- | --- |
| forward | attribute | Output scale | Scales the result of softmax by a given scale factor | int8 softmax only, zero mask only |
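
As a sketch, with the oneDNN v2.x attribute API this corresponds to setting output scales with mask 0 (a single common scale) on the primitive attributes, which are then passed when creating the primitive descriptor. Note that newer oneDNN versions replace output scales with per-argument scales; the scale value below is an arbitrary example:

```cpp
// Minimal sketch (oneDNN v2.x API): attach a single output scale
// (mask = 0) for an int8 softmax via primitive attributes.
dnnl::primitive_attr attr;
attr.set_output_scales(/* mask = */ 0, {0.5f}); // 0.5f: example scale
```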

### Data Type Support

The softmax primitive supports the following combinations of data types:

| Propagation | Source | Destination |
| --- | --- | --- |
| forward | f32, bf16, f16, u8, s8 | f32, bf16, f16, u8, s8 |
| backward | f32, bf16, f16 | f32, bf16, f16 |

### Data Representation

#### Source, Destination, and Their Gradients

The softmax primitive works with arbitrary data tensors. There is no special meaning associated with any logical dimensions. However, the softmax axis is typically referred to as channels (hence in formulas we use $$c$$).
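
For example, for typical 4D NCHW activations the channel dimension would be passed as the softmax axis. A minimal sketch (the dimension values here are illustrative):

```cpp
#include "oneapi/dnnl/dnnl.hpp"

// Hypothetical 4D activations in NCHW layout; the softmax axis is the
// channel dimension, i.e. the "c" that appears in the formulas above.
const dnnl::memory::dim N = 32, C = 1000, H = 1, W = 1;
auto md = dnnl::memory::desc({N, C, H, W}, dnnl::memory::data_type::f32,
        dnnl::memory::format_tag::nchw);
const int softmax_axis = 1;
```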

## Implementation Limitations

1. Refer to Data Types for limitations related to data type support.

2. GPU

• Only tensors of 6 or fewer dimensions are supported.

## Performance Tips

1. Use in-place operations whenever possible.

2. Currently, the softmax primitive is optimized for the cases where the dimension of the softmax axis is physically dense. For instance:

• Optimized: 2D case, tensor $$A \times B$$, softmax axis 1 (B), format tag dnnl_ab

• Optimized: 4D case, tensor $$A \times B \times C \times D$$, softmax axis 3 (D), format tag dnnl_abcd

• Optimized: 4D case, tensor $$A \times B \times C \times D$$, softmax axis 1 (B), format tag dnnl_abcd, and $$C = D = 1$$

• Optimized: 4D case, tensor $$A \times B \times C \times D$$, softmax axis 1 (B), format tag dnnl_acdb or dnnl_aBcd16b, and $$C \cdot D \ne 1$$

• Non-optimized: 2D case, tensor $$A \times B$$, softmax axis 0 (A), format tag dnnl_ab, and $$B \ne 1$$

• Non-optimized: 2D case, tensor $$A \times B$$, softmax axis 1 (B), format tag dnnl_ba, and $$A \ne 1$$

• Non-optimized: 4D case, tensor $$A \times B \times C \times D$$, softmax axis 2 (C), format tag dnnl_acdb, and $$D \cdot B \ne 1$$

## Example

Softmax Primitive Example

This C++ API example demonstrates how to create and execute a Softmax primitive in forward training propagation mode.

Key optimizations included in this example:

• In-place primitive execution;

• Softmax along axis 1 (C) for 2D tensors.
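
A condensed sketch along those lines, assuming the oneDNN v3.x C++ API (the tensor shape and generated data are illustrative, not taken from the bundled example):

```cpp
#include <cmath>
#include "oneapi/dnnl/dnnl.hpp"

using namespace dnnl;

int main() {
    engine eng(engine::kind::cpu, 0);
    stream strm(eng);

    // 2D tensor A x B with softmax along axis 1 in dense dnnl_ab
    // layout -- one of the optimized cases listed above.
    const memory::dim A = 3, B = 1000;
    auto src_md = memory::desc({A, B}, memory::data_type::f32,
            memory::format_tag::ab);
    auto src_mem = memory(src_md, eng);

    // Fill the source with some data (valid for a CPU engine, where
    // the memory handle is a plain host pointer).
    float *src_ptr = static_cast<float *>(src_mem.get_data_handle());
    for (memory::dim i = 0; i < A * B; ++i)
        src_ptr[i] = std::cos(i / 10.f);

    // Create the forward-training softmax primitive along axis 1.
    auto pd = softmax_forward::primitive_desc(eng,
            prop_kind::forward_training, algorithm::softmax_accurate,
            src_md, src_md, /* axis = */ 1);
    auto softmax_prim = softmax_forward(pd);

    // In-place execution: the same memory object is passed for both
    // DNNL_ARG_SRC and DNNL_ARG_DST, so src is overwritten with dst.
    softmax_prim.execute(strm,
            {{DNNL_ARG_SRC, src_mem}, {DNNL_ARG_DST, src_mem}});
    strm.wait();

    return 0;
}
```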