Shuffle¶
General¶
The shuffle primitive shuffles data along the shuffle axis (here designated as \(C\)) with group parameter \(G\). If the shuffle axis is thought of as a \((\frac{C}{G} \times G)\) matrix in row-major order, then the shuffle operation transposes the shuffle axis to a \((G \times \frac{C}{G})\) matrix in row-major order.
Forward¶
The formal definition is as follows (variable names follow the standard Naming Conventions):
\[\dst(\overline{ou}, c, \overline{in}) = \src(\overline{ou}, c', \overline{in}),\]
where
- the \(c\) dimension is called the shuffle axis,
- \(G\) is the group_size,
- \(\overline{ou}\) is the outermost indices (to the left of the shuffle axis),
- \(\overline{in}\) is the innermost indices (to the right of the shuffle axis), and
- \(c'\) and \(c\) relate to each other as defined by the system:
\[\begin{split}\begin{cases} c &= u + v\frac{C}{G}, \\ c' &= uG + v, \\ \end{cases}\end{split}\]
Here, \(0 \leq u < \frac{C}{G}\) and \(0 \leq v < G\).
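For illustration, with \(C = 6\) and \(G = 2\) the shuffle axis, viewed as a \(3 \times 2\) row-major matrix, is transposed to \(2 \times 3\), so the destination reads the source channels in the order 0, 2, 4, 1, 3, 5. The sketch below (plain C++, not the library API) applies the index mapping above to verify this.

```cpp
#include <cstdio>
#include <vector>

int main() {
    // Shuffle of C = 6 channels with group_size G = 2:
    // dst[c] = src[c'] with c = u + v * (C / G) and c' = u * G + v.
    const int C = 6, G = 2;
    std::vector<int> src = {0, 1, 2, 3, 4, 5}; // each channel's payload is its own index
    std::vector<int> dst(C);
    for (int v = 0; v < G; ++v)
        for (int u = 0; u < C / G; ++u)
            dst[u + v * (C / G)] = src[u * G + v];
    // Prints "0 2 4 1 3 5", i.e. the transpose of the 3x2 row-major view.
    for (int c : dst) std::printf("%d ", c);
    std::printf("\n");
    return 0;
}
```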
Difference Between Forward Training and Forward Inference¶
There is no difference between the dnnl_forward_training and dnnl_forward_inference propagation kinds.
Backward¶
The backward propagation computes \(\diffsrc(\overline{ou}, c', \overline{in})\), based on \(\diffdst(\overline{ou}, c, \overline{in})\).
Essentially, backward propagation is the same as forward propagation with \(G\) replaced by \(C / G\).
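To see why, substitute \(G' = \frac{C}{G}\), \(u' = v\), and \(v' = u\) into the forward system; the two equations swap roles, so the map from \(\diffdst\) indices to \(\diffsrc\) indices is itself a shuffle with group size \(\frac{C}{G}\):
\[\begin{split}\begin{cases} c' &= u' + v'\frac{C}{G'}, \\ c &= u'G' + v', \\ \end{cases}\end{split}\]
Here, \(0 \leq u' < \frac{C}{G'}\) and \(0 \leq v' < G'\).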
Execution Arguments¶
When executed, the inputs and outputs should be mapped to an execution argument index as specified by the following table.
| Primitive input/output | Execution argument index |
|---|---|
| \(\src\) | DNNL_ARG_SRC |
| \(\dst\) | DNNL_ARG_DST |
| \(\diffsrc\) | DNNL_ARG_DIFF_SRC |
| \(\diffdst\) | DNNL_ARG_DIFF_DST |
Data Types¶
The shuffle primitive supports the following combinations of data types:
| Propagation | Source / Destination |
|---|---|
| forward / backward | f32, bf16, f16 |
| forward | s32, s8, u8 |
Warning
There might be hardware- and/or implementation-specific restrictions. Check the Implementation Limitations section below.
Data Layouts¶
The shuffle primitive works with arbitrary data tensors. There is no special meaning associated with any logical dimensions. However, the shuffle axis is typically referred to as channels (hence in formulas we use \(c\)).
Shuffle operations typically appear in CNN topologies. Hence, the shuffle primitive in the library is optimized for the corresponding memory formats:
| Spatial | Logical tensor | Shuffle Axis | Implementations optimized for memory formats |
|---|---|---|---|
| 2D | NCHW | 1 (C) | dnnl_nchw (dnnl_abcd), dnnl_nhwc (dnnl_acdb), optimized^ |
| 3D | NCDHW | 1 (C) | dnnl_ncdhw (dnnl_abcde), dnnl_ndhwc (dnnl_acdeb), optimized^ |
Here optimized^ means the format that comes out of any preceding compute-intensive primitive.
Post-Ops and Attributes¶
The shuffle primitive does not support any post-ops or attributes.
Implementation Limitations¶
Refer to Data Types for limitations related to data type support.
GPU
Only tensors of 6 or fewer dimensions are supported.
Performance Tips¶
N/A
Example¶
This C++ API example demonstrates how to create and execute a Shuffle primitive.
Key optimizations included in this example:
Shuffle along axis 1 (channels).
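The full example shipped with the library is not reproduced here. Below is a minimal sketch of the same flow, assuming the oneDNN v3.x C++ API (the primitive descriptor constructor differs in v2.x) and a CPU engine so that plain host buffers can back the memory objects; the tensor sizes and group size are arbitrary illustration values.

```cpp
#include <vector>

#include "dnnl.hpp"

int main() {
    using namespace dnnl;

    // Engine and stream; CPU is assumed so plain host pointers can back memory.
    engine eng(engine::kind::cpu, 0);
    stream strm(eng);

    // Illustrative NCHW tensor; the shuffle is applied along axis 1 (channels).
    const memory::dim N = 1, C = 72, H = 56, W = 56;
    const int axis = 1;
    const int group_size = 4; // C must be divisible by group_size

    memory::dims dims = {N, C, H, W};
    auto src_md = memory::desc(dims, memory::data_type::f32, memory::format_tag::nchw);
    auto dst_md = memory::desc(dims, memory::data_type::f32, memory::format_tag::nchw);

    // Wrap user buffers as oneDNN memory objects.
    std::vector<float> src_data(static_cast<size_t>(N * C * H * W), 1.f);
    std::vector<float> dst_data(src_data.size());
    auto src_mem = memory(src_md, eng, src_data.data());
    auto dst_mem = memory(dst_md, eng, dst_data.data());

    // Create the shuffle primitive (v3.x-style primitive descriptor constructor).
    auto shuffle_pd = shuffle_forward::primitive_desc(eng,
            prop_kind::forward_inference, src_md, dst_md, axis, group_size);
    auto shuffle_prim = shuffle_forward(shuffle_pd);

    // Execute; the keys of the argument map are the indices from the
    // Execution Arguments table above.
    shuffle_prim.execute(strm, {{DNNL_ARG_SRC, src_mem}, {DNNL_ARG_DST, dst_mem}});
    strm.wait();

    return 0;
}
```

Note that the source and destination use the same memory descriptor: shuffle does not change the tensor shape, only the order of the data along the shuffle axis.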