.. _l-onnx-doccom.microsoft-InplaceClipGradNorm:

===================================
com.microsoft - InplaceClipGradNorm
===================================

.. contents::
    :local:

.. _l-onnx-opcom-microsoft-inplaceclipgradnorm-1:

InplaceClipGradNorm - 1 (com.microsoft)
=======================================

**Version**

* **name**: `InplaceClipGradNorm (GitHub) `_
* **domain**: **com.microsoft**
* **since_version**: **1**
* **function**:
* **support_level**:
* **shape inference**:

This version of the operator has been available
**since version 1 of domain com.microsoft**.

**Summary**

InplaceClipGradNorm operator, taking multiple gradients as input in the form of
a sequence. InplaceClipGradNorm should be used in conjunction with optimizers
that accept sequence gradients as input: because this op takes a sequence of
tensors as input and outputs a sequence of tensors, it avoids the need for
SequenceConstruct (and any unnecessary copy). Please note that the gradient
clipping happens in place.

**Attributes**

* **max_norm**: Maximum allowed norm of the gradients; gradients are scaled
  down so that their norm does not exceed this value. Default value is ``?``.
* **norm_type**: Type of norm to compute during execution of clip grad norm.
  Currently, the only supported norm is the Frobenius norm (which is also the
  default). Default value is ``?``.

**Inputs**

* **gradients** (heterogeneous) - **S_GRAD**:
  Sequence of gradients computed in this iteration.

**Outputs**

* **clipped_gradients** (heterogeneous) - **S_GRAD**:
  Gradients after being clipped according to the given inputs and attributes.

**Examples**
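The sketch below illustrates how such a node might be constructed and what the
clipping semantics roughly correspond to. It is not an official example: the
assumption that ``max_norm`` is a float attribute, the choice to leave
``norm_type`` at its default (Frobenius norm), the epsilon handling, and the
``clip_grad_norm_reference`` helper are all illustrative assumptions rather
than part of the operator's specification.

.. code-block:: python

    import numpy as np
    from onnx import helper

    # Sketch of an InplaceClipGradNorm node in the com.microsoft domain.
    # max_norm is assumed to be a float attribute; norm_type is left at its
    # default (Frobenius norm).
    node = helper.make_node(
        "InplaceClipGradNorm",
        inputs=["gradients"],
        outputs=["clipped_gradients"],
        domain="com.microsoft",
        max_norm=1.0,
    )


    def clip_grad_norm_reference(gradients, max_norm, eps=1e-6):
        """NumPy sketch of global-norm gradient clipping over a sequence.

        The Frobenius norm is computed over all gradients together; if it
        exceeds max_norm, every gradient is scaled down in place. The epsilon
        term is an assumption, not taken from the operator specification.
        """
        total_norm = np.sqrt(
            sum(float(np.sum(g.astype(np.float64) ** 2)) for g in gradients)
        )
        if total_norm > max_norm:
            scale = max_norm / (total_norm + eps)
            for g in gradients:
                g *= scale  # clipping happens in place, mirroring the operator
        return gradients


    grads = [
        np.random.randn(3, 4).astype(np.float32),
        np.random.randn(5).astype(np.float32),
    ]
    clipped = clip_grad_norm_reference(grads, max_norm=1.0)
    assert clipped is grads  # same sequence object, gradients modified in place

Because the output is the same sequence of (modified) tensors, no
SequenceConstruct node or extra copy of the gradients is needed before handing
them to an optimizer that consumes sequence inputs.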