com.microsoft - Gelu#
name: Gelu (GitHub)
This version of the operator has been available since version 1 of domain com.microsoft.
Gaussian Error Linear Unit (GELU), a high-performing neural-network activation function. The GELU nonlinearity is the expected transformation of a stochastic regularizer that randomly applies the identity or zero map to a neuron's input. GELU weights inputs by their magnitude, rather than gating them by their sign as ReLU does.
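The description above corresponds to the exact (erf-based) formulation GELU(x) = x · Φ(x), where Φ is the standard normal CDF. As a reference sketch only (not onnxruntime's optimized kernel, which operates on tensors and may use vectorized or approximate math), the element-wise computation looks like:

```python
import math

def gelu(x: float) -> float:
    """Exact GELU: x * Phi(x), with Phi the standard normal CDF.

    Reference sketch only; the onnxruntime kernel is an optimized
    tensor implementation, not this scalar loop.
    """
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))

# Applied element-wise over the input tensor X to produce Y,
# illustrated here with a plain list:
xs = [-1.0, 0.0, 1.0]
ys = [gelu(x) for x in xs]
```

Note that some frameworks instead use a tanh-based approximation of GELU; the exact erf form above is the one implied by the definition in the description.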
Inputs#
X (heterogeneous) - T: The input data as Tensor.
Outputs#
Y (heterogeneous) - T: The output.