LeakyRelu

LeakyRelu - 16

Version

  • name: LeakyRelu (GitHub)

  • domain: main

  • since_version: 16

  • function: False

  • support_level: SupportType.COMMON

  • shape inference: True

This version of the operator has been available since version 16.

Summary

LeakyRelu takes input data (Tensor<T>) and an argument alpha, and produces one output data (Tensor<T>) where the function f(x) = alpha * x for x < 0, f(x) = x for x >= 0, is applied to the data tensor elementwise.
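The piecewise definition above can be written as a one-line NumPy reference (a sketch of the semantics only, not the kernel any particular runtime uses):

```python
import numpy as np

def leaky_relu(x: np.ndarray, alpha: float = 0.01) -> np.ndarray:
    """Elementwise f(x) = x for x >= 0, alpha * x for x < 0."""
    return np.where(x >= 0, x, alpha * x)

# leaky_relu(np.array([-1., 0., 1.], dtype=np.float32), alpha=0.1)
# gives [-0.1, 0., 1.]
```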

History

  • Version 16 adds bfloat16 to the types allowed.

Attributes

  • alpha: Coefficient of leakage. Default value is 0.009999999776482582.
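The odd-looking default is simply 0.01 stored as a single-precision float: 0.01 has no exact binary representation, and the nearest float32 value prints as the long decimal above. A quick check (plain NumPy, my own illustration rather than anything from the ONNX spec):

```python
import numpy as np

# The alpha attribute is a FLOAT (float32). The nearest float32 to 0.01
# round-trips to the long decimal shown in the attribute docs.
print(repr(float(np.float32(0.01))))  # 0.009999999776482582
```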

Inputs

  • X (heterogeneous) - T: Input tensor

Outputs

  • Y (heterogeneous) - T: Output tensor

Type Constraints

  • T in ( tensor(bfloat16), tensor(double), tensor(float), tensor(float16) ): Constrain input and output types to float tensors.

Examples

The snippets below follow the ONNX backend-test style: they assume `import numpy as np`, `import onnx`, and the `expect` helper from the ONNX backend test utilities are in scope.

default

node = onnx.helper.make_node(
    'LeakyRelu',
    inputs=['x'],
    outputs=['y'],
    alpha=0.1
)

x = np.array([-1, 0, 1]).astype(np.float32)
# expected output [-0.1, 0., 1.]
y = np.clip(x, 0, np.inf) + np.clip(x, -np.inf, 0) * 0.1
expect(node, inputs=[x], outputs=[y],
       name='test_leakyrelu_example')

x = np.random.randn(3, 4, 5).astype(np.float32)
y = np.clip(x, 0, np.inf) + np.clip(x, -np.inf, 0) * 0.1
expect(node, inputs=[x], outputs=[y],
       name='test_leakyrelu')

leakyrelu_default

default_alpha = 0.01
node = onnx.helper.make_node(
    'LeakyRelu',
    inputs=['x'],
    outputs=['y'],
)
x = np.random.randn(3, 4, 5).astype(np.float32)
y = np.clip(x, 0, np.inf) + np.clip(x, -np.inf, 0) * default_alpha
expect(node, inputs=[x], outputs=[y],
       name='test_leakyrelu_default')
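The examples compute LeakyRelu as a sum of two clips rather than with an explicit branch; a quick NumPy check (my own addition, not part of the ONNX test suite) confirms this matches the piecewise definition from the Summary:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal((3, 4, 5)).astype(np.float32)
alpha = 0.1

# clip-based form used in the examples above
y_clip = np.clip(x, 0, np.inf) + np.clip(x, -np.inf, 0) * alpha
# direct piecewise form from the Summary
y_where = np.where(x >= 0, x, alpha * x)

assert np.allclose(y_clip, y_where)
```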

Differences

The LeakyRelu-16 text differs from LeakyRelu-6 in two places; the Summary, Attributes, Inputs, and Outputs are unchanged:

  • A History section was added, noting that version 16 adds bfloat16 to the types allowed.

  • tensor(bfloat16) was added to the type constraint T.

LeakyRelu - 6

Version

  • name: LeakyRelu (GitHub)

  • domain: main

  • since_version: 6

  • function: False

  • support_level: SupportType.COMMON

  • shape inference: True

This version of the operator has been available since version 6.

Summary

LeakyRelu takes input data (Tensor<T>) and an argument alpha, and produces one output data (Tensor<T>) where the function f(x) = alpha * x for x < 0, f(x) = x for x >= 0, is applied to the data tensor elementwise.

Attributes

  • alpha: Coefficient of leakage. Default value is 0.009999999776482582.

Inputs

  • X (heterogeneous) - T: Input tensor

Outputs

  • Y (heterogeneous) - T: Output tensor

Type Constraints

  • T in ( tensor(double), tensor(float), tensor(float16) ): Constrain input and output types to float tensors.

Differences

The LeakyRelu-6 text differs from LeakyRelu-1 in two places; the Summary, Inputs, Outputs, and Type Constraints are unchanged:

  • The alpha description was shortened from "Coefficient of leakage default to 0.01." to "Coefficient of leakage."

  • The legacy consumed_inputs attribute ("legacy optimization attribute") was removed.

LeakyRelu - 1

Version

  • name: LeakyRelu (GitHub)

  • domain: main

  • since_version: 1

  • function: False

  • support_level: SupportType.COMMON

  • shape inference: False

This version of the operator has been available since version 1.

Summary

LeakyRelu takes input data (Tensor<T>) and an argument alpha, and produces one output data (Tensor<T>) where the function f(x) = alpha * x for x < 0, f(x) = x for x >= 0, is applied to the data tensor elementwise.

Attributes

  • alpha: Coefficient of leakage default to 0.01. Default value is 0.009999999776482582.

  • consumed_inputs: legacy optimization attribute.

Inputs

  • X (heterogeneous) - T: Input tensor

Outputs

  • Y (heterogeneous) - T: Output tensor

Type Constraints

  • T in ( tensor(double), tensor(float), tensor(float16) ): Constrain input and output types to float tensors.