HardSigmoid

HardSigmoid - 6

Version

  • name: HardSigmoid (GitHub)

  • domain: main

  • since_version: 6

  • function: False

  • support_level: SupportType.COMMON

  • shape inference: True

This version of the operator has been available since version 6.

Summary

HardSigmoid takes one input data (Tensor<T>) and produces one output data (Tensor<T>) where the HardSigmoid function, y = max(0, min(1, alpha * x + beta)), is applied to the tensor elementwise.
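In NumPy the function reduces to a clamped affine transform; a minimal sketch (the function name and defaults below mirror the spec, but the helper itself is illustrative):

```python
import numpy as np

def hardsigmoid(x: np.ndarray, alpha: float = 0.2, beta: float = 0.5) -> np.ndarray:
    """Elementwise y = max(0, min(1, alpha * x + beta))."""
    return np.clip(alpha * x + beta, 0.0, 1.0)

# Saturates to 0 for x <= -beta/alpha and to 1 for x >= (1 - beta)/alpha;
# with the defaults, that is the interval [-2.5, 2.5].
print(hardsigmoid(np.array([-5.0, 0.0, 5.0], dtype=np.float32)))
```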

Attributes

  • alpha: Value of alpha. Default value is 0.20000000298023224.

  • beta: Value of beta. Default value is 0.5.
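The long decimal in the alpha default is not arbitrary: it is exactly 0.2 rounded to single precision and then printed as a double. A quick check:

```python
import numpy as np

# 0.2 is not exactly representable in binary floating point; the spec
# stores the float32 rounding of 0.2, which widens to this double value.
print(float(np.float32(0.2)))  # 0.20000000298023224
```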

Inputs

  • X (heterogeneous) - T: Input tensor

Outputs

  • Y (heterogeneous) - T: Output tensor

Type Constraints

  • T in ( tensor(double), tensor(float), tensor(float16) ): Constrain input and output types to float tensors.

Examples

default

# Assumes ONNX's example scaffolding (recent onnx versions):
#   import numpy as np
#   import onnx
#   from onnx.backend.test.case.node import expect
node = onnx.helper.make_node(
    'HardSigmoid',
    inputs=['x'],
    outputs=['y'],
    alpha=0.5,
    beta=0.6
)

x = np.array([-1, 0, 1]).astype(np.float32)
y = np.clip(x * 0.5 + 0.6, 0, 1)  # expected output [0.1, 0.6, 1.]
expect(node, inputs=[x], outputs=[y],
       name='test_hardsigmoid_example')

x = np.random.randn(3, 4, 5).astype(np.float32)
y = np.clip(x * 0.5 + 0.6, 0, 1)
expect(node, inputs=[x], outputs=[y],
       name='test_hardsigmoid')

hardsigmoid_default

default_alpha = 0.2
default_beta = 0.5
node = onnx.helper.make_node(
    'HardSigmoid',
    inputs=['x'],
    outputs=['y'],
)
x = np.random.randn(3, 4, 5).astype(np.float32)
y = np.clip(x * default_alpha + default_beta, 0, 1)
expect(node, inputs=[x], outputs=[y],
       name='test_hardsigmoid_default')

Differences

Changes from HardSigmoid - 1 to HardSigmoid - 6:

  • alpha: description changed from "Value of alpha default to 0.2. Default value is 0.20000000298023224." to "Value of alpha. Default value is 0.20000000298023224."

  • beta: description changed from "Value of beta default to 0.5. Default value is 0.5." to "Value of beta. Default value is 0.5."

  • consumed_inputs (legacy optimization attribute): removed in version 6.

The summary, inputs, outputs, and type constraints are unchanged. Version 6 also adds shape inference support.

HardSigmoid - 1

Version

  • name: HardSigmoid (GitHub)

  • domain: main

  • since_version: 1

  • function: False

  • support_level: SupportType.COMMON

  • shape inference: False

This version of the operator has been available since version 1.

Summary

HardSigmoid takes one input data (Tensor<T>) and produces one output data (Tensor<T>) where the HardSigmoid function, y = max(0, min(1, alpha * x + beta)), is applied to the tensor elementwise.

Attributes

  • alpha: Value of alpha, defaulting to 0.2. Default value is 0.20000000298023224.

  • beta: Value of beta, defaulting to 0.5. Default value is 0.5.

  • consumed_inputs: legacy optimization attribute.

Inputs

  • X (heterogeneous) - T: Input tensor

Outputs

  • Y (heterogeneous) - T: Output tensor

Type Constraints

  • T in ( tensor(double), tensor(float), tensor(float16) ): Constrain input and output types to float tensors.