module utils.onnx_function#

Short summary#

module onnxcustom.utils.onnx_function

Onnx helper.

source on GitHub

Functions#

function

truncated documentation

_onnx_axpy

Returns the ONNX graph for function Y = f(X1, X2, \alpha) = \alpha X1 + X2.

_onnx_axpyw

Returns the ONNX graph for function Y, Z = f(X1, X2, G, \alpha, \beta) = (Y, Z) where Z = \beta G + \alpha X1 …

_onnx_axpyw2

Returns the ONNX graph for function Y, Z = f(X1, X2, G, \alpha, \beta) = (Y, Z) where Z = \beta G + \alpha X1 …

_onnx_copy

Returns the ONNX graph for function Y = X.

_onnx_grad_loss_absolute_error

Returns the ONNX graph for function Y = f(X1, X2) = \lVert X1 - X2 \rVert or Y = f(X1, X2) = \lVert w(X1 - X2) \rVert …

_onnx_grad_loss_elastic_error

Returns the ONNX graph for function Y = f(X1, X2) = \beta \lVert X1 - X2 \rVert + \alpha \lVert X1 - X2 \rVert^2 …

_onnx_grad_loss_square_error

Returns the ONNX graph for function Y = f(X1, X2) = \lVert X1 - X2 \rVert^2 or Y = f(X1, X2) = \lVert \sqrt{w}(X1 - X2) \rVert^2 …

_onnx_grad_penalty_elastic_error

Returns the ONNX graph for function Y = f(W) = \beta \lVert W \rVert + \alpha \lVert W \rVert^2 …

_onnx_grad_sigmoid_neg_log_loss_error

The function takes the raw scores from a classifier, uses the sigmoid function to compute probabilities, then the log function …

_onnx_grad_square_error

Returns the ONNX graph for the gradient of function Y = f(X1, X2) = \lVert X1 - X2 \rVert^2 or Y = f(X1, X2) = \lVert \sqrt{w}(X1 - X2) \rVert^2 …

_onnx_linear_regression

Returns the ONNX graph for function Y = f(X, A, B) = A X + B.

_onnx_n_penalty_elastic_error

Returns the ONNX graph for function Y = f(W) = \beta \lVert W \rVert + \alpha \lVert W \rVert^2 …

_onnx_square_error

Returns the ONNX graph for function Y = f(X1, X2) = \lVert X1 - X2 \rVert^2 or Y = f(X1, X2) = \lVert \sqrt{w}(X1 - X2) \rVert^2 …

_onnx_update_penalty_elastic_error

Returns the ONNX graph for function Y = f(W) = W - 2 \beta W - \alpha sign(W) l1 is \beta and …

_onnx_zero

Returns the ONNX graph for function Y = X * 0.

function_onnx_graph

Returns the ONNX graph corresponding to a function.

get_supported_functions

Returns the list of functions supported by function_onnx_graph().

Documentation#

Onnx helper.

source on GitHub

onnxcustom.utils.onnx_function._onnx_axpy(target_opset=None, dtype=<class 'numpy.float32'>)#

Returns the ONNX graph for function Y = f(X1, X2, \alpha) = \alpha X1 + X2.

source on GitHub
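As a quick check of the formula, here is a plain numpy sketch of the same computation (it mirrors the documented formula, not the graph's internals):

<<<

import numpy

# Reference implementation of Y = alpha * X1 + X2.
def axpy_reference(x1, x2, alpha):
    return alpha * x1 + x2

x1 = numpy.array([1, 2], dtype=numpy.float32)
x2 = numpy.array([3, 4], dtype=numpy.float32)
print(axpy_reference(x1, x2, numpy.float32(0.5)))

>>>

    [3.5 5. ]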

onnxcustom.utils.onnx_function._onnx_axpyw(target_opset=None, dtype=<class 'numpy.float32'>)#

Returns the ONNX graph for function Y, Z = f(X1, X2, G, \alpha, \beta) = (Y, Z) where Z = \beta G + \alpha X1 and Y = Z + X2.

source on GitHub
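A similar numpy sketch for the combined update, following the two formulas above:

<<<

import numpy

# Reference implementation: Z = beta * G + alpha * X1, Y = Z + X2.
def axpyw_reference(x1, x2, g, alpha, beta):
    z = beta * g + alpha * x1
    return z + x2, z

x1 = numpy.array([2, 4], dtype=numpy.float32)
x2 = numpy.array([1, 1], dtype=numpy.float32)
g = numpy.array([1, 2], dtype=numpy.float32)
y, z = axpyw_reference(x1, x2, g, alpha=0.5, beta=2.0)
print(y, z)

>>>

    [4. 7.] [3. 6.]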

onnxcustom.utils.onnx_function._onnx_axpyw2(target_opset=None, dtype=<class 'numpy.float32'>)#

Returns the ONNX graph for function Y, Z = f(X1, X2, G, \alpha, \beta) = (Y, Z) where Z = \beta G + \alpha X1 and Y = \beta Z + \alpha X1 + X2.

source on GitHub

onnxcustom.utils.onnx_function._onnx_copy(target_opset=None, dtype=<class 'numpy.float32'>)#

Returns the ONNX graph for function Y = X.

source on GitHub

onnxcustom.utils.onnx_function._onnx_grad_loss_absolute_error(target_opset=None, dtype=<class 'numpy.float32'>, weight_name=None)#

Returns the ONNX graph for function Y = f(X1, X2) = \lVert X1 - X2 \rVert or Y = f(X1, X2) = \lVert w(X1 - X2) \rVert if weight_name is not None, and its gradient.

source on GitHub
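A numpy sketch of the unweighted case; the gradient of the L1 loss with respect to X1 is sign(X1 - X2):

<<<

import numpy

# loss = sum(|X1 - X2|), gradient with respect to X1 = sign(X1 - X2)
def grad_loss_absolute_error_reference(x1, x2):
    d = x1 - x2
    return numpy.abs(d).sum(), numpy.sign(d)

x1 = numpy.array([0, 1], dtype=numpy.float32)
x2 = numpy.array([1, -1], dtype=numpy.float32)
loss, grad = grad_loss_absolute_error_reference(x1, x2)
print(loss, grad)

>>>

    3.0 [-1.  1.]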

onnxcustom.utils.onnx_function._onnx_grad_loss_elastic_error(target_opset=None, dtype=<class 'numpy.float32'>, weight_name=None, l1_weight=0.01, l2_weight=0.01)#

Returns the ONNX graph for function Y = f(X1, X2) = \beta \lVert X1 - X2 \rVert + \alpha \lVert X1 - X2 \rVert^2 or Y = f(X1, X2) = \beta \lVert w(X1 - X2) \rVert + \alpha \lVert \sqrt{w}(X1 - X2) \rVert^2 if weight_name is not None, and its gradient. l1_weight is \beta and l2_weight is \alpha.

source on GitHub

onnxcustom.utils.onnx_function._onnx_grad_loss_square_error(target_opset=None, dtype=<class 'numpy.float32'>, weight_name=None, multiply=2)#

Returns the ONNX graph for function Y = f(X1, X2) = \lVert X1 - X2 \rVert^2 or Y = f(X1, X2) = \lVert \sqrt{w}(X1 - X2) \rVert^2 if weight_name is not None, and its gradient.

source on GitHub
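A numpy sketch of the unweighted case, assuming the gradient is taken with respect to X1 so that it equals multiply * (X1 - X2); the sign convention of the actual graph may differ:

<<<

import numpy

# loss = sum((X1 - X2)^2), gradient = multiply * (X1 - X2)
def grad_loss_square_error_reference(x1, x2, multiply=2):
    d = x1 - x2
    return (d ** 2).sum(), multiply * d

x1 = numpy.array([0, 1], dtype=numpy.float32)
x2 = numpy.array([1, 2], dtype=numpy.float32)
loss, grad = grad_loss_square_error_reference(x1, x2)
print(loss, grad)

>>>

    2.0 [-2. -2.]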

onnxcustom.utils.onnx_function._onnx_grad_penalty_elastic_error(target_opset=None, dtype=<class 'numpy.float32'>, l1_weight=0.01, l2_weight=0.01)#

Returns the ONNX graph for function Y = f(W) = \beta \lVert W \rVert + \alpha \lVert W \rVert^2. l1_weight is \beta and l2_weight is \alpha.

source on GitHub
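A numpy sketch of the penalty and of its gradient with respect to W, which is \beta sign(W) + 2 \alpha W:

<<<

import numpy

# penalty = beta * sum(|W|) + alpha * sum(W^2)
# gradient with respect to W = beta * sign(W) + 2 * alpha * W
def penalty_elastic_reference(w, l1_weight, l2_weight):
    beta, alpha = l1_weight, l2_weight
    penalty = beta * numpy.abs(w).sum() + alpha * (w ** 2).sum()
    grad = beta * numpy.sign(w) + 2 * alpha * w
    return penalty, grad

w = numpy.array([-2, 1], dtype=numpy.float32)
penalty, grad = penalty_elastic_reference(w, l1_weight=0.5, l2_weight=0.25)
print(penalty, grad)

>>>

    2.75 [-1.5  1. ]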

onnxcustom.utils.onnx_function._onnx_grad_sigmoid_neg_log_loss_error(target_opset=None, dtype=<class 'numpy.float32'>, eps=1e-05, weight_name=None)#

The function takes the raw scores from a classifier, uses the sigmoid function to compute probabilities, then the log function to compute the loss. It creates the ONNX graph for this function and the associated gradient of the loss against the raw scores.

Probabilities (class 1): p(s) = \frac{1}{1 + \exp(-s)}. Loss (for two classes): L(y, s) = (1 - y)\log(1 - p(s)) + y \log(p(s)). Gradient: \frac{dL(y, s)}{ds} = y - p(s). To avoid nan values, probabilities are clipped: p(s) = \max(\min(p(s), 1 - \epsilon), \epsilon). y \in \{0, 1\} (integer), s is a float.

Parameters:

eps – to clip probabilities and avoid computing log(0)

source on GitHub
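A numpy sketch following the formulas above, clipping included:

<<<

import numpy

# p(s) = 1 / (1 + exp(-s)) clipped into [eps, 1 - eps],
# loss L(y, s) = (1 - y) * log(1 - p) + y * log(p),
# gradient dL/ds = y - p(s)
def sigmoid_neg_log_loss_reference(y, s, eps=1e-5):
    p = 1 / (1 + numpy.exp(-s))
    p = numpy.clip(p, eps, 1 - eps)
    loss = (1 - y) * numpy.log(1 - p) + y * numpy.log(p)
    return loss, y - p

y = numpy.array([0, 1], dtype=numpy.float32)
s = numpy.array([-2, 3], dtype=numpy.float32)
loss, grad = sigmoid_neg_log_loss_reference(y, s)
print(loss.round(4), grad.round(4))

>>>

    [-0.1269 -0.0486] [-0.1192  0.0474]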

onnxcustom.utils.onnx_function._onnx_grad_square_error(target_opset=None, dtype=<class 'numpy.float32'>, weight_name=None)#

Returns the ONNX graph for the gradient of function Y = f(X1, X2) = \lVert X1 - X2 \rVert^2 or Y = f(X1, X2) = \lVert \sqrt{w}(X1 - X2) \rVert^2 if weight_name is not None.

source on GitHub

onnxcustom.utils.onnx_function._onnx_linear_regression(target_opset=None, dtype=<class 'numpy.float32'>)#

Returns the ONNX graph for function Y = f(X, A, B) = A X + B.

source on GitHub
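A plain numpy reading of the formula; the actual graph may lay out the inputs differently (for instance with transposed shapes):

<<<

import numpy

# Reference implementation of Y = A X + B.
def linear_regression_reference(x, a, b):
    return a @ x + b

a = numpy.array([[1, 2]], dtype=numpy.float32)      # coefficients
x = numpy.array([[3], [4]], dtype=numpy.float32)    # features
b = numpy.array([[0.5]], dtype=numpy.float32)       # bias
print(linear_regression_reference(x, a, b))

>>>

    [[11.5]]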

onnxcustom.utils.onnx_function._onnx_n_penalty_elastic_error(target_opset=None, dtype=<class 'numpy.float32'>, weight_name=None, l1_weight=0.01, l2_weight=0.01, n_tensors=1, loss_shape=(1, 1))#

Returns the ONNX graph for function Y = f(W) = \beta \lVert W \rVert + \alpha \lVert W \rVert^2. l1_weight is \beta and l2_weight is \alpha. It does that for n_tensors and adds all of the results to an input loss.

source on GitHub

onnxcustom.utils.onnx_function._onnx_square_error(target_opset=None, dtype=<class 'numpy.float32'>, weight_name=None)#

Returns the ONNX graph for function Y = f(X1, X2) = \lVert X1 - X2 \rVert^2 or Y = f(X1, X2) = \lVert \sqrt{w}(X1 - X2) \rVert^2 if weight_name is not None.

source on GitHub

onnxcustom.utils.onnx_function._onnx_update_penalty_elastic_error(target_opset=None, dtype=<class 'numpy.float32'>, l1=0.0001, l2=0.0001)#

Returns the ONNX graph for function Y = f(W) = W - 2 \beta W - \alpha sign(W). l1 is \beta and l2 is \alpha.

source on GitHub
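A numpy sketch of the update; per the docstring, l1 plays the role of \beta and l2 the role of \alpha:

<<<

import numpy

# Y = W - 2 * beta * W - alpha * sign(W)
def update_penalty_reference(w, l1=0.0001, l2=0.0001):
    beta, alpha = l1, l2
    return w - 2 * beta * w - alpha * numpy.sign(w)

w = numpy.array([-1, 0, 2], dtype=numpy.float32)
print(update_penalty_reference(w, l1=0.05, l2=0.25))

>>>

    [-0.65  0.    1.55]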

onnxcustom.utils.onnx_function._onnx_zero(target_opset=None, dtype=<class 'numpy.float32'>)#

Returns the ONNX graph for function Y = X * 0.

source on GitHub

onnxcustom.utils.onnx_function.function_onnx_graph(name, target_opset=None, dtype=<class 'numpy.float32'>, weight_name=None, **kwargs)#

Returns the ONNX graph corresponding to a function.

Parameters:
  • name – name of the function to build (see get_supported_functions())

  • target_opset – opset version; if None, target_opset is replaced by the latest supported opset, defined by __max_supported_opset__ in the main __init__.py of this package

  • dtype – computation type

  • weight_name – weight name if any

  • kwargs – additional parameters

Returns:

ONNX graph

A wrong name raises an exception listing all the supported functions. An example with function square_error:

<<<

import numpy
from onnxruntime import InferenceSession
from onnxcustom.utils.onnx_function import function_onnx_graph

model_onnx = function_onnx_graph('square_error')
sess = InferenceSession(model_onnx.SerializeToString())
res = sess.run(None, {
    'X1': numpy.array([[0, 1]], dtype=numpy.float32).T,
    'X2': numpy.array([[1, 2]], dtype=numpy.float32).T})
print(res[0])

>>>

    [2.]
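
The returned object is a standard onnx ModelProto, so the usual onnx API applies, for instance to inspect the graph inputs (for square_error these are X1 and X2, as used above):

<<<

from onnxcustom.utils.onnx_function import function_onnx_graph

model_onnx = function_onnx_graph('square_error')
# A ModelProto can be inspected with the regular onnx API.
print([i.name for i in model_onnx.graph.input])

>>>

    ['X1', 'X2']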

List of supported functions:

<<<

from onnxcustom.utils.onnx_function import get_supported_functions
print("\n".join(sorted(get_supported_functions())))

>>>

    axpy
    axpyw
    axpyw2
    copy
    grad_loss_absolute_error
    grad_loss_elastic_error
    grad_loss_square_error
    grad_penalty_elastic_error
    grad_sigmoid_neg_log_loss_error
    grad_square_error
    linear_regression
    n_penalty_elastic_error
    square_error
    update_penalty_elastic_error
    zero

source on GitHub

onnxcustom.utils.onnx_function.get_supported_functions()#

Returns the list of functions supported by function_onnx_graph().

source on GitHub