module utils.onnx_function#
Short summary#
module onnxcustom.utils.onnx_function
Onnx helper.
Functions#
function | truncated documentation
---|---
_onnx_axpy | Returns the ONNX graph for the axpy function.
_onnx_axpyw | Returns the ONNX graph for the axpyw function.
_onnx_axpyw2 | Returns the ONNX graph for the axpyw2 function.
_onnx_copy | Returns the ONNX graph for the copy function.
_onnx_grad_loss_absolute_error | Returns the ONNX graph for the absolute-error loss and its gradient.
_onnx_grad_loss_elastic_error | Returns the ONNX graph for the elastic loss and its gradient.
_onnx_grad_loss_square_error | Returns the ONNX graph for the square-error loss and its gradient.
_onnx_grad_penalty_elastic_error | Returns the ONNX graph for the elastic penalty and its gradient.
_onnx_grad_sigmoid_neg_log_loss_error | The function takes the raw scores from a classifier, uses the sigmoid function to compute probabilities, then the log function …
_onnx_grad_square_error | Returns the ONNX graph for the gradient of the square-error function.
_onnx_linear_regression | Returns the ONNX graph for a linear regression.
_onnx_n_penalty_elastic_error | Returns the ONNX graph for the elastic penalty computed over n_tensors tensors and added to an input loss.
_onnx_square_error | Returns the ONNX graph for the square-error function.
_onnx_update_penalty_elastic_error | Returns the ONNX graph for the elastic penalty update.
_onnx_zero | Returns the ONNX graph for a function returning zeros.
function_onnx_graph | Returns the ONNX graph corresponding to a function.
get_supported_functions | Returns the list of functions supported by function_onnx_graph.
Documentation#
Onnx helper.
- onnxcustom.utils.onnx_function._onnx_axpy(target_opset=None, dtype=<class 'numpy.float32'>)#
Returns the ONNX graph for function Y = f(X1, X2, α) = α X1 + X2.
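As a rough reference for what this graph computes, here is a NumPy sketch. The formula Y = α X1 + X2 follows the BLAS axpy convention; the input names are assumptions for illustration, not the exact graph definition.

```python
import numpy

def axpy_reference(x1, x2, alpha):
    # NumPy reference for the assumed axpy formula: Y = alpha * X1 + X2
    return alpha * x1 + x2

y = axpy_reference(numpy.array([1.0, 2.0], dtype=numpy.float32),
                   numpy.array([10.0, 20.0], dtype=numpy.float32),
                   numpy.float32(0.5))
```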
- onnxcustom.utils.onnx_function._onnx_axpyw(target_opset=None, dtype=<class 'numpy.float32'>)#
Returns the ONNX graph for function Y, Z = f(X1, X2, G, α, β) where Z = β G + α X1 and Y = Z + X2.
- onnxcustom.utils.onnx_function._onnx_axpyw2(target_opset=None, dtype=<class 'numpy.float32'>)#
Returns the ONNX graph for function Y, Z = f(X1, X2, G, α, β) where Z = β G + α X1 and Y = β Z + α X1 + X2.
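The axpyw variant resembles a classic momentum-style update. A NumPy sketch under that assumption (the exact formulas and input names are inferred from the function name, not confirmed by the source):

```python
import numpy

def axpyw_reference(x1, x2, g, alpha, beta):
    # Assumed semantics: accumulator Z = beta * G + alpha * X1,
    # output Y = Z + X2 (a classic momentum-style update)
    z = beta * g + alpha * x1
    y = z + x2
    return y, z

grad = numpy.array([0.1, -0.2])
weights = numpy.array([1.0, 1.0])
velocity = numpy.zeros(2)
new_weights, new_velocity = axpyw_reference(grad, weights, velocity,
                                            alpha=-0.1, beta=0.9)
```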
- onnxcustom.utils.onnx_function._onnx_copy(target_opset=None, dtype=<class 'numpy.float32'>)#
Returns the ONNX graph for function Y = f(X) = X.
- onnxcustom.utils.onnx_function._onnx_grad_loss_absolute_error(target_opset=None, dtype=<class 'numpy.float32'>, weight_name=None)#
Returns the ONNX graph for function Y = f(X1, X2) = sum(|X1 - X2|) or Y = f(X1, X2, w) = sum(|w (X1 - X2)|) if weight_name is not None, and its gradient.
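A NumPy sketch of such a loss and its (sub)gradient with respect to X1, assuming the formulas above; the actual graph defines input names and the gradient convention precisely.

```python
import numpy

def abs_error_reference(x1, x2, w=None):
    # loss = sum(|X1 - X2|), or sum(|w * (X1 - X2)|) when weighted
    d = x1 - x2
    if w is not None:
        d = w * d
    loss = numpy.abs(d).sum()
    # subgradient of the loss with respect to X1
    grad = numpy.sign(d) if w is None else w * numpy.sign(d)
    return loss, grad

loss, grad = abs_error_reference(numpy.array([1.0, -3.0]),
                                 numpy.array([0.0, 0.0]))
```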
- onnxcustom.utils.onnx_function._onnx_grad_loss_elastic_error(target_opset=None, dtype=<class 'numpy.float32'>, weight_name=None, l1_weight=0.01, l2_weight=0.01)#
Returns the ONNX graph for function Y = f(X1, X2) = α sum(|X1 - X2|) + β sum((X1 - X2)^2) or its weighted version if weight_name is not None, and its gradient. l1_weight is α and l2_weight is β.
- onnxcustom.utils.onnx_function._onnx_grad_loss_square_error(target_opset=None, dtype=<class 'numpy.float32'>, weight_name=None, multiply=2)#
Returns the ONNX graph for function Y = f(X1, X2) = sum((X1 - X2)^2) or Y = f(X1, X2, w) = sum(w (X1 - X2)^2) if weight_name is not None, and its gradient.
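A NumPy sketch of the assumed loss and its gradient with respect to X1; for X1 = [0, 1] and X2 = [1, 2] it gives a loss of 2, consistent with the square_error example later on this page. Input names and the role of multiply are assumptions.

```python
import numpy

def square_error_reference(x1, x2, w=None, multiply=2):
    # loss = sum((X1 - X2)^2), optionally weighted by w
    d = x1 - x2
    loss = (d * d if w is None else w * d * d).sum()
    # gradient with respect to X1: multiply * (X1 - X2), weighted if needed
    grad = multiply * d if w is None else multiply * w * d
    return loss, grad

loss, grad = square_error_reference(numpy.array([0.0, 1.0]),
                                    numpy.array([1.0, 2.0]))
```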
- onnxcustom.utils.onnx_function._onnx_grad_penalty_elastic_error(target_opset=None, dtype=<class 'numpy.float32'>, l1_weight=0.01, l2_weight=0.01)#
Returns the ONNX graph for function Y = f(W) = α sum(|W|) + β sum(W^2). l1_weight is α and l2_weight is β.
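The standard elastic-net penalty and its gradient can be sketched in NumPy; this is a reference under the assumed formula, not the graph itself.

```python
import numpy

def elastic_penalty_reference(w, l1=0.01, l2=0.01):
    # penalty = l1 * sum(|W|) + l2 * sum(W^2)
    penalty = l1 * numpy.abs(w).sum() + l2 * (w * w).sum()
    # gradient: l1 * sign(W) + 2 * l2 * W
    grad = l1 * numpy.sign(w) + 2 * l2 * w
    return penalty, grad

penalty, grad = elastic_penalty_reference(numpy.array([1.0, -2.0]))
```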
- onnxcustom.utils.onnx_function._onnx_grad_sigmoid_neg_log_loss_error(target_opset=None, dtype=<class 'numpy.float32'>, eps=1e-05, weight_name=None)#
The function takes the raw scores from a classifier, uses the sigmoid function to compute probabilities, then the log function to compute the loss. It creates the ONNX graph for this function and the associated gradient of the loss against the raw scores.
Probabilities (class 1): p = 1 / (1 + e^(-s)). Loss (for two classes): L = -(y log(p) + (1 - y) log(1 - p)). Gradient: dL/ds = p - y. To avoid nan values, probabilities are clipped: p = max(min(p, 1 - eps), eps).
y is the expected class (integer). s is a float.
- Parameters:
eps – to clip probabilities and avoid computing log(0)
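The formulas above can be checked with a small NumPy reference implementation (an illustration of the math, not the ONNX graph itself):

```python
import numpy

def sigmoid_neg_log_loss_reference(y, s, eps=1e-5):
    # probability of class 1 from the raw score
    p = 1.0 / (1.0 + numpy.exp(-s))
    # clip to avoid computing log(0)
    p = numpy.clip(p, eps, 1 - eps)
    # negative log-likelihood for two classes
    loss = -(y * numpy.log(p) + (1 - y) * numpy.log(1 - p)).sum()
    # gradient of the loss with respect to the raw score
    grad = p - y
    return loss, grad

loss, grad = sigmoid_neg_log_loss_reference(numpy.array([1.0, 0.0]),
                                            numpy.array([0.0, 0.0]))
```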
- onnxcustom.utils.onnx_function._onnx_grad_square_error(target_opset=None, dtype=<class 'numpy.float32'>, weight_name=None)#
Returns the ONNX graph for the gradient of function Y = f(X1, X2) = sum((X1 - X2)^2) or Y = f(X1, X2, w) = sum(w (X1 - X2)^2) if weight_name is not None.
- onnxcustom.utils.onnx_function._onnx_linear_regression(target_opset=None, dtype=<class 'numpy.float32'>)#
Returns the ONNX graph for function Y = f(X, A, B) = X A + B.
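A NumPy sketch of a plain linear regression; the input names X, A, B and the exact formula are assumptions for illustration.

```python
import numpy

def linear_regression_reference(x, a, b):
    # Y = X @ A + B: matrix product of the features with the
    # coefficients, plus the intercept
    return x @ a + b

y = linear_regression_reference(numpy.array([[1.0, 2.0]]),
                                numpy.array([[1.0], [1.0]]),
                                numpy.array([0.5]))
```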
- onnxcustom.utils.onnx_function._onnx_n_penalty_elastic_error(target_opset=None, dtype=<class 'numpy.float32'>, weight_name=None, l1_weight=0.01, l2_weight=0.01, n_tensors=1, loss_shape=(1, 1))#
Returns the ONNX graph for function Y = f(W) = α sum(|W|) + β sum(W^2). l1_weight is α and l2_weight is β. It does that for n_tensors and adds all of the results to an input loss.
- onnxcustom.utils.onnx_function._onnx_square_error(target_opset=None, dtype=<class 'numpy.float32'>, weight_name=None)#
Returns the ONNX graph for function Y = f(X1, X2) = sum((X1 - X2)^2) or Y = f(X1, X2, w) = sum(w (X1 - X2)^2) if weight_name is not None.
- onnxcustom.utils.onnx_function._onnx_update_penalty_elastic_error(target_opset=None, dtype=<class 'numpy.float32'>, l1=0.0001, l2=0.0001)#
Returns the ONNX graph for function Y = f(W) = W - α sign(W) - 2 β W. l1 is α and l2 is β.
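A NumPy sketch of such an update, assuming a proximal-style step for the elastic-net penalty (the exact formula is inferred from the function name, not confirmed by the source):

```python
import numpy

def update_penalty_reference(w, l1=0.0001, l2=0.0001):
    # assumed update: W - l1 * sign(W) - 2 * l2 * W,
    # shrinking W toward zero under an elastic-net penalty
    return w - l1 * numpy.sign(w) - 2 * l2 * w

updated = update_penalty_reference(numpy.array([1.0, -1.0]))
```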
- onnxcustom.utils.onnx_function._onnx_zero(target_opset=None, dtype=<class 'numpy.float32'>)#
Returns the ONNX graph for function Y = f(X) = X * 0.
- onnxcustom.utils.onnx_function.function_onnx_graph(name, target_opset=None, dtype=<class 'numpy.float32'>, weight_name=None, **kwargs)#
Returns the ONNX graph corresponding to a function.
- Parameters:
name – function name
target_opset – opset version, if None, target_opset is replaced by the latest supported opset defined in the main __init__.py of this package in __max_supported_opset__
dtype – computation type
weight_name – weight name if any
kwargs – additional parameters
- Returns:
ONNX graph
A wrong name will raise an exception listing all supported functions. An example with function square_error:
<<<
import numpy
from onnxruntime import InferenceSession
from onnxcustom.utils.onnx_function import function_onnx_graph

model_onnx = function_onnx_graph('square_error')
sess = InferenceSession(model_onnx.SerializeToString())
res = sess.run(None, {
    'X1': numpy.array([[0, 1]], dtype=numpy.float32).T,
    'X2': numpy.array([[1, 2]], dtype=numpy.float32).T})
print(res[0])
>>>
[2.]
List of supported functions:
<<<
from onnxcustom.utils.onnx_function import get_supported_functions

print("\n".join(sorted(get_supported_functions())))
>>>
axpy
axpyw
axpyw2
copy
grad_loss_absolute_error
grad_loss_elastic_error
grad_loss_square_error
grad_penalty_elastic_error
grad_sigmoid_neg_log_loss_error
grad_square_error
linear_regression
n_penalty_elastic_error
square_error
update_penalty_elastic_error
zero
- onnxcustom.utils.onnx_function.get_supported_functions()#
Returns the list of functions supported by function_onnx_graph.