API Summary

Summary of public functions and classes exposed in onnxmltools.

Converters

onnxmltools.convert.coreml.convert(model, name=None, initial_types=None, doc_string='', target_opset=None, targeted_onnx='1.10.2', custom_conversion_functions=None, custom_shape_calculators=None)[source]

This function converts the specified CoreML model into its ONNX counterpart. Some information such as the produced ONNX model name can be specified.

Parameters
  • model – A CoreML model or a CoreML MLModel object

  • initial_types – A list providing some types for some root variables. Each element is a tuple of a variable name and a type defined in data_types.py.

  • name – The name of the graph (type: GraphProto) in the produced ONNX model (type: ModelProto)

  • doc_string – A string attached onto the produced ONNX model

  • target_opset – number, for example, 7 for ONNX 1.2, and 8 for ONNX 1.3.

  • targeted_onnx – A string (for example, ‘1.1.2’ and ‘1.2’) used to specify the targeted ONNX version of the produced model. If ONNXMLTools cannot find a compatible ONNX python package, an error may be thrown.

  • custom_conversion_functions – a dictionary for specifying the user customized conversion function

  • custom_shape_calculators – a dictionary for specifying the user customized shape calculator

Returns

An ONNX model (type: ModelProto) which is equivalent to the input CoreML model

Example of initial types: Assume that ‘A’ and ‘B’ are two root variable names used in the CoreML model you want to convert. We can specify their types via:

from onnxmltools.convert.common.data_types import FloatTensorType
initial_type = [('A', FloatTensorType([40, 12, 1, 1])),
                ('B', FloatTensorType([1, 32, 1, 1]))]
onnxmltools.convert.h2o.convert(model, name=None, initial_types=None, doc_string='', target_opset=None, targeted_onnx='1.10.2', custom_conversion_functions=None, custom_shape_calculators=None)[source]

This function produces an equivalent ONNX model of the given H2O MOJO model.

Supported model types:

  • GBM, with limitations:

      • poisson, gamma, and tweedie distributions are not supported

      • multinomial distribution is supported with 3 or more classes (use binomial otherwise)

Other limitations:

  • models with categorical splits are not supported

Parameters
  • model – H2O MOJO model loaded into memory (see below for example)

  • name – The name of the graph (type: GraphProto) in the produced ONNX model (type: ModelProto)

  • initial_types – a python list. Each element is a tuple of a variable name and a type defined in data_types.py

  • doc_string – A string attached onto the produced ONNX model

  • target_opset – number, for example, 7 for ONNX 1.2, and 8 for ONNX 1.3.

  • targeted_onnx – A string (for example, ‘1.1.2’ and ‘1.2’) used to specify the targeted ONNX version of the produced model. If ONNXMLTools cannot find a compatible ONNX python package, an error may be thrown.

  • custom_conversion_functions – a dictionary for specifying the user customized conversion function

  • custom_shape_calculators – a dictionary for specifying the user customized shape calculator

Returns

An ONNX model (type: ModelProto) which is equivalent to the input H2O model

Examples

>>> from onnxmltools.convert import convert_h2o
>>> file = open("/path/to/h2o_mojo.zip", "rb")
>>> mojo_content = file.read()
>>> file.close()
>>> h2o_onnx_model = convert_h2o(mojo_content)
onnxmltools.convert.lightgbm.convert(model, name=None, initial_types=None, doc_string='', target_opset=None, targeted_onnx='1.10.2', custom_conversion_functions=None, custom_shape_calculators=None, without_onnx_ml=False, zipmap=True, split=None)[source]

This function produces an equivalent ONNX model of the given lightgbm model. The supported lightgbm modules are listed below.

Parameters
  • model – A LightGBM model

  • initial_types – a python list. Each element is a tuple of a variable name and a type defined in data_types.py

  • name – The name of the graph (type: GraphProto) in the produced ONNX model (type: ModelProto)

  • doc_string – A string attached onto the produced ONNX model

  • target_opset – number, for example, 7 for ONNX 1.2, and 8 for ONNX 1.3.

  • targeted_onnx – A string (for example, ‘1.1.2’ and ‘1.2’) used to specify the targeted ONNX version of the produced model. If ONNXMLTools cannot find a compatible ONNX python package, an error may be thrown.

  • custom_conversion_functions – a dictionary for specifying the user customized conversion function

  • custom_shape_calculators – a dictionary for specifying the user customized shape calculator

  • without_onnx_ml – whether to generate a model composed of standard ONNX operators only, or to allow the converter to use ONNX-ML operators as well

  • zipmap – whether to keep the ZipMap operator in the ONNX graph; set to False to remove it

  • split – this parameter is useful to reduce discrepancies for large regression forests (more than about 100 trees). LightGBM does all its computation with doubles whereas ONNX uses floats. Instead of a single TreeEnsembleRegressor node, the converter splits the forest into multiple TreeEnsembleRegressor nodes, casts each partial output to double, and then sums them. The final graph is slower but keeps the discrepancy bounded (it is proportional to the number of trees per TreeEnsembleRegressor node). Parameter split is the number of trees per node. The same could be done with TreeEnsembleClassifier; however, the normalization of the probabilities already significantly reduces the discrepancies there.

Returns

An ONNX model (type: ModelProto) which is equivalent to the input lightgbm model

onnxmltools.convert.xgboost.convert(model, name=None, initial_types=None, doc_string='', target_opset=None, targeted_onnx='1.10.2', custom_conversion_functions=None, custom_shape_calculators=None)[source]

This function produces an equivalent ONNX model of the given xgboost model.

Parameters
  • model – An xgboost model

  • initial_types – a python list. Each element is a tuple of a variable name and a type defined in data_types.py

  • name – The name of the graph (type: GraphProto) in the produced ONNX model (type: ModelProto)

  • doc_string – A string attached onto the produced ONNX model

  • target_opset – number, for example, 7 for ONNX 1.2, and 8 for ONNX 1.3.

  • targeted_onnx – A string (for example, ‘1.1.2’ and ‘1.2’) used to specify the targeted ONNX version of the produced model. If ONNXMLTools cannot find a compatible ONNX python package, an error may be thrown.

  • custom_conversion_functions – a dictionary for specifying the user customized conversion function

  • custom_shape_calculators – a dictionary for specifying the user customized shape calculator

Returns

An ONNX model (type: ModelProto) which is equivalent to the input xgboost model

Utils

onnxmltools.utils.visualize_model(onnx_model, open_browser=True, dest='index.html')[source]

Creates a graph visualization of an ONNX protobuf model. It creates an SVG graph with d3.js and stores it in a file.

Parameters
  • onnx_model – ONNX model (protobuf object)

  • open_browser – whether to open the result in a browser

  • dest – destination file

Example:

from onnxmltools.utils import visualize_model
visualize_model(model)
onnxmltools.utils.dump_data_and_model(data, model, onnx=None, basename='model', folder=None, inputs=None, backend='onnxruntime', context=None, allow_failure=None, verbose=False)[source]

Saves the data with pickle, saves the model with pickle and ONNX, then runs and saves the predictions for the given model. This function is used to test a backend (runtime) for ONNX.

Parameters
  • data – any kind of data

  • model – any model

  • onnx – ONNX model, or None to let onnxmltools convert the model (only possible if the model accepts a single float vector)

  • basename – three files are written: <basename>.data.pkl, <basename>.model.pkl, <basename>.model.onnx

  • folder – files are written in this folder; it is created if it does not exist. If folder is None, the environment variable ONNXTESTDUMP is checked first; otherwise the files are placed into 'tests'.

  • inputs – standard type, or a specific one if given; only used if parameter onnx is None

  • backend – backend used to compare expected output and runtime output. Two options are currently supported: None for no test, ‘onnxruntime’ to use module onnxruntime.

  • context – used if the model contains a custom operator

  • allow_failure – None to raise an exception if comparison fails for the backends, otherwise a string which is then evaluated to check whether or not the test can fail, example: "StrictVersion(onnx.__version__) < StrictVersion('1.3.0')"

  • verbose – prints more information when it fails

Returns

the created files

Naming conventions for basename: Bin for a binary classifier, Mcl for a multiclass classifier, Reg for a regressor, MRg for a multi-regressor. The name can also contain flags. Expected outputs refer to the outputs computed with the original library; computed outputs refer to the outputs computed with an ONNX runtime.

  • -CannotLoad: the model can be converted but the runtime cannot load it

  • -Dec3: compares expected and computed outputs up to 3 decimals (5 by default)

  • -Dec4: compares expected and computed outputs up to 4 decimals (5 by default)

  • -NoProb: the original model computes probabilities for two classes with shape (N, 2) but the runtime produces a vector of size N; the test compares the second column of the expected output to the runtime output

  • -OneOff: the ONNX runtime cannot compute the prediction for several inputs at once; it must be called once per input and the computed outputs are compared individually

  • -Out0: only compares the first output on both sides

  • -Reshape: merges all outputs into one single vector and resizes it before comparing

  • -SkipDim1: before comparing expected and computed outputs, arrays with a shape like (2, 1, 2) become (2, 2)

If the backend is not None, the function either raises an exception if the comparison between the expected outputs and the backend outputs fails or it saves the backend output and adds it to the results.