.. _l-tutorials:

Tutorials
=========

.. contents::
    :local:

ONNX ecosystem
++++++++++++++

The following tutorials introduce the :epkg:`ONNX` ecosystem. They walk the
user through the ONNX specifications, how to execute an ONNX graph,
how to create an ONNX graph, how to convert a model from :epkg:`scikit-learn`,
and how to train models with :epkg:`onnxruntime-training`.

.. toctree::
    :maxdepth: 2

    tutorial_onnx/index
    tutorial_onnxruntime/index
    tutorial_skl/index
    tutorial_training/index
    tutorial_bench/index
    tutorial_parallel/index

Readings
++++++++

* `Add AI to mobile applications with Xamarin and ONNX Runtime
  `_
* `Announcing ONNX Runtime Availability in the NVIDIA Jetson Zoo for High Performance Inferencing
  `_
  (8/2021)
* `Speeding Up Deep Learning Inference Using TensorFlow, ONNX, and NVIDIA TensorRT
  `_
  (7/2021)
* `Journey to optimize large scale transformer model inference with ONNX Runtime
  `_
  (6/2021)
* `Accelerating Model Training with the ONNX Runtime
  `_
  (5/2020)
* `Accelerate and simplify Scikit-learn model inference with ONNX Runtime
  `_
  (12/2020)
* `Model Persistence scikit-learn and ONNX
  `_,
  short talk at `scikit-learn foundation `_
  (2019)

Current documentation of ONNX and onnxruntime
+++++++++++++++++++++++++++++++++++++++++++++

Most of the documentation related to :epkg:`onnx` and :epkg:`onnxruntime`
is written in :epkg:`markdown`. The following section is an attempt
to render it and make it searchable.

.. toctree::
    :maxdepth: 2

    onnxmd/index

Build
+++++

Some useful pages.

* :ref:`Build onnxruntime on WSL (Windows Linux Subsystem) (2021) `.