ONNX Runtime Open Sourced


Recently, at its Connect(); conference 2018, Microsoft has announced the open sourcing of its Open Neural Network Exchange (ONNX) Runtime on GitHub.
 
ONNX Runtime is a high-performance inference engine for machine learning models in the ONNX format. It can be customized and integrated directly into existing codebases or compiled from source to run on Windows 10, Linux, and a variety of other operating systems. 
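To illustrate the kind of integration described above, here is a minimal sketch using the onnxruntime Python package. The model path, input name, and input shape below are placeholders, not part of the announcement; substitute the details of your own exported ONNX model.

```python
import numpy as np
import onnxruntime as ort

# Load an ONNX model from disk ("model.onnx" is a placeholder path).
session = ort.InferenceSession("model.onnx")

# Query the model's declared first input so we can feed a correctly named tensor.
input_meta = session.get_inputs()[0]
dummy_input = np.random.rand(1, 3, 224, 224).astype(np.float32)  # example shape only

# Run inference; passing None as the output list returns all model outputs.
outputs = session.run(None, {input_meta.name: dummy_input})
print(outputs[0].shape)
```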
 
"With teams using many different training frameworks and targeting different deployment options, there was a real need to unify these scattered solutions to make it quick and simple to operationalize models. ONNX Runtime provides that solution. It gives data scientists the flexibility to train and tune models in the framework of their choice and productionize these models with high performance in products spanning both cloud and edge." wrote Faith Xu, Senior Program Manager, Machine Learning Platform, Microsoft
Image: ONNX Runtime is open source (Source: Microsoft)
 
According to the company, Intel and Microsoft are working together to integrate the nGraph compiler as an execution provider for ONNX Runtime; Nvidia is helping integrate TensorRT; and Qualcomm has expressed its support as well.
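Hardware-specific backends such as these are exposed through ONNX Runtime's execution provider mechanism. The following sketch, assuming a build of onnxruntime with the relevant providers compiled in and a placeholder model path, shows how a caller can inspect and prioritize them; the runtime falls back to the next provider in the list when one is unavailable.

```python
import onnxruntime as ort

# List the execution providers compiled into this build of ONNX Runtime.
print(ort.get_available_providers())

# Prefer TensorRT, then CUDA, then plain CPU for this session.
session = ort.InferenceSession(
    "model.onnx",  # placeholder path to an ONNX model
    providers=[
        "TensorrtExecutionProvider",
        "CUDAExecutionProvider",
        "CPUExecutionProvider",
    ],
)
```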
  
Leading IoT chip maker NXP also recently announced support. "When it comes to choosing from among the many machine learning frameworks, we want our customers to have maximum flexibility and freedom. We're happy to bring the ONNX benefits to our customer community of ML developers by supporting the ONNX Runtime released by Microsoft in our platform," says Markus Levy, head of the AI Technology Center at NXP.
The release of ONNX Runtime is a significant step towards an open and interoperable ecosystem for AI, the company says.
 
You can read the official announcement here, and visit the ONNX Runtime GitHub page here.