
onnxruntime · ⭐ 17,875 · 🍴 3,453 · C++

Project Description

ONNX Runtime: cross-platform, high performance ML inferencing and training accelerator

Project Title

onnxruntime: Cross-platform, high performance ML inferencing and training accelerator

Overview

ONNX Runtime is a cross-platform machine-learning accelerator for both inference and training, aimed at faster customer experiences and lower costs. It runs models exported from a variety of deep learning frameworks and classical machine-learning libraries, and works across different hardware, drivers, and operating systems. Where applicable, it leverages hardware accelerators alongside graph optimizations and transforms to deliver the best available performance.
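
The usual entry point is an inference session that loads an ONNX model, applies graph optimizations, and dispatches execution to the available hardware. The sketch below uses the Python API against a hypothetical model.onnx with a single float32 input; the file name and shapes are illustrative assumptions, not taken from the project itself.

```python
# Minimal inference sketch (assumes onnxruntime and numpy are installed;
# "model.onnx" and its input shape are hypothetical).
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("model.onnx",
                               providers=["CPUExecutionProvider"])  # load and optimize the graph
input_name = session.get_inputs()[0].name                           # query the model's declared input name
dummy = np.random.rand(1, 3, 224, 224).astype(np.float32)
outputs = session.run(None, {input_name: dummy})                    # None -> return every model output
print(outputs[0].shape)
```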

Key Features

  • Cross-platform inference and training support
  • Compatibility with multiple deep learning frameworks and classical machine learning libraries
  • Hardware accelerator support and graph optimizations for optimal performance (see the configuration sketch after this list)
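
As a sketch of the optimization and accelerator hooks mentioned above: SessionOptions controls how aggressively the graph is rewritten, and the providers list selects which execution providers (hardware backends) are tried, in priority order. The paths and provider choice below are assumptions for illustration.

```python
# Sketch: configure graph optimizations and pick an execution provider
# (assumes onnxruntime is installed; "model.onnx" is a placeholder path).
import onnxruntime as ort

opts = ort.SessionOptions()
# Apply the full set of graph optimizations (constant folding, operator fusion, ...).
opts.graph_optimization_level = ort.GraphOptimizationLevel.ORT_ENABLE_ALL
# Optionally dump the optimized graph to disk for inspection.
opts.optimized_model_filepath = "model.optimized.onnx"

session = ort.InferenceSession(
    "model.onnx",
    sess_options=opts,
    providers=["CPUExecutionProvider"],  # swap in e.g. CUDAExecutionProvider on a GPU build
)
print(session.get_providers())  # the providers actually in use
```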

Use Cases

  • Accelerating training of transformer models on multi-node NVIDIA GPUs
  • Enabling faster customer experiences and lower costs by running models exported from a variety of frameworks (a conversion sketch follows this list)
  • Leveraging hardware accelerators to optimize performance across different hardware, drivers, and operating systems
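
A common workflow behind these use cases is exporting a model from its training framework to ONNX and then serving it with ONNX Runtime. The sketch below exports a tiny hypothetical PyTorch model and checks that ONNX Runtime reproduces its output; the model, file name, and tolerance are illustrative assumptions.

```python
# Sketch: export a small PyTorch model to ONNX, then run it with ONNX Runtime
# (assumes torch, numpy, and onnxruntime are installed; names are placeholders).
import numpy as np
import torch
import onnxruntime as ort

model = torch.nn.Sequential(
    torch.nn.Linear(4, 8), torch.nn.ReLU(), torch.nn.Linear(8, 2)
)
model.eval()
example = torch.randn(1, 4)

# Export to the ONNX format understood by ONNX Runtime.
torch.onnx.export(model, example, "tiny.onnx",
                  input_names=["input"], output_names=["output"])

session = ort.InferenceSession("tiny.onnx", providers=["CPUExecutionProvider"])
ort_out = session.run(None, {"input": example.numpy()})[0]

# Sanity-check the exported graph against the original PyTorch output.
with torch.no_grad():
    torch_out = model(example).numpy()
print(np.allclose(ort_out, torch_out, atol=1e-5))
```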

Advantages

  • Supports a wide range of deep learning frameworks and classical machine learning libraries
  • Optimizes performance by leveraging hardware accelerators and graph optimizations
  • Compatible with different hardware, drivers, and operating systems

Limitations / Considerations

  • Windows distributions may collect usage data and send it to Microsoft
  • May require additional setup and configuration for specific hardware accelerators (see the provider check after this list)
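
For the hardware-accelerator point above, one way to keep setup robust is to ask the installed build which execution providers it actually offers and fall back to the CPU provider otherwise; GPU or NPU providers typically come from separate packages (for example onnxruntime-gpu) plus matching drivers. The telemetry call in the final comment is an assumption about recent official Python wheels; verify it exists in your installed version.

```python
# Sketch: request a hardware provider only if the installed build offers it
# (assumes onnxruntime is installed; "model.onnx" is a placeholder path).
import onnxruntime as ort

available = ort.get_available_providers()
print(available)  # e.g. ['CUDAExecutionProvider', 'CPUExecutionProvider'] on a GPU build

preferred = ["CUDAExecutionProvider", "CPUExecutionProvider"]
providers = [p for p in preferred if p in available] or ["CPUExecutionProvider"]

session = ort.InferenceSession("model.onnx", providers=providers)

# Telemetry (Windows builds): assumed to be controllable via
# ort.disable_telemetry_events() in recent Python packages; check your version.
```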

Similar / Related Projects

  • TensorFlow: A popular open-source machine learning framework that also supports model training and inference. ONNX Runtime differs in its focus on cross-platform inference and training acceleration.
  • PyTorch: Another widely used open-source machine learning framework. ONNX Runtime can run PyTorch models exported to ONNX, but is not a framework itself.
  • OpenVINO Toolkit: An open-source toolkit for optimizing and deploying AI models on Intel hardware. ONNX Runtime is more focused on cross-platform support and not limited to a specific hardware vendor.

Basic Information

📊 Project Information

  • Project Name: onnxruntime
  • GitHub URL: https://github.com/microsoft/onnxruntime
  • Programming Language: C++
  • ⭐ Stars: 17,578
  • 🍴 Forks: 3,406
  • 📅 Created: 2018-11-10
  • 🔄 Last Updated: 2025-08-20

๐Ÿท๏ธ Project Topics

Topics: ai-framework, deep-learning, hardware-acceleration, machine-learning, neural-networks, onnx, pytorch, scikit-learn, tensorflow


This article is automatically generated by AI based on GitHub project information and README content analysis

Titan AI Explore: https://www.titanaiexplore.com/projects/onnxruntime-156939672
