Get started with ONNX Runtime Web
Install

Use the following command in a shell to install ONNX Runtime Web:
```shell
# install latest release version
npm install onnxruntime-web

# install nightly build dev version
npm install onnxruntime-web@dev
```

Import
Use the following JavaScript code to import ONNX Runtime Web:
```javascript
// use ES6 style import syntax (recommended)
import * as ort from 'onnxruntime-web';

// or use CommonJS style import syntax
const ort = require('onnxruntime-web');
```

If you want to use ONNX Runtime Web with WebGPU support (experimental feature), import it as below:
```javascript
// use ES6 style import syntax (recommended)
import * as ort from 'onnxruntime-web/webgpu';

// or use CommonJS style import syntax
const ort = require('onnxruntime-web/webgpu');
```

If you want to use ONNX Runtime Web with WebNN support (experimental feature), import it as below:
```javascript
// use ES6 style import syntax (recommended)
import * as ort from 'onnxruntime-web/experimental';

// or use CommonJS style import syntax
const ort = require('onnxruntime-web/experimental');
```

For a complete table of import options, see Conditional Importing.
Documentation

See ONNX Runtime JavaScript API for the API reference. Please also check the following links for API usage examples:
- Tensor - a demonstration of basic usage of Tensor.
- Tensor <--> Image conversion - a demonstration of conversions between Image elements and Tensor.
- InferenceSession - a demonstration of basic usage of InferenceSession.
- SessionOptions - a demonstration of how to configure the creation of an InferenceSession instance.
- ort.env flags - a demonstration of how to configure a set of global flags.
- See also: TypeScript declarations for Inference Session, Tensor, and Environment Flags for reference.
See Tutorial: Web for tutorials.
See Training on web demo for training using onnxruntime-web.
Examples

The following are end-to-end examples that use ONNX Runtime Web in web applications for model inferencing:
- Classify images with ONNX Runtime Web - a simple web application using Next.js for image classification.
- ONNX Runtime Web demos for image recognition, handwriting analysis, real-time emotion detection, object detection, and so on.
- OpenAI Whisper - demonstrates how to run whisper tiny.en in your browser using onnxruntime-web and the browser’s audio interfaces.
- Facebook Segment-Anything - demonstrates how to run segment-anything in your browser using onnxruntime-web with webgpu.
Video tutorials that use ONNX Runtime Web in web applications are also available.
Supported Versions

| EPs/Browsers | Chrome/Edge (Windows) | Chrome/Edge (Android) | Chrome/Edge (macOS) | Chrome/Edge (iOS) | Safari (macOS) | Safari (iOS) | Firefox (Windows) | Node.js |
|---|---|---|---|---|---|---|---|---|
| WebAssembly (CPU) | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ | ✔️[1] |
| WebGPU | ✔️[2] | ✔️[3] | ✔️ | ❌ | ❌ | ❌ | ❌ | ❌ |
| WebGL | ✔️[4] | ✔️[4] | ✔️[4] | ✔️[4] | ✔️[4] | ✔️[4] | ✔️[4] | ❌ |
| WebNN | ✔️[5] | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ |
- [1]: Node.js supports only the single-threaded `wasm` EP.
- [2]: WebGPU requires Chromium v113 or later on Windows. Float16 support requires Chrome v121 or later, and Edge v122 or later.
- [3]: WebGPU requires Chromium v121 or later on Android.
- [4]: WebGL support is in maintenance mode. It is recommended to use WebGPU for better performance.
- [5]: Requires launching the browser with the command-line flag `--enable-features=WebMachineLearningNeuralNetwork`.