GstInference is a GStreamer plugin that enables out-of-the-box integration of deep learning models into GStreamer pipelines for inference tasks. The project is open source and multi-platform, and it now supports OpenVINO through the ONNX Runtime inference engine. Support for Intel® CPUs, Intel® integrated graphics, and Intel® Movidius™ USB sticks is now available.
Check out code samples, documentation, and benchmarks for GstInference here.
The OpenVINO Toolkit includes DL Streamer, which provides the GStreamer Video Analytics (GVA) plugin with elements for deep learning inference using the OpenVINO inference engine on Intel CPUs, GPUs, and VPUs. For more details on DL Streamer, please refer to the open-source repository here.
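As a rough illustration of how the GVA elements fit into a pipeline, a detection-plus-classification flow might look like the sketch below. The model file names, input file, and `device=CPU` setting are placeholders, not part of this announcement; substitute the IR models and device appropriate to your installation.

```shell
# Hypothetical DL Streamer pipeline sketch: decode a video file,
# run object detection (gvadetect) and classification (gvaclassify)
# with OpenVINO IR models, overlay results (gvawatermark), and display.
# File names and device=CPU are assumed placeholders.
gst-launch-1.0 filesrc location=input.mp4 ! decodebin ! \
  gvadetect model=detection-model.xml device=CPU ! \
  gvaclassify model=classification-model.xml device=CPU ! \
  gvawatermark ! videoconvert ! autovideosink
```

Swapping `device=CPU` for `GPU` or an appropriate VPU target is how the same pipeline description retargets the inference to other Intel hardware.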
Currently, DL Streamer inference elements require models converted to the OpenVINO IR format. We plan to support ONNX models directly, without IR conversion, in a future version.
DL Streamer is highly optimized for Intel platforms. We have listed some of these optimizations below:
We will continue to optimize it further and to support all Intel hardware. Thank you for your contribution to supporting OpenVINO inference via ONNX Runtime in GstInference.