
OpenVINO™ can now be used with GStreamer


GstInference is a GStreamer plugin that enables out-of-the-box integration of deep learning models into GStreamer pipelines for inference tasks. The project is open source and multi-platform, and it now supports OpenVINO through the ONNX Runtime inference engine, with support for Intel® CPUs, Intel® Integrated Graphics and Intel® Movidius™ USB sticks.

Check out code samples, documentation and benchmarks for GstInference here.
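As a sketch of what this integration enables, a detection pipeline using the ONNX Runtime backend might look like the command below. The element and property names (`tinyyolov2`, `backend`, `model-location`, `inferenceoverlay`) follow the GstInference examples, but the model path is a placeholder and names may differ between versions.

```shell
# Hypothetical GstInference detection pipeline (element names follow the
# GstInference examples; graph_tinyyolov2.onnx is a placeholder model).
# backend=onnxrt selects the ONNX Runtime inference backend, which is
# how OpenVINO is reached per the post above.
MODEL=graph_tinyyolov2.onnx
PIPELINE="v4l2src ! videoconvert ! tee name=t \
  t. ! queue ! net.sink_model \
  t. ! queue ! net.sink_bypass \
  tinyyolov2 name=net backend=onnxrt model-location=$MODEL \
  net.src_bypass ! inferenceoverlay ! videoconvert ! autovideosink"
# Print the command instead of running it: executing needs a camera
# and the GstInference plugins installed.
echo "gst-launch-1.0 $PIPELINE"
```

The `tee` splits the stream so the model sees a preprocessed copy while the bypass branch keeps the original frames for overlaying results.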

 

[Image: Detection crop example]


2 Replies
Moderator

Hi @jchaves 

Thanks for sharing your project with the OpenVINO community.
We have also informed the developer team about it.

Moderator

Hi @jchaves 

The OpenVINO Toolkit includes DL Streamer, which provides the GStreamer Video Analytics plugin with elements for Deep Learning inference using the OpenVINO Inference Engine on Intel CPUs, GPUs and VPUs. For more details on DL Streamer, please refer to the open-source repository here.

Currently, the DL Streamer inference elements require models converted to OpenVINO IR format. We plan to support ONNX models directly, without IR conversion, in a future version.
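For comparison, a minimal DL Streamer detection pipeline looks like the sketch below. `gvadetect` and `gvawatermark` are DL Streamer elements; the file names are placeholders, and the model must be an IR `.xml` with its `.bin` alongside.

```shell
# Sketch of a DL Streamer detection pipeline. gvadetect runs OpenVINO
# inference; gvawatermark draws the results on the frames. Both file
# names below are placeholders.
MODEL=face-detection-adas-0001.xml
PIPELINE="filesrc location=input.mp4 ! decodebin ! videoconvert \
  ! gvadetect model=$MODEL device=CPU \
  ! gvawatermark ! videoconvert ! autovideosink"
# Print rather than execute: running requires the DL Streamer plugins,
# the model files and an input video.
echo "gst-launch-1.0 $PIPELINE"
```

Setting `device=GPU` or `device=MYRIAD` would target Intel integrated graphics or a VPU instead of the CPU.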

DL Streamer is highly optimized for Intel platforms. Some of these optimizations are listed below:

  • Optimized interop between media decode, preprocessing and inference
    • Optimal color format conversions
    • Zero-copy buffer sharing between decode, preprocessing and inference on CPU or GPU
  • Asynchronous pipeline execution
  • Optimized multi-stream processing
  • Sharing of Inference Engine (IE) instances
  • Offloading decode and preprocessing to the GPU
  • Ability to reduce inference frequency by leveraging object tracking between inference operations
  • Ability to skip classification of an already-classified object by leveraging object tracking
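The last two points can be sketched with `gvatrack`: run full detection only every Nth frame and let the tracker carry objects across the frames in between. `inference-interval` and `tracking-type` are documented `gvadetect`/`gvatrack` properties; the values and file names here are illustrative.

```shell
# Illustrative pipeline: run detection only on every 10th frame and let
# gvatrack propagate the detected objects across the frames in between,
# cutting inference load roughly tenfold. File names are placeholders.
PIPELINE="filesrc location=input.mp4 ! decodebin ! videoconvert \
  ! gvadetect model=person-detection.xml device=CPU inference-interval=10 \
  ! gvatrack tracking-type=short-term \
  ! gvawatermark ! videoconvert ! autovideosink"
# Print rather than execute, since the plugins and model are not
# assumed to be installed here.
echo "gst-launch-1.0 $PIPELINE"
```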

We will continue to optimize it further and to support all Intel hardware. Thank you for your contribution enabling OpenVINO inference via ONNX Runtime in GstInference.
