Hi all,
Is OpenVINO supported for ONNX Runtime GenAI Managed? If so, how do I configure the Execution Provider in the C# code?
using Config config = new Config(modelPath);
config.ClearProviders();
config.AppendProvider("cpu");
In the solution, the NuGet package Intel.ML.OnnxRuntime.OpenVino 1.20.0 (NuGet Gallery) is referenced.
How do I configure and load the OpenVINO execution provider?
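For context, this is roughly what I am hoping will work with the GenAI Managed API; the "OpenVINO" provider name below is only a guess on my part and may not be recognized:
csharp
using System;
using Microsoft.ML.OnnxRuntimeGenAI;

class GenAiOpenVinoAttempt
{
    static void Main()
    {
        string modelPath = "path_to_genai_model_folder";

        // Hoped-for configuration; "OpenVINO" as a provider name is an unverified guess
        using Config config = new Config(modelPath);
        config.ClearProviders();
        config.AppendProvider("OpenVINO");

        // Load the model with the configured providers
        using Model model = new Model(config);
        Console.WriteLine("Model loaded.");
    }
}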
Dear RamaSubbuSK,
Thank you for reaching out to us.
We are currently checking with the relevant team on how to configure Intel.ML.OnnxRuntime.OpenVino in C# code. We will get back to you as soon as we have an update. We appreciate your understanding.
Regards,
Wan
Dear RamaSubbuSK,
Currently, we do not have any application or library that uses C# for OpenVINO™ GenAI; the OpenVINO™ GenAI library supports Python and C++. However, the OpenVINO™ Execution Provider documentation describes how to use OpenVINO™ from C# through ONNX Runtime, although we cannot guarantee that it works:
To configure Intel.ML.OnnxRuntime.OpenVino in C#, follow these steps:
- Install Required Packages. In your C# project, install the ONNX Runtime package with the OpenVINO™ Execution Provider via NuGet:
sh
dotnet add package Microsoft.ML.OnnxRuntime.OpenVINO
- Load ONNX Model with OpenVINO™ Execution Provider. Create a C# console application and use the following code to configure OpenVINO™ as the execution provider:
csharp
using System;
using Microsoft.ML.OnnxRuntime;

class Program
{
    static void Main()
    {
        string modelPath = "your_model.onnx";

        // Create an ONNX Runtime session with the OpenVINO Execution Provider
        var sessionOptions = new SessionOptions();
        sessionOptions.AppendExecutionProvider_OpenVINO(); // Uses the default OpenVINO device

        // Load the model
        using var session = new InferenceSession(modelPath, sessionOptions);

        // Print the execution providers available in this ONNX Runtime build
        Console.WriteLine("Available Execution Providers:");
        foreach (var provider in OrtEnv.Instance().GetAvailableProviders())
        {
            Console.WriteLine(provider);
        }
    }
}
- Run Inference Using OpenVINO™. Modify the code to perform inference:
csharp
using System;
using System.Linq;
using Microsoft.ML.OnnxRuntime;
using Microsoft.ML.OnnxRuntime.Tensors;

class Program
{
    static void Main()
    {
        string modelPath = "your_model.onnx";

        var sessionOptions = new SessionOptions();
        sessionOptions.AppendExecutionProvider_OpenVINO(); // Enable the OpenVINO EP
        using var session = new InferenceSession(modelPath, sessionOptions);

        // Prepare the input (adjust the shape to match your model; dynamic
        // dimensions are reported as negative values and are replaced with 1 here)
        var inputName = session.InputMetadata.Keys.First();
        var inputShape = session.InputMetadata[inputName].Dimensions
            .Select(d => d > 0 ? d : 1)
            .ToArray();
        float[] inputData = new float[inputShape.Aggregate(1, (a, b) => a * b)];
        var inputTensor = new DenseTensor<float>(inputData, inputShape);
        var inputs = new NamedOnnxValue[] { NamedOnnxValue.CreateFromTensor(inputName, inputTensor) };

        // Run inference
        using var results = session.Run(inputs);
        var output = results.First().AsTensor<float>();
        Console.WriteLine("Inference completed. Output size: " + output.Length);
    }
}
- Optimize OpenVINO™ Execution. You can specify an OpenVINO™-specific hardware target when appending the execution provider; a combined sketch with a runtime availability check follows this list:
csharp
// Choose one device target when appending the provider:
sessionOptions.AppendExecutionProvider_OpenVINO("GPU");      // Use Intel GPU
sessionOptions.AppendExecutionProvider_OpenVINO("NPU");      // Use Intel NPU
sessionOptions.AppendExecutionProvider_OpenVINO("CPU_FP32"); // Use CPU with FP32 precision
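Additionally, as an unofficial sketch (not a validated sample), you can check at runtime whether the installed ONNX Runtime package actually exposes the OpenVINO™ Execution Provider before registering it, and fall back to the default CPU provider otherwise; the helper name below is only illustrative:
csharp
using System;
using System.Linq;
using Microsoft.ML.OnnxRuntime;

static class ProviderSetup
{
    // Builds SessionOptions that use OpenVINO when the installed ONNX Runtime
    // package was built with the OpenVINO Execution Provider, and the default
    // CPU provider otherwise. The device string is passed through unchanged.
    public static SessionOptions Create(string deviceType = "CPU_FP32")
    {
        var options = new SessionOptions();

        bool hasOpenVino = OrtEnv.Instance()
            .GetAvailableProviders()
            .Contains("OpenVINOExecutionProvider");

        if (hasOpenVino)
        {
            options.AppendExecutionProvider_OpenVINO(deviceType);
        }
        // When OpenVINO is not registered, ONNX Runtime simply uses its default CPU provider.

        return options;
    }
}
For example, creating the InferenceSession with ProviderSetup.Create("GPU") would use OpenVINO™ on an Intel GPU when the provider is present and fall back to the plain CPU provider otherwise.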
For more details on OpenVINO™ execution providers, refer to OpenVINO™ Execution Provider.
Best regards,
Wan
Hi RamaSubbuSK,
This case will no longer be monitored since we have provided a suggestion. If you need further assistance, please submit a new question.
Best regards,
Wan
