Intel® Distribution of OpenVINO™ Toolkit
Community assistance about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

ONNX Model GenAI C# Sample

RamaSubbuSK
Beginner

Hi all,

Is OpenVINO™ supported for ONNX Runtime GenAI Managed? If so, how do I configure the Execution Provider in the C# code?

 

using Microsoft.ML.OnnxRuntimeGenAI;

using Config config = new Config(modelPath);
config.ClearProviders();
config.AppendProvider("cpu");

In the solution, I have added a reference to the NuGet package Intel.ML.OnnxRuntime.OpenVino 1.20.0.

 

How do I configure and load the OpenVINO™ provider?
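In other words, I am hoping something along these lines would work; the "OpenVINO" provider name below is purely a guess on my part, as I have not found any documented value:

using Microsoft.ML.OnnxRuntimeGenAI;

// Hypothetical: swap the CPU provider for OpenVINO,
// assuming such a provider name were recognized by the GenAI Managed API.
using Config config = new Config(modelPath);
config.ClearProviders();
config.AppendProvider("OpenVINO"); // guessed provider name, not documented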

Wan_Intel
Moderator

Dear RamaSubbuSK,

Thank you for reaching out to us.

 

We are currently checking with the relevant team on how to configure Intel.ML.OnnxRuntime.OpenVino in C# code. We will get back to you as soon as we have an update. We appreciate your understanding.

Regards,

Wan

 

Wan_Intel
Moderator

Dear RamaSubbuSK,

 

Currently, we do not have any application or library that uses C# for OpenVINO™ GenAI; the OpenVINO™ GenAI library supports Python and C++. However, the OpenVINO™ Execution Provider documentation describes how to use OpenVINO™ from C# through ONNX Runtime, although we cannot guarantee that it works:

 

To configure Intel.ML.OnnxRuntime.OpenVino in C#, follow these steps:

 

  • Install Required Packages. In your C# project, install ONNX Runtime with the OpenVINO™ Execution Provider via NuGet:

dotnet add package Intel.ML.OnnxRuntime.OpenVino

 

  • Load ONNX Model with OpenVINO™ Execution Provider. Create a C# console application and use the following code to configure OpenVINO™ as the execution provider:

using System;
using Microsoft.ML.OnnxRuntime;

class Program
{
    static void Main()
    {
        string modelPath = "your_model.onnx";

        // Create an ONNX Runtime session with the OpenVINO Execution Provider
        var sessionOptions = new SessionOptions();
        sessionOptions.AppendExecutionProvider_OpenVINO(); // Uses the default OpenVINO device

        // Load the model
        using var session = new InferenceSession(modelPath, sessionOptions);

        // Print the execution providers available in this ONNX Runtime build
        Console.WriteLine("Available Execution Providers:");
        foreach (var provider in OrtEnv.Instance().GetAvailableProviders())
        {
            Console.WriteLine(provider);
        }
    }
}

 

  • Run Inference Using OpenVINO™. Modify the code to perform inference:

using System;
using System.Linq;
using Microsoft.ML.OnnxRuntime;
using Microsoft.ML.OnnxRuntime.Tensors;

class Program
{
    static void Main()
    {
        string modelPath = "your_model.onnx";
        var sessionOptions = new SessionOptions();
        sessionOptions.AppendExecutionProvider_OpenVINO(); // Enable the OpenVINO EP
        using var session = new InferenceSession(modelPath, sessionOptions);

        // Prepare the input (adjust the shape to match your model)
        var inputName = session.InputMetadata.Keys.First();
        var inputShape = session.InputMetadata[inputName].Dimensions;

        // Replace dynamic (-1) dimensions with 1 so a concrete tensor can be built
        var dims = inputShape.Select(d => d > 0 ? d : 1).ToArray();
        float[] inputData = new float[dims.Aggregate(1, (a, b) => a * b)];
        var inputTensor = new DenseTensor<float>(inputData, dims);
        var inputs = new[] { NamedOnnxValue.CreateFromTensor(inputName, inputTensor) };

        // Run inference
        using var results = session.Run(inputs);
        var output = results.First().AsTensor<float>();
        Console.WriteLine("Inference completed. Output size: " + output.Length);
    }
}

 

  • Optimize OpenVINO™ Execution. You can specify the target Intel® hardware device:

sessionOptions.AppendExecutionProvider_OpenVINO("GPU");      // Use Intel GPU
sessionOptions.AppendExecutionProvider_OpenVINO("NPU");      // Use Intel NPU
sessionOptions.AppendExecutionProvider_OpenVINO("CPU_FP32"); // Use CPU with FP32 precision (legacy device string)

 

For more details on OpenVINO™ execution providers, refer to OpenVINO™ Execution Provider.
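As a supplement (not from the official documentation), below is a minimal sketch that appends OpenVINO™ only when it actually appears in the runtime's provider list, and otherwise lets ONNX Runtime fall back to its built-in CPU provider. The provider name "OpenVINOExecutionProvider" is the registration string we would expect; please verify it against your installed package:

using System;
using System.Linq;
using Microsoft.ML.OnnxRuntime;

class ProviderCheck
{
    // Append OpenVINO only when this ONNX Runtime build contains the EP;
    // otherwise ONNX Runtime uses its built-in CPU provider by default.
    static SessionOptions CreateOptions(string deviceType)
    {
        var options = new SessionOptions();
        if (OrtEnv.Instance().GetAvailableProviders().Contains("OpenVINOExecutionProvider"))
        {
            options.AppendExecutionProvider_OpenVINO(deviceType);
        }
        return options;
    }

    static void Main()
    {
        Console.WriteLine("Providers in this build: " +
            string.Join(", ", OrtEnv.Instance().GetAvailableProviders()));
        using var options = CreateOptions("CPU"); // device string as in the previous step
        // Pass 'options' to new InferenceSession(modelPath, options) as shown above.
    }
}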

 

 

Best regards,

Wan

 

Wan_Intel
Moderator

Hi RamaSubbuSK,

This case will no longer be monitored since we have provided a suggestion. If you need further assistance, please submit a new question.

 

 

Best regards,

Wan

 

 
