Hello folks!
I'm developing an application in C, using the inference engine provided by OpenVINO.
I found that the inference process itself is very fast, but the bottleneck is creating the InferRequest — it takes about 80 ms on my computer:
ExecutableNetwork executable_network = plugin.LoadNetwork(network, {});
InferRequest infer_request = executable_network.CreateInferRequest();
Since my application is in C, it calls into the C++ API (a DLL) many times, so this overhead is too large.
Is it possible to create the InferRequest once, so that I don't have to recreate it every time I call the C++ interface?
Thanks!
Dear ben,
Please consider the Inference Engine performance topics (such as the Async API and Throughput Mode) covered in the document below:
http://docs.openvinotoolkit.org/latest/_docs_IE_DG_Intro_to_Performance.html
Thanks,
Shubha