Hi,
I want OpenVINO's Inference Engine to load models from memory (not from a file path).
For this I am using the CNNNetwork ReadNetwork(const std::string& model, const Blob::CPtr& weights) function.
I need help with how to create the Blob::CPtr weights object from the weights.bin file. I am using OpenVINO 2021.4 on Ubuntu 20.04 LTS.
Thank you
Hi Subtle,
You can create Blob objects using constructors that take an InferenceEngine::TensorDesc. You can refer to Inference Memory Primitives (C++) for more information.
Besides, someone shared their way of loading a model from memory in a previous post, which may be helpful for you.
Regards,
Peh
Hi Peh,
Thanks for the reply.
I am creating Blob objects using the constructor TensorDesc(const Precision& precision, const SizeVector& dims, Layout layout). For my use case I need to keep Precision::UNSPECIFIED and Layout::ANY. I want to understand more about const SizeVector& dims: what should I pass for this parameter if I am keeping Layout::ANY?
Thank you
Hi Subtle,
There is nothing special about setting the layout to ANY. The dimension parameters still need to be set based on the model, in one of the supported formats (InferenceEngine::Layout::NCDHW, InferenceEngine::Layout::NCHW, InferenceEngine::Layout::NC, InferenceEngine::Layout::C).
Regards,
Peh
Hi Peh,
Thanks a lot for the reply.
I got the point.
Thanks & Regards
Subtle
Hi Subtle,
This thread will no longer be monitored since we have provided answers. If you need any additional information from Intel, please submit a new question.
Regards,
Peh