Hello!
With the Reshape API (https://docs.openvinotoolkit.org/latest/openvino_docs_IE_DG_ShapeInference.html), it is possible to specify new input shapes at runtime. For the pretrained model human-pose-estimation-0001 it works well: by specifying a smaller input shape, inference runs faster.
My question is: instead of using the API, is there a way to directly and statically modify the Intermediate Representation files (.xml and .bin) in order to do the reshaping? I don't have the original model that was used to generate the human-pose-estimation-0001.xml and human-pose-estimation-0001.bin files, so rerunning the Model Optimizer with a new input shape is not an option.
Thank you!
Greetings,
The primary method for reshaping is InferenceEngine::CNNNetwork::reshape. It takes new input shapes and propagates them from input to output through all intermediate layers of the given network. The method accepts an InferenceEngine::ICNNNetwork::InputShapes, a map of pairs: the name of an input and its dimensions.
The algorithm for resizing a network is the following:
1) Collect the map of input names and shapes from the Intermediate Representation (IR) using the helper method InferenceEngine::CNNNetwork::getInputShapes
2) Set new input shapes
3) Call reshape
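The steps above can be sketched in C++ as follows. This is a minimal sketch, assuming the Inference Engine API of an OpenVINO release in which CNNNetwork::reshape is available; the model path and the new height/width values are illustrative, not prescribed by this thread:

```cpp
#include <inference_engine.hpp>

int main() {
    using namespace InferenceEngine;

    Core ie;
    // 0) Read the IR files produced by the Model Optimizer
    //    (path is illustrative)
    CNNNetwork network = ie.ReadNetwork("human-pose-estimation-0001.xml");

    // 1) Collect the map of input names and shapes
    //    (InputShapes is a std::map<std::string, SizeVector>)
    ICNNNetwork::InputShapes shapes = network.getInputShapes();

    // 2) Set new input shapes; NCHW layout assumed, values illustrative
    for (auto& item : shapes) {
        SizeVector& dims = item.second;
        dims[2] = 128;  // new height
        dims[3] = 224;  // new width
    }

    // 3) Propagate the new shapes through all intermediate layers
    network.reshape(shapes);
    return 0;
}
```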
You may refer to this official documentation for further information:
https://docs.openvinotoolkit.org/latest/openvino_docs_IE_DG_ShapeInference.html
Sincerely,
Iffa
@Iffa_Intel Thank you for your reply, but it doesn't really answer my question. I already knew about https://docs.openvinotoolkit.org/latest/openvino_docs_IE_DG_ShapeInference.html, since that link was in my initial post.
Never mind, I have found a solution: simply call InferenceEngine::CNNNetwork::serialize after the call to reshape, in order to save the reshaped network into new .xml and .bin files.
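That approach can be sketched as follows: reshape the network in memory, then write it back out as a new, statically reshaped IR. The file names and the new input shape are illustrative assumptions, not values from this thread:

```cpp
#include <inference_engine.hpp>

int main() {
    using namespace InferenceEngine;

    Core ie;
    CNNNetwork network = ie.ReadNetwork("human-pose-estimation-0001.xml");

    // Set a smaller input shape for the (single) input; NCHW assumed,
    // the concrete dimensions here are illustrative
    auto shapes = network.getInputShapes();
    shapes.begin()->second = {1, 3, 128, 224};
    network.reshape(shapes);

    // Persist the reshaped network so future runs can load the new IR
    // directly, without calling reshape again
    network.serialize("human-pose-estimation-0001-reshaped.xml",
                      "human-pose-estimation-0001-reshaped.bin");
    return 0;
}
```

A note on the design: serialize writes both the topology (.xml) and the weights (.bin), so the resulting pair is a self-contained IR that can be loaded like any Model Optimizer output.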
Glad to hear that!
Intel will no longer monitor this thread since this issue has been resolved. If you need any additional information from Intel, please submit a new question.
Sincerely,
Iffa