Hi,
I have a retrained tiny YOLOv3 model which I have converted to OpenVINO-compatible IR models. I am running inference on these models on a laptop with an Intel i7-8750 and an NCS2, and on a Raspberry Pi 3 with an NCS2, using the C++ API in both cases. However, I am getting a mismatch in the output node byte sizes between the two platforms.
I am using the following snippet of code to read the output node blobs and their sizes to create the output buffers.
InferenceEngine::OutputsDataMap output_info(network.getOutputsInfo());
for (auto &item : output_info) {
    auto output_name = item.first;
    std::cout << "Output name :" << output_name << std::endl;
    auto output = async_infer_request.GetBlob(output_name);
    std::cout << "output buffer size : " << output->byteSize() << std::endl;
    output_buffer = output->buffer().as<PrecisionTrait<Precision::FP32>::value_type *>();
}
where network is the network loaded into the MYRIAD plugin.
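For reference, the expected byte size of an FP32 output blob should just be the product of its dims times sizeof(float). A minimal standalone sketch of that arithmetic (no Inference Engine dependency; the helper name is made up for illustration):

```cpp
#include <cassert>
#include <cstddef>
#include <functional>
#include <numeric>
#include <vector>

// Expected byte size of an FP32 blob: product of its dims * sizeof(float).
// Standalone sketch, no Inference Engine dependency; helper name is
// made up for illustration.
std::size_t expected_fp32_bytes(const std::vector<std::size_t>& dims) {
    return std::accumulate(dims.begin(), dims.end(),
                           sizeof(float), std::multiplies<std::size_t>());
}
```

With the dims printed below, expected_fp32_bytes({1, 18, 26, 26}) gives 48672 and expected_fp32_bytes({1, 18, 13, 13}) gives 12168, which is exactly what the laptop log reports.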
On the laptop with the NCS2, I get the following results:
[ INFO ] InferenceEngine:
API version ............ 1.6
Build .................. custom_releases/2019/R1_c9b66a26e4d65bb986bb740e73f58c6e9e84c7c2
[ INFO ] Loading plugin
API version ............ 1.6
Build .................. 22443
Description ....... myriadPlugin
[ INFO ] Loading network files
[ INFO ] Batch size is forced to 1.
[ INFO ] Successfully loaded network files
inputs
inputDims=416 416 3 1
detector/yolo-v3-tiny/Conv_12/BiasAdd/YoloRegion
outputDims=1 18 26 26
detector/yolo-v3-tiny/Conv_9/BiasAdd/YoloRegion
outputDims=1 18 13 13
Output data size : 3042
[ INFO ] Loading model to the plugin
[ INFO ] Loaded model to the plugin
[ INFO ] Creating an inference request from the network
[ INFO ] Created an inference request from the network
Input name :inputs
Input buffer size : 519168
Output name :detector/yolo-v3-tiny/Conv_12/BiasAdd/YoloRegion
output buffer size : 48672
Output name :detector/yolo-v3-tiny/Conv_9/BiasAdd/YoloRegion
output buffer size : 12168
However, on the Raspberry Pi 3, I am getting the following:
[ INFO ] InferenceEngine:
API version ............ 1.6
Build .................. 22443
[ INFO ] Loading plugin
API version ............ 1.6
Build .................. 22443
Description ....... myriadPlugin
[ INFO ] Loading network files
[ INFO ] Batch size is forced to 1.
[ INFO ] Successfully loaded network files
inputs
inputDims=416 416 3 1
detector/yolo-v3-tiny/Conv_12/BiasAdd/YoloRegion
outputDims=1 18 26 26
detector/yolo-v3-tiny/Conv_9/BiasAdd/YoloRegion
outputDims=1 18 13 13
Output data size : 3042
[ INFO ] Loading model to the plugin
[ INFO ] Loaded model to the plugin
[ INFO ] Creating an inference request from the network
[ INFO ] Created an inference request from the network
Input name :inputs
Input buffer size : 519168
Output name :detector/yolo-v3-tiny/Conv_12/BiasAdd/YoloRegion
terminate called after throwing an instance of 'InferenceEngine::details::InferenceEngineException'
what(): The output blob size is not equal to the network output size: got 12168 expecting 11492
/opt/intel/openvino/deployment_tools/inference_engine/include/details/ie_exception_conversion.hpp:71
Aborted
Doing the basic maths, the byte size of the output blob on my laptop matches the dimensions of the node: 1x18x13x13 x 4 bytes = 12168 bytes. But on the Raspberry Pi 3, the blob size the network expects is 1x17x13x13 x 4 bytes = 11492 bytes.
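To make the mismatch concrete: the only difference between the two expected sizes is the channel count of the 13x13 grid, 18 vs 17. 18 is consistent with 3 anchors x (4 box coords + 1 objectness + 1 class) for a single-class tiny-YOLOv3 head, whereas 17 does not factor that way, which suggests something is off on the Pi side. A quick check (the function name is mine, for illustration):

```cpp
#include <cassert>
#include <cstddef>

// Byte size of a 1 x C x 13 x 13 FP32 grid for a given channel count C.
// Helper name is made up for illustration.
std::size_t grid13_fp32_bytes(std::size_t channels) {
    return 1 * channels * 13 * 13 * sizeof(float);
}
```

grid13_fp32_bytes(18) gives 12168 (what my laptop reports) and grid13_fp32_bytes(17) gives 11492 (what the Pi expects).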
However, the same model works fine if I use the Python API to run inference on the same Raspberry Pi 3 and NCS2 combo. Is there something that I am missing with the C++ API for the Raspberry Pi 3?
Anyone? Any help would be much appreciated, as I am at a complete dead end with this...
Dearest Singh, Anshu,
I'm very sorry that nobody has answered you yet. I know that Raspberry Pi is a little behind the rest of the OpenVINO releases, but I recently filed a bug on tiny YOLOv3 for this GitHub issue.
Tiny YOLOv3 is definitely broken in OpenVINO 2019 R1.1.
But my guess is that you are using a slightly older version of OpenVINO, so you have not experienced the breakage yet.
Your code looks fine.
Is it possible for you to try your code outside of the Raspberry Pi, say with the NCS2 plugged directly into your Windows laptop? Please try that and report your findings here. Please stick with OpenVINO 2019 R1 (not R1.1), since we know that Tiny YOLOv3 is broken in R1.1.
Thanks,
Shubha