Intel® Distribution of OpenVINO™ Toolkit
Community assistance about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

Slow loading time running OpenVINO model on Movidius Stick 2

nnain1
New Contributor I

My application uses two networks. The first network uses MobileNetV2 SSDLite and the second network uses MobileNetV1 FRCNN.

The first network scans the image and detects an object, then the second network zooms into that small area.
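
A stripped-down sketch of that flow, assuming the 2019 Inference Engine Python API (model paths, blob names, and the crop handling are placeholders, not the real application code):

import cv2
from openvino.inference_engine import IECore, IENetwork

ie = IECore()
device = "CPU"  # also "GPU" or "MYRIAD"

# First network: MobileNetV2 SSDLite detector.
det_net = IENetwork(model="ssdlite.xml", weights="ssdlite.bin")
det_exec = ie.load_network(network=det_net, device_name=device)
det_in = next(iter(det_net.inputs))
det_out = next(iter(det_net.outputs))

# Second network: MobileNetV1 FRCNN, run on the cropped region.
# (A converted Faster R-CNN IR may also expect an "image_info" input,
# omitted here to keep the sketch short.)
rec_net = IENetwork(model="frcnn.xml", weights="frcnn.bin")
rec_exec = ie.load_network(network=rec_net, device_name=device)
rec_in = next(iter(rec_net.inputs))

frame = cv2.imread("frame.jpg")
fh, fw = frame.shape[:2]

# Run the detector on the full frame.
n, c, h, w = det_net.inputs[det_in].shape
blob = cv2.resize(frame, (w, h)).transpose((2, 0, 1)).reshape((n, c, h, w))
detections = det_exec.infer({det_in: blob})[det_out]

# For each detection, "zoom" into the small area and run the second network.
for det in detections[0][0]:  # row: [image_id, label, conf, xmin, ymin, xmax, ymax]
    if det[2] < 0.5:
        continue
    xmin, ymin = int(det[3] * fw), int(det[4] * fh)
    xmax, ymax = int(det[5] * fw), int(det[6] * fh)
    crop = frame[ymin:ymax, xmin:xmax]
    n2, c2, h2, w2 = rec_net.inputs[rec_in].shape
    crop_blob = cv2.resize(crop, (w2, h2)).transpose((2, 0, 1)).reshape((n2, c2, h2, w2))
    result = rec_exec.infer({rec_in: crop_blob})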

I tested on my system with an Intel Atom E3950 CPU and an Intel® Gen 9 HD Graphics GPU.

GPU (async mode for image capturing) produces 5 fps
CPU (async mode for image capturing) produces 3 fps

I can't run it on the Neural Compute Stick 2. It takes a very long time to load and gets stuck at:

API version ............ 1.6
    Build .................. 22443
    Description ....... myriadPlugin
[ INFO ] Loading network files:
    /home/upsquared/NumberPlate/recognition/frcnn_mobilenet_v1_0.5/OpenvinoModel_2019/fp16/openvino_frcnn_mobilenetv1.xml
    /home/upsquared/NumberPlate/recognition/frcnn_mobilenet_v1_0.5/OpenvinoModel_2019/fp16/openvino_frcnn_mobilenetv1.bin
[ INFO ] Batch size is 1
[ INFO ] Preparing output blobs

 

What could be the problem?

I can run the first network alone. It's just that the application loading both networks takes a very long time and gets stuck at "Preparing output blobs".
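
To narrow it down, each loading step could be timed separately, for example (a rough sketch with placeholder paths, again assuming the 2019 Python API):

import time
from openvino.inference_engine import IECore, IENetwork

def timed(label, make):
    # Helper: run a callable and print how long it took.
    start = time.time()
    result = make()
    print("%s took %.1f s" % (label, time.time() - start))
    return result

ie = IECore()

ssd_net = timed("read SSDLite IR",
                lambda: IENetwork(model="ssdlite.xml", weights="ssdlite.bin"))
frcnn_net = timed("read FRCNN IR",
                  lambda: IENetwork(model="frcnn.xml", weights="frcnn.bin"))

# load_network() is where the graph gets compiled for the target device,
# so timing each call separately shows which of the two networks stalls.
ssd_exec = timed("load SSDLite to MYRIAD",
                 lambda: ie.load_network(network=ssd_net, device_name="MYRIAD"))
frcnn_exec = timed("load FRCNN to MYRIAD",
                   lambda: ie.load_network(network=frcnn_net, device_name="MYRIAD"))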

Shubha_R_Intel
Employee

Dear Naing Nyan,

Thank you for first testing on the CPU and GPU. This seems to be a real issue.  I will PM you a place to drop your files so that I can investigate it further.

Thanks,

Shubha

nnain1
New Contributor I

Hi Shubha,

I have shared my models. May I know what causes the extremely slow loading?

Regards

Shubha_R_Intel
Employee

Dear Naing Nyan,

Yes, I got your *.zip files over PM. Sorry that it has taken so long; to be honest, I haven't gotten to it yet. But if you are in a hurry, I have a suggestion for you: build a DEBUG version of the Inference Engine using the open-source OpenVINO. Follow this readme. The entire Inference Engine source code is there, and you can see for yourself exactly why the model is so slow to load. You will find your answer.
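
From memory (so please check the readme for your exact dldt branch), the debug build sequence looks roughly like this; note that thirdparty dependencies such as ade are pulled in as git submodules, so the sources need to be cloned with git rather than downloaded as an archive:

git clone https://github.com/opencv/dldt.git
cd dldt/inference-engine
git submodule update --init --recursive
./install_dependencies.sh
mkdir build && cd build
cmake -DCMAKE_BUILD_TYPE=Debug ..
make -j$(nproc)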

Make sure that you regenerate your IR using the open-source DLDT; don't use the Model Optimizer from the release package. The reason is that sometimes the MO from the release and the IE from the DLDT get out of sync, so it's better to stick with one package for both generating the IR and running inference.
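
Off the top of my head (double-check the exact flags and the right support .json for your model and MO version), regenerating the FP16 IR with the open-source mo_tf.py looks something like:

python3 mo_tf.py \
    --input_model frozen_inference_graph.pb \
    --tensorflow_object_detection_api_pipeline_config pipeline.config \
    --tensorflow_use_custom_operations_config extensions/front/tf/faster_rcnn_support.json \
    --data_type FP16 \
    --output_dir ./fp16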

The steps I've given you here are exactly what I was going to do, Nyan, since off the top of my head I do not know the answer. I will tell you, however, that optimization happens during model loading, and this could be what's taking forever. But honestly, I am not sure. I will not know until I debug it myself.

Shubha

nnain1
New Contributor I

Thanks Shubha. I'll look into that.

nnain1
New Contributor I

Dear Shubha,

I followed the instructions here.

I am building on Ubuntu with GCC/G++ and CMake.
When I reached

cmake -DCMAKE_BUILD_TYPE=Debug ..

I get this error:

 

fatal: Not a git repository (or any of the parent directories): .git
CMake Error at thirdparty/fluid/modules/gapi/cmake/standalone.cmake:2 (find_package):
  By not providing "Findade.cmake" in CMAKE_MODULE_PATH this project has
  asked CMake to find a package configuration file provided by "ade", but
  CMake did not find one.

  Could not find a package configuration file provided by "ade" (requested
  version 0.1.0) with any of the following names:

    adeConfig.cmake
    ade-config.cmake

  Add the installation prefix of "ade" to CMAKE_PREFIX_PATH or set "ade_DIR"
  to a directory containing one of the above files.  If "ade" provides a
  separate development package or SDK, be sure it has been installed.
Call Stack (most recent call first):
  thirdparty/fluid/modules/gapi/CMakeLists.txt:4 (include)


-- Configuring incomplete, errors occurred!
See also "/home/upsquared/NumberPlate/dldt-2019/inference-engine/build/CMakeFiles/CMakeOutput.log".
See also "/home/upsquared/NumberPlate/dldt-2019/inference-engine/build/CMakeFiles/CMakeError.log".

How can I fix the error?
