Intel® Distribution of OpenVINO™ Toolkit
Community assistance for the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

NCS2 OpenVINO for ARM

JesusE_Intel
Moderator

This thread is a repost from the Intel Movidius Neural Network Community. Link to original thread.

I have run the samples with the NCS2 on Ubuntu 16.04 and a Raspberry Pi 3B+ successfully. Now I want to try it on other ARM platforms, only to find that essential library files such as libMyriadPlugin.so are shipped in the zip file as prebuilt binaries rather than source code.

Could anyone tell me whether Movidius is going to support other ARM platforms? Perhaps, as with the NCSDK, we could compile the library from source ourselves.

I would appreciate your kind help.

JesusE_Intel
Moderator

Currently the OpenVINO toolkit is supported on the Raspberry Pi and the hardware listed on the Computer Vision Hardware page.
Please let me know if you have additional questions.

Regards,
Jesus

BBi
Beginner

Hi Jesus,

I also ran into a problem when trying to cross-compile the OpenVINO toolkit for an armv7l RK3288 board with an NCS2:

--   SSE4.1 supported
--   SSE4.2 supported
--   SSE4a not supported
--   SSSE3 supported
--   SYSCALL supported
--   TBM not supported
--   XOP not supported
--   XSAVE supported
-- OMP Release lib: OMP_LIBRARIES_RELEASE-NOTFOUND
-- OMP Debug lib: OMP_LIBRARIES_DEBUG-NOTFOUND
CMake Warning at /home/ghostman/Downloads/inference_engine_vpu_arm/deployment_tools/inference_engine/share/InferenceEngineConfig.cmake:31 (message):
  Intel OpenMP not found.  Intel OpenMP support will be disabled.
  IE_THREAD_SEQ is defined
Call Stack (most recent call first):
  /home/ghostman/Downloads/inference_engine_vpu_arm/deployment_tools/inference_engine/share/ie_parallel.cmake:78 (ext_message)
  /home/ghostman/Downloads/inference_engine_vpu_arm/deployment_tools/inference_engine/src/extension/CMakeLists.txt:28 (set_ie_threading_interface_for)


-- Looking for C++ include unistd.h
-- Looking for C++ include unistd.h - found
-- Looking for C++ include stdint.h
-- Looking for C++ include stdint.h - found
-- Looking for C++ include sys/types.h
-- Looking for C++ include sys/types.h - found
-- Looking for C++ include fnmatch.h
-- Looking for C++ include fnmatch.h - found
-- Looking for C++ include stddef.h
-- Looking for C++ include stddef.h - found
-- Check size of uint32_t
-- Check size of uint32_t - done
-- Looking for strtoll
-- Looking for strtoll - found
-- Configuring done
You have changed variables that require your cache to be deleted.
Configure will be re-run and you may have to reset some variables.
The following variables have changed:
CMAKE_CXX_COMPILER= /usr/bin/c++
CMAKE_C_COMPILER= /usr/bin/cc
CMAKE_CXX_COMPILER= /usr/bin/c++

-- The C compiler identification is GNU 5.4.0
-- The CXX compiler identification is GNU 5.4.0
-- Check for working C compiler: /usr/bin/cc
-- Check for working C compiler: /usr/bin/cc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- Check for working CXX compiler: /usr/bin/c++
-- Check for working CXX compiler: /usr/bin/c++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- CMAKE_BUILD_TYPE not defined, 'Release' will be used
-- /etc/*-release distrib: Ubuntu 16.04
-- Found InferenceEngine: /home/ghost

Why is this? How can I enable Intel OpenMP and make CMake find it?

 

BBi
Beginner

Dear Intel, could I know whether the next edition of OpenVINO, R6, will support ARM + embedded Linux (other than the Raspberry Pi)?

Moran__Emmanuel
Beginner

Hello,

I may have a very odd problem here. I am using the NCS1 and NCS2 with the same script; the NCS1 produces the correct inference results, but the NCS2 does not.

To explain: I was working on Ubuntu and on a Raspberry Pi 3 with Raspbian Stretch, and everything worked with both the NCS1 and NCS2. I decided to go with the NCS2 because of its detection speed (my bottleneck) with MobileNet-SSD (the typical code with the .caffemodel file). I built some code to use it as a tracking system and everything was fine.

Two days ago I was changing some code (no configuration changes) and the NCS2 started having problems with detection: it was detecting a person (the model was trained on the VOC dataset) covering the whole camera view. I assumed I had done something wrong, but the NCS1 was still running inference normally. Then I thought the NCS2 had broken, but I had a second NCS2, tested the same code, and the problem remained. To summarize, the same code under the same configuration gives different inference results for the NCS1 (correct) and the NCS2 (incorrect) using the same neural network.

By the way, I use the same camera in every test, and it is static. I attach the code used for testing; the only thing I change is swapping the NCS1 for the NCS2.

OpenVINO was used at version 2018 R2, and I recently changed to 2019 R2 (the latest).
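
In case the attachment does not come through, the detection loop is roughly the following sketch (only an outline of the typical MobileNet-SSD code, not my exact script; the prototxt/caffemodel file names, the camera index, and the 0.5 confidence threshold are placeholders):

# Minimal MobileNet-SSD sketch with the OpenVINO-bundled OpenCV DNN module
# targeting a Myriad stick. The same code runs on NCS1 and NCS2; only the
# plugged-in stick changes. File names and threshold are placeholders.
import cv2

net = cv2.dnn.readNetFromCaffe("MobileNetSSD_deploy.prototxt",
                               "MobileNetSSD_deploy.caffemodel")
net.setPreferableBackend(cv2.dnn.DNN_BACKEND_INFERENCE_ENGINE)
net.setPreferableTarget(cv2.dnn.DNN_TARGET_MYRIAD)

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    h, w = frame.shape[:2]
    # Standard MobileNet-SSD preprocessing: 300x300 input, scale 1/127.5, mean 127.5.
    blob = cv2.dnn.blobFromImage(cv2.resize(frame, (300, 300)),
                                 0.007843, (300, 300), 127.5)
    net.setInput(blob)
    detections = net.forward()
    for i in range(detections.shape[2]):
        confidence = detections[0, 0, i, 2]
        if confidence > 0.5:
            box = detections[0, 0, i, 3:7] * [w, h, w, h]
            x1, y1, x2, y2 = box.astype(int)
            cv2.rectangle(frame, (int(x1), int(y1)), (int(x2), int(y2)),
                          (0, 255, 0), 2)
    cv2.imshow("detections", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()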

Thanks for any recommendation.

E.M.

Shubha_R_Intel
Employee

Dear Moran, Emmanuel,

Another forum poster has reported the same thing: that the accuracy results from the NCS1 are much better than from the NCS2. I reproduced his problem and filed a bug on it. Hopefully it will be fixed in the next release.

Thanks,

Shubha

 

Shubha_R_Intel
Employee

Dear Moran, Emmanuel,

Please check that other post now. There is no difference between NCS1 and NCS2 accuracy. The poster had built an incorrect IR; he most likely forgot to use --tensorflow_use_custom_operations_config in his Model Optimizer (MO) command.
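
For reference, an MO invocation for a TensorFlow SSD frozen graph from that era would look roughly like the sketch below. It is only an illustration: the install path, frozen_inference_graph.pb, pipeline.config, and the choice of ssd_v2_support.json are placeholders that depend on the actual model being converted.

# Illustrative reconversion of a TensorFlow SSD frozen graph with the
# previously missing flag. All paths are placeholders for the poster's actual
# files and OpenVINO install location; ssd_v2_support.json assumes an SSD v2 model.
import subprocess

MO_DIR = "/opt/intel/openvino/deployment_tools/model_optimizer"

subprocess.run([
    "python3", f"{MO_DIR}/mo_tf.py",
    "--input_model", "frozen_inference_graph.pb",
    "--tensorflow_object_detection_api_pipeline_config", "pipeline.config",
    # The flag that was missing from the other poster's command:
    "--tensorflow_use_custom_operations_config",
    f"{MO_DIR}/extensions/front/tf/ssd_v2_support.json",
    # FP16 IR, which is what the Myriad plugin expects on both sticks:
    "--data_type", "FP16",
], check=True)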

Hope it helps,

Thanks,

Shubha
