Intel® Distribution of OpenVINO™ Toolkit
Community assistance for the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

yolov3_tiny_tf run_inference_stream problem

fpga_guyx
Beginner

I have successfully completed the Arria 10 SoC demo project resnet-50-tf on the Arria 10 SoC devkit (tool versions: Intel FPGA AI Suite 2025.1 and OpenVINO 2024.6). I used the precompiled Arria 10 WIC image.

Arria 10 SoC devkit:

https://www.altera.com/products/devkit/a1jui0000049utgmam/arria-10-sx-soc-development-kit

SoC Demo project:

https://www.intel.com/content/www/us/en/docs/programmable/848957/2025-1/soc-design-example-prerequisites.html

Then, I compiled the yolo_v3_tiny_tf model with no folding, targeting the devices FPGA and CPU, to obtain the .bin file. When I run ./run_inference_stream.sh, I get this error:

root@arria10:~/app# ./run_inference_stream.sh
Runtime version check is enabled.
[ INFO ] Architecture used to compile the imported model: A10_Performance
Using licensed IP
Read hash from bitstream ROM...
Read build version string from bitstream ROM...
Read arch name string from bitstream ROM...
Runtime arch check is enabled. Check started...
Runtime arch check passed.
Runtime build version check is enabled. Check started...
Runtime build version check passed.
Exception from src/inference/src/cpp/core.cpp:184:
Exception from src/inference/src/dev/plugin.cpp:73:
Exception from src/inference/src/dev/plugin.cpp:73:
Exception from src/plugins/intel_cpu/src/utils/serialize.cpp:145:
[CPU] Could not deserialize by device xml header.


How can I solve this problem? Thank you.


Note:

root@arria10:~/app# ls
build_os.txt libopenvino_auto_batch_plugin.so
build_version.txt libopenvino_auto_plugin.so
categories.txt libopenvino_c.so
dla_benchmark libopenvino_c.so.2024.6.0
hetero_plugin libopenvino_c.so.2460
image_streaming_app libopenvino_ir_frontend.so
libcoreDLAHeteroPlugin.so libopenvino_ir_frontend.so.2024.6.0
libcoreDlaRuntimePlugin.so libopenvino_ir_frontend.so.2460
libformat_reader.so libopenvino_jax_frontend.so
libhps_platform_mmd.so libopenvino_jax_frontend.so.2024.6.0
libopencv_core.so.4.8.0 libopenvino_jax_frontend.so.2460
libopencv_core.so.408 libopenvino_pytorch_frontend.so
libopencv_highgui.so.4.8.0 libopenvino_pytorch_frontend.so.2024.6.0
libopencv_highgui.so.408 libopenvino_pytorch_frontend.so.2460
libopencv_imgcodecs.so.4.8.0 libopenvino_template_extension.so
libopencv_imgcodecs.so.408 libopenvino_tensorflow_lite_frontend.so
libopencv_imgproc.so.4.8.0 libopenvino_tensorflow_lite_frontend.so.2024.6.0
libopencv_imgproc.so.408 libopenvino_tensorflow_lite_frontend.so.2460
libopencv_videoio.so.4.8.0 plugins.xml
libopencv_videoio.so.408 results.txt
libopenvino.so run_image_stream.sh
libopenvino.so.2024.6.0 run_inference_stream.sh
libopenvino.so.2460 streaming_inference_app
libopenvino_arm_cpu_plugin.so
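[Editor's note] Not a confirmed diagnosis, but the final exception ("[CPU] Could not deserialize by device xml header") can indicate that the OpenVINO release the .bin was exported with differs from the runtime libraries deployed on the board. A minimal sketch for checking the deployed runtime version against the listing above; `APP_DIR` and `EXPECTED_VER` are illustrative assumptions, not part of the demo scripts:

```shell
# Hedged sketch: compare the OpenVINO runtime deployed on the board with
# the release the .bin was exported against on the compile host.
APP_DIR=${APP_DIR:-.}                   # assumption: directory from the ls output above
EXPECTED_VER=${EXPECTED_VER:-2024.6.0}  # assumption: release used with the compiler

# The fully versioned shared object encodes the runtime release,
# e.g. libopenvino.so.2024.6.0 -> 2024.6.0
runtime_ver=$(ls "$APP_DIR" | \
    sed -n 's/^libopenvino\.so\.\([0-9]\{1,\}\.[0-9]\{1,\}\.[0-9]\{1,\}\)$/\1/p')

echo "Deployed OpenVINO runtime: ${runtime_ver:-not found}"
if [ "$runtime_ver" = "$EXPECTED_VER" ]; then
    echo "Runtime matches the compile-side release ($EXPECTED_VER)"
else
    echo "Mismatch: .bin exported against $EXPECTED_VER, board has ${runtime_ver:-none}" >&2
fi
```

In the listing above the runtime is libopenvino.so.2024.6.0, which matches OpenVINO 2024.6, so if the versions agree, the next thing to rule out would be the exact compiler/runtime pairing documented for FPGA AI Suite 2025.1.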

2 Replies
Wan_Intel
Moderator

Hi FPGA_GUYX,

Thank you for reaching out to the OpenVINO™ community!


For your information, we received an update from the relevant team as follows:

The FPGA AI Suite is no longer under Intel®; it is now under a separate company, Altera®.

Therefore, we request that you post your queries in the Altera® Community.


Regards,

Wan


Wan_Intel
Moderator

Hi FPGA_GUYX,

Thank you for your question.


If you need additional information from Intel, please submit a new question, as this thread will no longer be monitored.

Regards,

Wan

