Intel® Distribution of OpenVINO™ Toolkit
Community support and discussions about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all things computer vision-related on Intel® platforms.

My model is based on MobileNet. The inference time on the FPGA platform is three times that on the CPU platform. Why?

li__lang
Beginner

Testing environment:

    OpenVINO version: 2018 R4

    inference engine sample: classification_sample

    parameters for running on the FPGA platform: -d HETERO:FPGA,CPU (IR precision: FP16)

    parameters for running on the CPU platform: -d CPU (IR precision: FP32)

    batch size: 1 to 32
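For reference, the two runs above correspond to invocations like the following. This is a sketch: the model and image file names are placeholders, and flags other than -d may vary between releases, so check the sample's -h output.

```shell
# CPU run: FP32 IR on the CPU plugin
./classification_sample -i cat.png -m mobilenet_fp32.xml -d CPU

# FPGA run: FP16 IR on the HETERO plugin, with CPU fallback
# for any layers the FPGA plugin does not support
./classification_sample -i cat.png -m mobilenet_fp16.xml -d HETERO:FPGA,CPU
```

Note that under HETERO:FPGA,CPU, layers unsupported by the FPGA plugin fall back to the CPU, and the resulting data transfers between devices can dominate the runtime for small models at batch size 1.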
