FPGA, SoC, And CPLD Boards And Kits
FPGA Evaluation and Development Kits

Inference timing for FPGA longer than expected

Yeoh
Beginner

Hi, I am performing classification with two models on a single image at the edge, using CPU, GPU, and HETERO:FPGA,CPU, aiming to compare inference speed among these three devices. I calculated the inference timing based on the provided sample code.

Since I am evaluating two models in one run, I obtained an unexpected result: the inference time on FPGA is the longest for model 1 but the lowest for model 2.

Initially I suspected the models themselves were affecting the inference timing, but when I swapped the order and evaluated model 2 first, then model 1, the result was the same: whichever model is evaluated first has a significantly higher inference time on FPGA, where it is expected to be the lowest among the three devices. Is there a cause or explanation for this behavior?

 

For reference, the inference code and the inference results are shown below.

Inference part code:

import time
import logging as log  # logger configured earlier, as in the sample code

# exec_net/exec_net2, input_blob/input_blob2 and images are prepared earlier
inf_start_1 = time.time()
res = exec_net.infer(inputs={input_blob: images})
inf_time_1 = (time.time() - inf_start_1) * 1000
log.info("Inference time Model 1 (ms): {:.3f}".format(inf_time_1))

inf_start_2 = time.time()
res2 = exec_net2.infer(inputs={input_blob2: images})
inf_time_2 = (time.time() - inf_start_2) * 1000
log.info("Inference time Model 2 (ms): {:.3f}".format(inf_time_2))
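One way to check whether the gap is one-time setup cost rather than a property of the model itself is to time several repeated inferences and compare the first run against the steady-state average. The helper below is a generic sketch of that idea (not from the original post); `run_inference` is a hypothetical stand-in for a call like `exec_net.infer(...)`:

```python
import time

def profile_first_vs_rest(run_inference, n_runs=10):
    """Time n_runs calls; return (first_ms, avg_rest_ms).

    A first_ms much larger than avg_rest_ms suggests one-time setup
    overhead on the first call rather than a per-inference cost.
    """
    timings = []
    for _ in range(n_runs):
        start = time.perf_counter()
        run_inference()
        timings.append((time.perf_counter() - start) * 1000)
    first_ms = timings[0]
    avg_rest_ms = sum(timings[1:]) / len(timings[1:])
    return first_ms, avg_rest_ms

# Example with a dummy workload in place of exec_net.infer:
first, rest = profile_first_vs_rest(lambda: sum(range(10000)))
```

If the first call for whichever model runs first is the outlier, the measured time is dominated by initialization rather than the model.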

 

Inference result:

[Chart 1: Yeoh_0-1623663950877.png]

[Chart 2: Yeoh_1-1623663965913.png]

I expected the first graph to show the same trend as the second graph.
