I tried running benchmark app for the Mobilenet v1 SSD and it gave me the following latency values for MYRIAD.
Count: 1119 iterations
Duration: 60080.65 ms
Latency: 52.8842 ms
Throughput: 18.91 FPS
However, when I check the performance counters (-pc option), I find that the device execution time for the inference requests is around double the reported latency. I have noticed the same thing for all the classification models as well.
For Mobilenet v1 SSD, the device execution time is 108.26 ms.
Shouldn't the latency be the same as the device execution time? Please clarify.
Latency in the benchmark app code is just the median of the observed execution times, but the execution time I observe most often is 108.26 ms, not 52.8 ms.
Dear V B, Anakha,
According to the benchmark_app doc
Reported latency value is calculated as a median value of all collected latencies.
It is not "just the median of the observed execution times," as your original statement put it.
Also kindly see this note from the same document:
Reported latency value is calculated as a median value of all collected latencies. Reported throughput value is reported in frames per second (FPS) and calculated as a derivative from:
- Reported latency in the Sync mode
- The total execution time in the Async mode
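To illustrate the two derivations above, here is a minimal sketch. The latency list is made up for the example; the 52.8842 ms latency, 1119 iterations, and 60080.65 ms duration are the numbers from the original post:

```python
import statistics

# Hypothetical per-request latencies in ms (illustrative only),
# standing in for the values benchmark_app collects per inference.
latencies_ms = [52.1, 52.8, 52.9, 53.0, 53.2]

# Reported latency = median of all collected latencies
latency_ms = statistics.median(latencies_ms)  # 52.9

# Sync mode: throughput is derived from the reported (median) latency.
# Using the latency value from the post above:
fps_sync = 1000.0 / 52.8842  # ~18.91 FPS

# Async mode: throughput comes from the total execution time instead.
fps_async = 1119 * 1000.0 / 60080.65  # ~18.62 FPS

print(latency_ms, round(fps_sync, 2), round(fps_async, 2))
```

Note that 1000 / 52.8842 ≈ 18.91, which matches the throughput reported in the post, consistent with the Sync-mode derivation.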
Please read through this document and feel free to post further questions on this forum.