SPaul19
Innovator

Getting the inference time for NCS and OpenVINO

Hello. 

In order to present a comparative study between OpenVINO (with an NCS 1) and the standard versions of Caffe and TensorFlow, I have been using Python's standard `time` module in the following way:

```python
import time

start_time = time.time()

# inference code for NCS

print("Time taken by NCS:", time.time() - start_time)
```

I would like to know whether this is the right way to draw such a comparison, or whether there is another method (NCSDK had one) specifically suited to this.

 

Thanks in advance. 

Shubha_R_Intel
Employee

Dear Sayak,

Your approach is solid. In fact, if you look at the main.cpp for the classification_sample, you will see the following:

```cpp
double total = 0.0;
/** Start inference & calc performance **/
for (size_t iter = 0; iter < FLAGS_ni; ++iter) {
    auto t0 = Time::now();
    infer_request.Infer();
    auto t1 = Time::now();
    fsec fs = t1 - t0;
    ms d = std::chrono::duration_cast<ms>(fs);
    total += d.count();
}

std::cout << std::endl << "total inference time: " << total << std::endl;
std::cout << "Average running time of one iteration: " << total / static_cast<double>(FLAGS_ni) << " ms" << std::endl;
std::cout << std::endl << "Throughput: " << 1000 * static_cast<double>(FLAGS_ni) * batchSize / total << " FPS" << std::endl;
std::cout << std::endl;
```

 

Note that FLAGS_ni is the number of iterations, which you pass in on the command line. batchSize is the N in [NCHW], where N = batch size, C = number of channels, H = height, W = width.
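The same averaging scheme is easy to reproduce in Python. Here is a minimal sketch (my own, not part of any SDK): `infer` stands for whatever call runs one inference on your device, and `time.perf_counter()` is used because it has finer resolution than `time.time()`:

```python
import time

def benchmark(infer, n_iters=100, batch_size=1):
    """Average per-iteration latency (ms) and throughput (FPS),
    mirroring the loop in classification_sample's main.cpp."""
    total_ms = 0.0
    for _ in range(n_iters):
        t0 = time.perf_counter()
        infer()  # one inference call on the device
        t1 = time.perf_counter()
        total_ms += (t1 - t0) * 1000.0
    avg_ms = total_ms / n_iters
    fps = 1000.0 * n_iters * batch_size / total_ms
    return avg_ms, fps

# usage sketch with a placeholder workload:
avg_ms, fps = benchmark(lambda: time.sleep(0.001), n_iters=10)
print(f"Average running time of one iteration: {avg_ms:.3f} ms")
print(f"Throughput: {fps:.1f} FPS")
```

In practice you may also want to run a few untimed warm-up iterations first, since the first inference often includes one-time setup cost.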

Hope it helps,

Thanks,

Shubha

SPaul19
Innovator

I see. I am on the right track, then. I just wanted to know if there exists anything specific for this, as there was in NCSDK2.

Shubha_R_Intel
Employee

Dear Sayak,

Keep in mind that NCSDK2 has been replaced by OpenVINO. Please use OpenVINO from now on.

Thanks!

Shubha
