Intel® Distribution of OpenVINO™ Toolkit

FP16 model inference: "memory data type alignment do not match"

Hou_y_1
Beginner

Hi

  I converted a caffemodel into IR with this command: "./ModelOptimizer -p FP16 -w $1 -d $2 -i -b 1 -f 1". Then I ran a sample with this model using the command line "./multi_output_sample -i armstrong_128.bmp -m my_model_fp16.xml -ni 100 -d GPU", but I get this error:

         [ INFO ] Start inference (1 iterations)
          terminate called after throwing an instance of 'std::logic_error'
          what():  memory data type alignment do not match
          Aborted (core dumped)

Looking at the sample code:

     for (int iter = 0; iter < FLAGS_ni; ++iter) {
            auto t0 = Time::now();
            // run one synchronous inference; on failure the plugin fills resp.msg
            status = enginePtr->Infer(inputBlobs, outputBlobs, &resp);
            auto t1 = Time::now();
            fsec fs = t1 - t0;
            ms d = std::chrono::duration_cast<ms>(fs);
            total += d.count();

            if (status != InferenceEngine::OK) {
                // rethrows the plugin's message, which is where
                // "memory data type alignment do not match" comes from
                throw std::logic_error(resp.msg);
            }
        }

        /** Show performance results **/
        std::cout << std::endl << "Average running time of one iteration: " << total / static_cast<double>(FLAGS_ni) << " ms" << std::endl;

The error shows that inference is failing. How can I solve this problem?

Anna_B_Intel
Employee

Hello,

Does it work if you run this model on CPU?
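For example, the same command line with only the device flag changed (and, if the CPU plugin rejects the FP16 IR, an IR converted with -p FP32 instead):

     ./multi_output_sample -i armstrong_128.bmp -m my_model_fp16.xml -ni 100 -d CPU

If it fails on CPU as well, the problem is more likely in the model or the IR than in the GPU plugin.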

We can help you further if you share your model with us. Would that be possible?
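In the meantime, the error text itself points to a data type (precision) mismatch between the blobs passed to Infer() and what the GPU plugin expects for an FP16 network. The sketch below is only an illustration, not tested against your release: class and method names differ between Inference Engine versions (for example setPrecision() vs. setInputPrecision()), and my_model_fp16.bin is assumed to be the weights file next to your .xml. It prints the precision each input is expected to have and forces it to FP32 so plain float buffers can be fed:

     #include <inference_engine.hpp>
     #include <iostream>

     int main() {
         using namespace InferenceEngine;

         // Read the IR produced by the Model Optimizer
         CNNNetReader reader;
         reader.ReadNetwork("my_model_fp16.xml");
         reader.ReadWeights("my_model_fp16.bin");   // assumed weights file name

         CNNNetwork network = reader.getNetwork();
         InputsDataMap inputsInfo = network.getInputsInfo();

         for (auto &item : inputsInfo) {
             // show what precision the plugin will expect for this input blob
             std::cout << "input '" << item.first << "' precision: "
                       << item.second->getPrecision().name() << std::endl;
             // forcing FP32 lets the sample keep feeding float data;
             // the plugin converts to FP16 internally
             item.second->setPrecision(Precision::FP32);
         }
         return 0;
     }

If the precision reported there does not match the precision of the blobs your sample allocates, that mismatch is the most likely source of the "memory data type alignment do not match" message.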

Best wishes, 

Anna
