Openvino assert when loading fp16 model in gpu

I'm trying to run an FP16 model on the GPU. The call to InferencePlugin::LoadNetwork(CNNNetwork network, const std::map<std::string, std::string> &config) asserts with the message "program creation failed: Output layout not calculated".

Running the FP32 model on the CPU works fine. Could you please guide me as to what may be going wrong?
