I'm trying to run an FP16 model on the GPU, and it hits an assert when calling InferencePlugin::LoadNetwork(CNNNetwork network, const std::map<std::string, std::string> &config). The assert message is "program creation failed: Output layout not calculated".
The same model runs fine as FP32 on the CPU. Could you please guide me as to what may be going wrong?