I'm trying to run an FP16 model on the GPU. An assert fires when I call InferencePlugin::LoadNetwork(CNNNetwork network, const std::map<std::string, std::string> &config). The assert message is "program creation failed: Output layout not calculated".
The same model runs fine as FP32 on the CPU. Could you please guide me as to what may be going wrong?