Hi! I am developing a vision-related DNN using the OpenVINO toolkit and Caffe.
I am posting because of one question: I am training the same DNN model on several different sets of training data.
Although the DNN model, the Model Optimizer parameters, and the test platform are always the same, the inference speed on real data differs by up to 2x depending on the training data.
I would expect it to be about the same in the same environment. What could be the reason?
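For reference, this is roughly how I measure inference speed. It is only a minimal sketch of my timing loop; the model.xml / model.bin paths, the CPU device, the input shape, and the random input are placeholders for my actual setup:

```python
import time
import numpy as np
from openvino.inference_engine import IECore

# Load the IR produced by the Model Optimizer (paths are placeholders).
ie = IECore()
net = ie.read_network(model="model.xml", weights="model.bin")
exec_net = ie.load_network(network=net, device_name="CPU")

input_blob = next(iter(net.input_info))
n, c, h, w = net.input_info[input_blob].input_data.shape

# Time repeated synchronous inferences on a fixed input so that only the
# network itself, not the data pipeline, is being measured.
dummy = np.random.rand(n, c, h, w).astype(np.float32)
exec_net.infer({input_blob: dummy})  # warm-up run

runs = 100
start = time.perf_counter()
for _ in range(runs):
    exec_net.infer({input_blob: dummy})
print(f"avg latency: {(time.perf_counter() - start) / runs * 1000:.2f} ms")
```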