Hi,
How can I use ExecutableNetwork::Export to save a compiled model to disk, and then load it back with InferencePlugin::ImportNetwork?
When I call the ExecutableNetwork Export API, it crashes.
Hi ISS,
I am trying to understand your question: are you asking how to export the model of an ExecutableNetwork?
For example, our documentation shows the command line for converting a Caffe model like this:
# python3 mo.py --input_model model-file.caffemodel
Here, model-file.caffemodel is the model file exported from a Caffe network. Are you asking how to generate a similar file for an ExecutableNetwork?
Mark
Hi Mark,
I am using the Intel CV SDK C++ API, and I have converted a caffemodel to produce an optimized Intermediate Representation (IR): *.xml and *.bin.
Now, I want to use the C++ API in the ie_executable_network.hpp header file:
/**
 * @brief Exports the current executable network so it can be used later in the Import() main API
 * @param modelFileName Full path to the location of the exported file
 * @param resp Optional: pointer to an already allocated object to contain information in case of failure
 */
void Export(const std::string &modelFileName) {
    CALL_STATUS_FNC(Export, modelFileName);
}
to export the optimized Intermediate Representation (IR) to a single file on disk, but it does not work.
Thanks.
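For reference, a minimal sketch of the intended Export/Import round trip, assuming the legacy Inference Engine C++ API shipped with the Intel CV SDK (the class and method names below are taken from that API; the device name "GNA", the file names, and the empty config maps are illustrative assumptions). Note that Export/Import is only implemented by some plugins, so other devices may fail at the Export call:

```cpp
// Sketch only: requires the Intel CV SDK / Inference Engine headers and
// libraries; it will not build without them.
#include <inference_engine.hpp>

using namespace InferenceEngine;

int main() {
    // Read the IR produced by the Model Optimizer (*.xml + *.bin)
    CNNNetReader reader;
    reader.ReadNetwork("model.xml");
    reader.ReadWeights("model.bin");

    // Assumption: targeting a plugin that implements Export (e.g. GNA)
    InferencePlugin plugin(PluginDispatcher({""}).getPluginByDevice("GNA"));

    // Compile the network, then export the compiled blob to disk
    ExecutableNetwork exec = plugin.LoadNetwork(reader.getNetwork(), {});
    exec.Export("compiled_model.blob");

    // Later (possibly in another process): import the compiled blob
    // directly, skipping IR parsing and compilation
    ExecutableNetwork imported = plugin.ImportNetwork("compiled_model.blob", {});
    InferRequest request = imported.CreateInferRequest();
    return 0;
}
```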
sss, lss wrote:
Hi Mark,
I am using the Intel CV SDK C++ API, and I have converted a caffemodel to produce an optimized Intermediate Representation (IR): *.xml and *.bin.
Now, I want to use the C++ API in the ie_executable_network.hpp header file:
/**
 * @brief Exports the current executable network so it can be used later in the Import() main API
 * @param modelFileName Full path to the location of the exported file
 * @param resp Optional: pointer to an already allocated object to contain information in case of failure
 */
void Export(const std::string &modelFileName) {
    CALL_STATUS_FNC(Export, modelFileName);
}
to export the optimized Intermediate Representation (IR) to a single file on disk, but it does not work.
Thanks.
The same thing happens to me.
However, the Export function is used in the speech_sample.
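That is consistent with Export being plugin-specific: speech_sample targets the GNA plugin, which does implement Export, while other plugins of that SDK generation may not. A hedged sketch (assuming the legacy Inference Engine API, where CALL_STATUS_FNC turns a failure status into a C++ exception) of how to check whether your plugin supports Export at all, rather than crashing:

```cpp
// Sketch only: requires the Intel CV SDK / Inference Engine headers.
#include <exception>
#include <iostream>
#include <inference_engine.hpp>

void try_export(InferenceEngine::ExecutableNetwork &exec) {
    try {
        exec.Export("compiled_model.blob");
        std::cout << "Export succeeded\n";
    } catch (const std::exception &e) {
        // Plugins that do not implement Export typically report an error
        // here (e.g. a NOT_IMPLEMENTED status); the GNA plugin used by
        // speech_sample is one that does implement it.
        std::cerr << "Export failed: " << e.what() << "\n";
    }
}
```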