Application Acceleration With FPGAs
Programmable Acceleration Cards (PACs), DCP, DLA, Software Stack, and Reference Designs

Is there any detailed source code for LoadNetwork in the Inference Engine?

YJing8
Novice

ExecutableNetwork Core::LoadNetwork(CNNNetwork network, const std::string & deviceName, const std::map<std::string, std::string> & config)

This function returns "_impl->GetCPPPluginByName(deviceName_).LoadNetwork(network, config_);"

Is there any more detailed source code for LoadNetwork in the Inference Engine?

JohnT_Intel
Employee

Hi,

 

LoadNetwork is used to load (compile) the network for your application on a target device. You may refer to https://docs.openvinotoolkit.org/latest/classInferenceEngine_1_1InferencePlugin.html#a0ca00d832aa35e... for details on the source code.
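For context, here is a minimal sketch of how Core::LoadNetwork is typically called, assuming the 2020-era OpenVINO Inference Engine C++ API. The model path and device name are placeholders, not values from this thread, and this requires the OpenVINO runtime to build:

```cpp
// Sketch: loading a network with InferenceEngine::Core (requires OpenVINO).
// "model.xml" and "CPU" are example placeholders.
#include <inference_engine.hpp>
#include <map>
#include <string>

int main() {
    InferenceEngine::Core core;

    // Parse the model's IR file into a CNNNetwork object.
    InferenceEngine::CNNNetwork network = core.ReadNetwork("model.xml");

    // Compile the network for the chosen device. Internally, Core looks up
    // the device plugin and delegates to it, which is what the line
    // _impl->GetCPPPluginByName(deviceName_).LoadNetwork(network, config_)
    // in the question does.
    std::map<std::string, std::string> config = {};
    InferenceEngine::ExecutableNetwork execNet =
        core.LoadNetwork(network, "CPU", config);

    // The compiled network is then used to create inference requests.
    InferenceEngine::InferRequest request = execNet.CreateInferRequest();
    return 0;
}
```

The device-specific compilation (graph optimization, memory allocation) happens inside the plugin's own LoadNetwork implementation, so the detailed source lives in each device plugin rather than in Core itself.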
