Intel® Distribution of OpenVINO™ Toolkit
Community assistance about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

MULTI/HETERO plugins with CPU-only custom layer

aag
Novice
487 Views

Consider the following code:

    IExtensionPtr inPlaceExtension;
    inPlaceExtension = std::make_shared<YoloV3InPlaceExtension>();
    m_ie.AddExtension(inPlaceExtension, "CPU");
    m_network.AddExtension(inPlaceExtension);


    for ( auto it = m_network.begin(); it != m_network.end(); it++ ) {
        auto& layer = *it;
        string affinity = "GPU,CPU";
        if ( layer->type == CUSTOM_YOLOV3DETECTION_OUTPUT_TYPE ) {
            affinity = "CPU";
        }
        m_network.getLayerByName(layer->name.c_str())->affinity = affinity;
    }

I believe this creates a custom layer and sets its affinity to CPU only (letting the plugin select a device for the other layers). However, later on, either of the following calls causes an error:

    m_executableNetwork = m_ie.LoadNetwork(m_network, "MULTI:GPU,CPU", {});
    m_executableNetwork = m_ie.LoadNetwork(m_network, "HETERO:CPU,GPU", {});

The former fails with:

20/02/21-12:11:48.694 E <5548> [root] Exception in loadNetwork into MULTI:CPU,GPU: Unknown Layer Type: Yolov3DetectionOutput

The latter causes the error:

20/02/21-12:07:39.237 E <3532> [root] Exception in loadNetwork into HETERO:CPU,GPU: Network passed to LoadNetwork has affinity assigned, but some layers eg:
(Name:conv0, Type: Convolution) were not assigned to any device.
It might happen if you assigned layers amnually and missed some layers or
if you used some automatic assigning mode which decided that these layers are not
supported by any plugin

So, is there a way to make a network with a CPU-only custom layer work with either the MULTI or HETERO plugin?

4 Replies
aag
Novice

Update: loading the network into the HETERO plugin works if I don't set the affinities at all. That is not the case with MULTI. Any pointers on how to better understand the distinction between the two, and their respective interoperability with custom layers?

Max_L_Intel
Moderator

Hello, Alex A.

Please kindly check replies in the following thread - https://software.intel.com/en-us/forums/intel-distribution-of-openvino-toolkit/topic/815219

Hope this helps.
Thanks.

aag
Novice

So, to follow up: does this mean that with MULTI every layer of the network is expected to be supported on every specified device? In that case it would never work with a CPU-only extension, unless CPU is the only device being used.

Max_L_Intel
Moderator

Hello Alex A.

No, it is not necessarily the case that every layer must run on every device. The MULTI plugin automatically assigns inference requests to the available computational devices.
Also, please take into account that not all layers are supported by all devices. You can find more information about this here: https://docs.openvinotoolkit.org/latest/_docs_IE_DG_supported_plugins_Supported_Devices.html#supported_layers

If a layer is supported by the CPU only, or by some other device that is not present in your system, it will be processed by the CPU.
