
5-D Tensor, offloading computations to Tensorflow

We have a 5-D tensor in our graph, but 5-D operations are not supported by OpenVINO; specifically, in our case, transpose and concat.

So we decided to offload the computations to TensorFlow.

We followed the procedure provided at https://software.intel.com/en-us/articles/OpenVINO-ModelOptimizer#offloading-computations-tensorflow and have managed to generate libtensorflow_call_layer.so.

 

Our application is in Python, and we are using the integrated GPU.

We want to know how to provide the extension (libtensorflow_call_layer.so) to the Inference Engine plugin (IEPlugin).

Thanks in advance,

Sridhar

 


We noticed the following statement in the documentation: "The custom layer supports inference only on a CPU, not on Intel® Integrated Graphics or on Intel® FPGA."

What is the plan to support 5-D tensors, or to offload unsupported ops to TensorFlow, while running the Inference Engine on Integrated Graphics?

For CPU, is this the correct way of providing the extension?

plugin.add_cpu_extension('/home/u1/tensorflow/libtensorflow_call_layer.so')
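For context, a minimal sketch of how a call like this could be wrapped, assuming the classic IEPlugin Python API where add_cpu_extension(path) takes the path to the .so; the helper name and the path check below are hypothetical, not part of OpenVINO:

```python
import os

def load_cpu_extension(plugin, ext_path):
    """Attach a custom-layer .so to a plugin object.

    Hypothetical helper: `plugin` is assumed to expose
    add_cpu_extension(path), as the classic IEPlugin API does.
    Fails early if the extension library is missing on disk.
    """
    if not os.path.isfile(ext_path):
        raise FileNotFoundError(ext_path)
    plugin.add_cpu_extension(ext_path)
    return plugin
```

With a real OpenVINO install this would be called as load_cpu_extension(IEPlugin(device="CPU"), "/home/u1/tensorflow/libtensorflow_call_layer.so").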

Thanks,

Sridhar.

 


Hi Sridhar,

To add a custom layer in your application, you need the following lines of code:

// Select the CPU plugin from the plugin directory
InferencePlugin plugin = PluginDispatcher({"path to plugin directory", ""}).getPluginByDevice("CPU");
// Load the custom layer library and register it with the plugin
IExtensionPtr extension_ptr = make_so_pointer<IExtension>("path to custom layer library");
plugin.AddExtension(extension_ptr);

I will reply back with what the support will be for 5-D tensors and for offloading parts of a model to native TensorFlow to be run on Integrated Graphics.

Kind Regards,

Monique Jones


Hi Monique,

Thank you very much for your reply. Could you please provide the equivalent Python code for what you have posted?

We tried a few variants, but could not get it to work.

Best Regards,

Sridhar.

 
