Hi all,
I am looking for a tutorial or code snippets for creating a custom extension and custom layer for the Inference Engine. I want to run a custom layer on a device other than the CPU. I tried following the hello_shape_infer_ssd sample code, but couldn't get it working. Is there any other implementation of this (in C++ or Python)?
Please study deployment_tools\model_optimizer\extensions\front\caffe\argmax_ext.py
and also deployment_tools\model_optimizer\extensions\ops\argmax.py
and also deployment_tools\inference_engine\src\extension\ext_argmax.cpp
Also look at the Caffe ArgMax layer documentation: http://caffe.berkeleyvision.org/tutorial/layers/argmax.html
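For orientation, the front extractor in argmax_ext.py roughly follows the pattern sketched below. This is a structural sketch, not the toolkit source: the `FrontExtractorOp` base class here is a stand-in for the real import from the `mo` package, `collect_attributes` and `node.attrs` are illustrative helpers I introduce for the example, and the parameter names (`out_max_val`, `top_k`, `axis`) are the fields Caffe's ArgMax layer defines per the documentation linked above.

```python
# Structural sketch of a Model Optimizer front extractor for Caffe's
# ArgMax layer, modeled on the layout of argmax_ext.py mentioned above.
# NOTE: this is an illustration, not the toolkit source. The real file
# imports its base class and op class from the 'mo' package; a minimal
# stand-in base class is used here so the snippet runs standalone.

class FrontExtractorOp:
    """Stand-in for the Model Optimizer extractor base class (assumption)."""
    pass


def collect_attributes(param):
    """Read the layer's parameters from the parsed Caffe prototxt message.

    out_max_val, top_k and axis are the fields Caffe's ArgMaxParameter
    defines (see the Caffe ArgMax layer documentation linked above).
    """
    return {
        'out_max_val': int(getattr(param, 'out_max_val', 0)),
        'top_k': getattr(param, 'top_k', 1),
        'axis': getattr(param, 'axis', None),
    }


class ArgMaxFrontExtractor(FrontExtractorOp):
    op = 'ArgMax'    # must match the layer type string in the prototxt
    enabled = True   # extractors with enabled=False are skipped

    @staticmethod
    def extract(node):
        # node.pb holds the parsed caffe.LayerParameter for this layer
        attrs = collect_attributes(node.pb.argmax_param)
        # Keep only the attributes the prototxt actually set
        attrs = {k: v for k, v in attrs.items() if v is not None}
        # The real extractor hands these attrs over to the op class
        # (defined in extensions/ops/argmax.py); stored directly here
        # to keep the sketch self-contained.
        node.attrs = attrs
        return ArgMaxFrontExtractor.enabled
```

The CPU-side compute kernel for the same layer lives in ext_argmax.cpp; to target a device other than the CPU, the extractor side stays the same and only the kernel implementation changes.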
An additional tutorial on custom layers can be found here: https://github.com/david-drew/OpenVINO-Custom-Layers
