Hi,
I'm using the Intel MKLML small libraries shipped with MKL-DNN [1] to build a custom inference engine for neural networks. It's great to get MKL performance in such a small package.
However, it appears that the CNR support functions [2] are not part of this library. I'm currently using an internal function to get the current CNR branch:
extern "C" { int mkl_serv_cbwr_get_auto_branch(); }
Are there plans to add this set of functions? More generally, how are functions selected for inclusion in MKLML (for example, vsSin and vsCos are not included either)?
Thanks,
Guillaume
[1] https://github.com/intel/mkl-dnn/releases/download/v0.14/mklml_lnx_2018.0.3.20180406.tgz
[2] https://software.intel.com/en-us/mkl-developer-reference-c-conditional-numerical-reproducibility-control
Hi Guillaume,
The Intel MKL small libraries are a special distribution created to simplify the adoption of Intel MKL by open source deep learning frameworks and to resolve the dependency of Intel MKL-DNN on the SGEMM function. The set of functionality included in the small libraries is essentially defined by the functionality used in Caffe, TensorFlow, MXNet, and other frameworks.
Intel MKL has a standard way to create a shared object containing a subset of its functionality, called the custom shared object builder. This is the tool used to build the Intel MKL small libraries. If you want to tailor the functionality in the library to your needs, you can use the custom shared object builder to produce a shared object or a DLL and then redistribute it with your application.
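For illustration, an invocation of the builder could look roughly like this. This is a sketch only: the builder directory, the make target, and the exact spelling of entry-point names depend on your Intel MKL version and platform (consult the builder's own documentation); `functions_list` and `libmkl_custom` are names chosen here for the example.

```shell
# Assumes MKLROOT points at an Intel MKL installation; the builder
# normally lives under tools/builder.
cd "${MKLROOT}/tools/builder"

# List the entry points you need, one per line -- e.g. the CNR control
# functions and vector math routines mentioned above. Check the sample
# list shipped with the builder for the expected name spelling.
cat > functions_list <<EOF
mkl_cbwr_get
mkl_cbwr_set
mkl_cbwr_get_auto_branch
vssin
vscos
EOF

# Build a custom shared object containing only those functions.
# libintel64 targets 64-bit Linux; use the target matching your platform.
make libintel64 export=functions_list name=libmkl_custom
```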
Vadim
Oh, I did not know about this tool. That's perfect!
Thanks,
Guillaume