Community
Moderator
222 Views

Deep Neural Network extensions for Intel MKL

Deep neural network (DNN) applications are growing in importance in various areas, including internet search engines, retail, and medical imaging. Intel recognizes the importance of these workloads and is developing software solutions to accelerate these applications on Intel Architecture; these solutions will become available in future versions of Intel® Math Kernel Library (Intel® MKL) and Intel® Data Analytics Acceleration Library (Intel® DAAL).

While we work on the new functionality, we have published a series of articles demonstrating DNN optimizations with the Caffe framework and the AlexNet topology:

Technical details on the optimizations we made in the technical previews are available on the blog of Pradeep Dubey of Intel Labs:

You can also take a sneak peek at the Intel MKL DNN extensions' programming model and functionality using the Deep Neural Network Technical Preview for Intel® Math Kernel Library (Intel® MKL). The feedback we receive on this preview is essential to shaping future Intel MKL products.

 

0 Kudos
5 Replies
New Contributor II

How do we actually get hold of this program -- I have a use for it?


Hi John,

You might have seen that this was officially released this month: https://software.intel.com/en-us/forums/intel-math-kernel-library/topic/684829

You can find the code on GitHub (https://github.com/01org/mkl-dnn) and more information at https://01.org/mkl-dnn.
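For anyone landing here later, a typical out-of-source CMake build of the GitHub sources looks roughly like this. This is a sketch, not the official instructions: the repository's README is authoritative, and the exact steps (for example, a helper script that downloads the required Intel MKL small libraries) may differ between releases.

```shell
# Hypothetical build sketch based on the repository's generic CMake layout;
# consult the README at https://github.com/01org/mkl-dnn for the exact steps.
git clone https://github.com/01org/mkl-dnn.git
cd mkl-dnn
mkdir -p build && cd build
cmake ..                 # configure; may require extra options on some platforms
make -j"$(nproc)"        # build the library
make test                # optionally run the bundled unit tests
```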

Best

Beginner

Good evening,

After a few hours of trying to fix the CMake errors, I have just discovered that it is currently not possible to build MKL-DNN for 64-bit Windows.

That should be clearly specified in the release notes.

Regards,

Jean


Hi,

How can we add new layers to a network (CNN) and see whether they work with MKL?
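One common way to check whether a layer is actually being executed by the MKL-DNN primitives is the library's verbose mode, enabled through an environment variable. Note this is an assumption about the library version in use: verbose mode was added to MKL-DNN in later releases (the variable was `MKLDNN_VERBOSE`, renamed `DNNL_VERBOSE` in oneDNN), and the binary name `./your_app` below is a placeholder for your own program.

```shell
# Hypothetical usage: with verbose mode on, each executed primitive
# (convolution, pooling, etc.) prints a line to stdout, so a layer that
# falls back to a reference implementation is easy to spot.
MKLDNN_VERBOSE=1 ./your_app
```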

Regards,

Tejaswini

Valued Contributor II

>>...While we are working on new functionality we published a series of articles demonstrating DNN optimizations with Caffe framework and AlexNet topology...

I would simply like to note that when a neural network is implemented in the classic way, only ~400 lines of C/C++ code are needed. That is what I have for a simple pattern-recognition software subsystem which is easily portable to any system (Linux, Windows, Android, embedded, MS-DOS, etc.), has no dependencies on 3rd-party software, and is very small. I have absolutely no time for all these "monsters" with dependencies on too many 3rd-party libraries, like Caffe, AlexNet, etc.