AI Tools from Intel
Find answers to your toolkit installation, configuration, and get-started questions.

oneDNN compilation failing

liam_murphy
Beginner
685 Views

Not sure if this is the right group, so apologies in advance.

 

I’ve been trying to build the Intel MLPerf container from https://github.com/mlcommons/inference_results_v4.0, but am running into some problems.

I can work around most of them, except the oneDNN compile issue; I have no idea how to fix those compile errors right now.

 

  • Intel’s old conda channel has disappeared. I modified the Dockerfile to point at the new channel and take mkl and intel-openmp from there:

RUN /opt/conda/bin/conda config --add channels https://software.repos.intel.com/python/conda
RUN /opt/conda/bin/conda install -y -c https://software.repos.intel.com/python/conda mkl==2023.1.0 \
                                          mkl-include==2023.1.0 \
                                          intel-openmp==2023.1.0
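
For anyone hitting the same thing, a quick sanity check I added (just generic conda commands, not part of the MLPerf scripts) to confirm the packages really resolve from the new channel:

# list the configured channels and confirm the Intel channel is present
RUN /opt/conda/bin/conda config --show channels
# confirm mkl, mkl-include and intel-openmp were installed at the pinned versions
RUN /opt/conda/bin/conda list | grep -E '^(mkl|mkl-include|intel-openmp)'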

 

 

  • The build of LLVM failed. This was caused by CONDA_PREFIX not being defined prior to the LLVM build. I added ENV CONDA_PREFIX "/opt/conda" just before ARG PYTORCH_VERSION=v1.12.0, as shown below.
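
For context, this is roughly what that part of my modified Dockerfile looks like now (the ARG line is the existing one from the upstream Dockerfile; only the ENV line is my addition):

# make sure CONDA_PREFIX is defined before the LLVM build steps run
ENV CONDA_PREFIX "/opt/conda"
# existing line in the upstream Dockerfile
ARG PYTORCH_VERSION=v1.12.0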

The final problem is:

  • The oneDNN build fails. Specifically, the code under … /code/bert-99/pytorch-cpu/mlperf_plugins/csrc does not compile; it looks like the AVX-512 types are not known at compile time. See the attached file.
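
To narrow it down, I'd try a minimal probe outside the MLPerf tree first (my own sketch, assuming g++ is the compiler in the build image) to check whether the toolchain accepts AVX-512 intrinsic types at all:

# tiny translation unit that only uses an AVX-512 type and intrinsic
cat > /tmp/avx512_probe.cpp <<'EOF'
#include <immintrin.h>
int main() {
    __m512i v = _mm512_set1_epi32(42);  // fails to compile if AVX-512 support is missing
    (void)v;
    return 0;
}
EOF
# if this fails, the compiler/flags are the problem rather than the MLPerf plugin code
g++ -mavx512f -c /tmp/avx512_probe.cpp -o /tmp/avx512_probe.o && echo "AVX-512 compiles OK"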

 

NOTE: I am building the container under WSL
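
Since the build is under WSL, it may also be worth checking whether the WSL VM actually exposes AVX-512 to the guest (this is a runtime CPU check, separate from the compile-time issue above):

# empty output means no AVX-512 feature flags are visible inside WSL
grep -o 'avx512[a-z_]*' /proc/cpuinfo | sort -u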

 

 

 

3 Replies
Ying_H_Intel
Employee
535 Views

Hi Liam_murphy,

 

Which Windows version and hardware are you working on?

We haven't tried to build MLPerf from WSL, but if possible, could you try the MLPerf docker image directly?

You may follow the guide: https://www.intel.com/content/www/us/en/developer/articles/guide/get-started-mlperf-intel-optimized-docker-images.html


 

The intel/intel-optimized-pytorch:mlperf-inference-4.1-bert image is ready on https://hub.docker.com/r/intel/intel-optimized-pytorch/tags.
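
For reference, a minimal pull-and-run of that image would look something like this (just standard docker usage; the actual MLPerf run steps are in the guide above):

# pull the published BERT image and open an interactive shell in it
docker pull intel/intel-optimized-pytorch:mlperf-inference-4.1-bert
docker run -it --rm intel/intel-optimized-pytorch:mlperf-inference-4.1-bert /bin/bash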

Please feel free to let us know the results.

 

Thanks

Ying

Louie_T_Intel
Moderator
299 Views

Hi,

Were there any issues trying the docker images following the document?

Feel free to send an email to the address mentioned in the doc for further support.


liam_murphy
Beginner
265 Views

No problems with the docker image! Thanks
