
DPCT Responds: Migration not necessary


I am attempting to migrate a set of CUDA files belonging to an application, using DPCT in Beta08. When I invoke dpct, it processes the CUDA files and emits some benign warnings, but in the end it exits without writing out any DPC++ equivalent files. These files clearly call CUDA functions, and removing the CUDA path would break the build. Without a proper migration, the DPC++ build would still need a path to a locally installed CUDA library, which defeats the purpose of our migration methodology. How can I get more detail about the reason for this behavior, and if there is an issue, how should I proceed?

This is on Ubuntu. In the snapshot below, I have elided the actual physical paths to the files to avoid confusion:

 

$ dpct --report-type=all --cuda-include-path=/usr/local/cuda-10.2/include -p compile_commands.json

 

Processing: ....../LoadBalancerGPU.cu
Processing: ....../ComputeThermoGPU.cu
Processing: ....../CommunicatorGPU.cu
Processing: ....../ParticleData.cu
Processing: ....../Integrator.cu
------------------APIS report--------------------
API name Frequency
-------------------------------------------------
----------Stats report---------------

File name, LOC migrated to DPC++, LOC migrated to helper functions, LOC not needed to migrate, LOC not able to migrate
....../Integrator.cu, 1, 0, 168, 0
....../ParticleData.cu, 1, 0, 402, 0
....../ComputeThermoGPU.cu, 1, 0, 686, 0
....../ParticleGroup.cu, 6, 0, 111, 0

Total migration time: 17207.371000 ms

-------------------------------------
dpct exited with code: 1 (Migration not necessary)


6 Replies
Moderator

Hi,


There could be multiple reasons for this kind of behavior.


If possible, could you attach your compile_commands.json file, your Makefile, and one of the CUDA source files that failed to migrate (with this warning)?


From the report I can see that "LOC migrated to helper functions" is zero for all CUDA files (strange!).



Thanks,

Rahul



Hi Rahul,

 

Before sending over the files you requested, I ran an experiment to make sure we would see the same thing after the transfer. I took a file, ran it through the tool, and noticed that I get an output file regardless of any migration. All four commands below generated individual output files in the dpct_output directory:

$ dpct --report-type=all --cuda-include-path=/usr/local/cuda-10.2/include -p compile_commands.json /home/farshad/proj/hoomd-blue/hoomd/LoadBalancerGPU.cu

$ dpct --report-type=all --cuda-include-path=/usr/local/cuda-10.2/include -p compile_commands.json /home/farshad/proj/hoomd-blue/hoomd/CommunicatorGPU.cu

$ dpct --report-type=all --cuda-include-path=/usr/local/cuda-10.2/include -p compile_commands.json /home/farshad/proj/hoomd-blue/hoomd/ParticleData.cu

$ dpct --report-type=all --cuda-include-path=/usr/local/cuda-10.2/include -p compile_commands.json /home/farshad/proj/hoomd-blue/hoomd/Integrator.cu

 

All of the above files are in the compile_commands.json compilation database. However, if I run dpct without specifying any particular file, I do not get anything. The command below fails to migrate or produce any file in the dpct_output directory:

dpct --report-type=all --cuda-include-path=/usr/local/cuda-10.2/include -p compile_commands.json

 

I am attaching the compilation database along with the four files I used for this exercise. Please note that running dpct on each individual file is not an option for me: the real compile_commands.json contains in excess of 200 files, so we need to identify a scalable solution.
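For reference, each entry in the compilation database is just a JSON record of one compiler invocation. A minimal sketch of what such a file looks like (the paths, flags, and file names here are made up for illustration, not taken from my actual project):

```json
[
  {
    "directory": "/home/user/proj/build",
    "command": "nvcc -c -I/usr/local/cuda-10.2/include /home/user/proj/src/Integrator.cu",
    "file": "/home/user/proj/src/Integrator.cu"
  },
  {
    "directory": "/home/user/proj/build",
    "command": "nvcc -c -I/usr/local/cuda-10.2/include /home/user/proj/src/nested/ParticleData.cu",
    "file": "/home/user/proj/src/nested/ParticleData.cu"
  }
]
```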

thanks

Farshad.

Moderator

Hi,

 

I totally agree, specifying one file at a time is pointless.

 

Generally, if you do not specify the files you wish to migrate, dpct should automatically migrate all the files and their dependencies listed in the json file. DPCT always checks for driver code (a file containing a main function) that uses these helper .cu files to build the executable, as described by your Makefile. If driver code is present, it also gets included in the json file when you execute "intercept-build make" (though it does not itself get migrated), and all the helper .cu files required by that driver code then get migrated.

 

In your case, due to the absence of driver code (containing a main function), DPCT concludes that these helper .cu files are not being utilized anywhere, so there is no need to migrate them. That is the reason you see the "Migration not necessary" warning.

 

To overcome this problem, my suggestion would be to use the command below for migration (assuming that your source files are present in the "src" directory):

 

 

dpct -p compile_commands.json --in-root=. --out-root=dpct_out --process-all src/*.cu src/*.cpp

 

 

This command ensures that all the files (ending in .cu and .cpp) get migrated to the --out-root directory, regardless of whether they require migration or not.

 

Hope this helps.

 

Regards,

Rahul

 


Hi Rahul,

 

Your proposed solution works to a certain extent. I had to modify it as follows:

dpct -p compile_commands.json --in-root=src --out-root=dpct_out --process-all src/*.cu src/*.cpp

 

However, I see two problems here:

1) The above command migrates every CUDA and C++ file in the "src" directory, even those not mentioned in compile_commands.json. I believe this is an artifact of the "--process-all" option. Naturally, it fails for the files not represented in compile_commands.json, since there are no valid command-line options recorded for them.

2) The above command does not traverse the directory hierarchy. The main "src" directory contains only CUDA files, but there are subdirectories with more CUDA and C++ files. Is there a trick to make it traverse the hierarchy for all of the mentioned file types? Again, collecting all the files into one directory is an ugly solution, since we are dealing with a massive database here.
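As far as I can tell, the shell glob itself is the limitation in problem 2, not dpct: src/*.cu only matches the top level, while find descends the whole tree. A quick self-contained demonstration (the throwaway directory and file names are made up):

```shell
# A glob like src/*.cu matches only the top level of a directory,
# while find descends into subdirectories. Shown on a scratch tree.
mkdir -p demo/src/nested
touch demo/src/a.cu demo/src/nested/b.cu
echo "glob: $(echo demo/src/*.cu)"   # only the top-level file
echo "find: $(find demo/src -name '*.cu' | sort | tr '\n' ' ')"
rm -rf demo
```

So one possible workaround might be passing $(find src -name '*.cu' -o -name '*.cpp') to dpct instead of the globs, though I have not verified that.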

Thanks,

Farshad.

 

Accepted Solution

Hello again,

I did some tweaking based on your suggestion and I think I have a solution. I can get dpct to traverse the hierarchy and find the files mentioned in compile_commands.json using the command below:

dpct -p compile_commands.json --in-root=src --out-root=dpct_out --process-all

 

The difference is that I now specify only the top-level directory of my hierarchy, and I include only the files I am interested in within compile_commands.json. Then the tool does its magic!

 

You can go ahead and close this thread. Thanks for your help.

Farshad.


Moderator

Hi,


That's right: with an --in-root directory, the --process-all flag migrates all the files recursively, as long as they are present in the json file. I thought you also wanted to migrate other files that are not part of the json.


Thanks for the confirmation.

Intel will no longer monitor this thread. However, this thread will remain open for community participation.

