Ayam
Beginner

Intel VTune is not loading data for General Exploration


Hello,

I am running an application under Intel VTune (vtune_amplifier_xe_2013) on a Linux system (Ubuntu).
However, VTune is not loading data into the target file.

amplxe: Collection stopped.
amplxe: Using result path `/usr/local/hadoop/r025ge'
amplxe: Executing actions  0 %                                                 
amplxe: Warning: The result contains a lot of raw data. Finalization may take a long time to complete.
amplxe: Executing actions 14 % Loading data files                              
amplxe: Warning: Cannot load data file `/usr/local/hadoop/r025ge/data.0/tbs1546692947.tb6' (tbrw call "TBRW_dobind(tbrwFile->getHandle(), streamIndex)" failed: invalid string (97)).
amplxe: Executing actions 50 % Generating a report 

The result file r025ge is empty.

Thanks


13 Replies
David_A_Intel1
Employee

Hi Maria:

The best thing is to submit an issue at Intel® Premier Support and provide the zipped-up results directory. We would need to examine the files to try to determine what is wrong. Also, please execute the command 'amplxe-feedback -create-bug-report <report>.zip', where <report> is the filename, including a writable directory path, and attach it to the issue.
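As a concrete illustration of that step (the report path here is just an example, not a required location):

```shell
# Package VTune's diagnostic data into a zip to attach to the support issue;
# the report name and /tmp destination are illustrative.
amplxe-feedback -create-bug-report /tmp/vtune-bug-report.zip
```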

Ayam
Beginner

@MrAnderson I cannot access the Intel® Premier Support link: "Cannot connect to the real premier.intel.com".

David_A_Intel1
Employee

Hmmm, try http instead of https?

Ayam
Beginner

@MrAnderson still the same SSL error.

Bernard
Black Belt

Maria M. wrote:

@MrAnderson still the same SSL error.

What is the description of the error?

Have you tried different browsers?

David_A_Intel1
Employee

I apologize for the delay.  I got sidetracked. Thanks to ilyapolak for bringing this thread back into my sights. ;)

Actually, I see that Maria has a "student" license for the C++ Studio XE product.  Student licenses do not receive Premier support, only forum support.  As such, support is provided as the support team has available bandwidth.

Let me review this thread, again, and see if I can offer any suggestions.

David_A_Intel1
Employee

Okay, so, Maria, we need more details about what you are trying to do. Are you using the GUI or the command line? I see you are profiling "hadoop". Did you have VTune Amplifier XE launch an app? How did you specify the app?

How long was your profiling run? Did you set the estimated duration in the project properties? There is a warning that there is a lot of data. You should try profiling for less time and see if that works.

Bernard
Black Belt

>>>Thanks to ilyapolak for bringing this thread back into my sights. ;)>>>

You are welcome.

 

Ayam
Beginner

@MrAnderson, thanks.

Are you using the GUI or command line?
I am using the command line:
/opt/intel/vtune_amplifier_xe_2013/bin64/amplxe-cl -collect snb-bandwidth -app-working-dir /usr/local/hadoop -- /usr/local/hadoop/projwc

Did you have VTune Amplifier XE launch an app?  How did you specify the app?
I have written a simple C program that calls the Java application:

#include <stdio.h>
#include <stdlib.h>

int main(int argc, char* argv[])
{
    FILE *child = popen("bin/hadoop jar hadoop-*-examples.jar wordcount rand rand-wc10Gb", "r");
    if (child == NULL)
        return 1;
    pclose(child);  /* streams opened with popen() must be closed with pclose(), not fclose() */
    return 0;
}

How long was your profiling run?
amplxe reports Elapsed Time: 3143.350 (presumably seconds, i.e. roughly 52 minutes).
You should try profiling for less time and see if that works.

I am using the wordcount example of Hadoop. If I generate 1GB of random data on HDFS, then the profiler works fine. However, I get "amplxe: Warning: Cannot load data file `/usr/local/hadoop/r011bw/data.0/tbs646344590.tb6' (tbrw call "TBRW_dobind(tbrwFile->getHandle(), streamIndex)" failed: invalid string (97))." when I am working with 10GB of data.

Bernard
Black Belt

>>> when I am working with 10GB of data.>>>

Maybe you are running out of the process heap by allocating 10GB of data?

Try running tests with smaller data sets.
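One quick way to sanity-check the memory theory before re-running (these are standard Linux commands, nothing VTune-specific):

```shell
# Check total system memory and the per-process virtual memory limit
# before starting another large collection run.
grep MemTotal /proc/meminfo
ulimit -v
```

If `ulimit -v` prints anything other than "unlimited", a 10GB allocation could plausibly hit that cap.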


David_A_Intel1
Employee

Hi Maria:

You don't need a wrapper around the hadoop binary.  You should be able to specify /usr/local/hadoop/bin/hadoop as the application and then specify "jar hadoop-*-examples.jar wordcount rand rand-wc10Gb" as arguments (unless the '*' is being expanded by the command shell).  Is rand-wc10Gb a data file for the app?  Is there a smaller one? :\

Yes, that is a lot of data and time.  You should ensure that the "Duration time estimate" is set to "Over 15 minutes" (in the command line that is the "-target-duration-type" option set to "long", e.g., "-target-duration-type=long").
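Putting both suggestions together, the collection command might look something like the sketch below; the analysis type, paths, and jar name are carried over from Maria's earlier command and may need adjusting for her setup, and the jar pattern is quoted only in case the shell would otherwise expand the '*'.

```shell
# Sketch: launch hadoop directly (no C wrapper) and mark the run as long.
/opt/intel/vtune_amplifier_xe_2013/bin64/amplxe-cl \
    -collect snb-bandwidth \
    -target-duration-type=long \
    -app-working-dir /usr/local/hadoop \
    -- /usr/local/hadoop/bin/hadoop jar 'hadoop-*-examples.jar' wordcount rand rand-wc10Gb
```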

Ayam
Beginner

@iliyapolak ... it worked. Thanks a lot!

@MrAnderson thanks for your input. I will experiment with your explanation.

Bernard
Black Belt

@Maria

You are welcome :)
