Analyzers
Talk to fellow users of Intel Analyzer tools (Intel VTune™ Profiler, Intel Advisor)

Intel VTune is not loading data for General Exploration

Ayam
Beginner

Hello,

I am running an application under Intel VTune (vtune_amplifier_xe_2013) on a Linux system (Ubuntu).
However, VTune is not loading the collected data into the result file.

amplxe: Collection stopped.
amplxe: Using result path `/usr/local/hadoop/r025ge'
amplxe: Executing actions  0 %                                                 
amplxe: Warning: The result contains a lot of raw data. Finalization may take a long time to complete.
amplxe: Executing actions 14 % Loading data files                              
amplxe: Warning: Cannot load data file `/usr/local/hadoop/r025ge/data.0/tbs1546692947.tb6' (tbrw call "TBRW_dobind(tbrwFile->getHandle(), streamIndex)" failed: invalid string (97)).
amplxe: Executing actions 50 % Generating a report 

The result file r025ge is empty.

Thanks

 

David_A_Intel1
Employee

Hi Maria:

The best thing is to submit an issue at Intel® Premier Support and provide the zipped-up results directory. We would need to examine the files to determine what is wrong. Also, please execute the command 'amplxe-feedback -create-bug-report <report>.zip', where <report> is the filename, including a writable directory path, and attach the resulting archive to the issue.
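For example (the output path here is only illustrative; any writable location will do):

amplxe-feedback -create-bug-report /tmp/vtune-bug-report.zip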

Ayam
Beginner

@MrAnderson I cannot access the Intel® Premier Support link: "Cannot connect to the real premier.intel.com".

David_A_Intel1
Employee

Hmmm, try http instead of https?

Ayam
Beginner

@MrAnderson still the same SSL error.

Bernard
Valued Contributor I

Maria M. wrote:

@MrAnderson still the same SSL error.

What is the description of the error?

Have you tried different browsers?

David_A_Intel1
Employee

I apologize for the delay.  I got sidetracked. Thanks to ilyapolak for bringing this thread back into my sights. ;)

Actually, I see that Maria has a "student" license for the C++ Studio XE product. Student licenses do not receive Premier support, only forum support. As such, support is provided as the support team's bandwidth allows.

Let me review this thread, again, and see if I can offer any suggestions.

David_A_Intel1
Employee

Okay, Maria, we need more details about what you are trying to do. Are you using the GUI or the command line? I see you are profiling "hadoop". Did you have VTune Amplifier XE launch an app? How did you specify the app?

How long was your profiling run? Did you set the estimated duration in the project properties? The log warns that the result contains a lot of raw data. You should try profiling for less time and see if that works.

 

Bernard
Valued Contributor I

>>>Thanks to ilyapolak for bringing this thread back into my sights. ;)>>>

You are welcome.

 

Ayam
Beginner

@MrAnderson, thanks.

Are you using the GUI or command line?
I am using the command line:
/opt/intel/vtune_amplifier_xe_2013/bin64/amplxe-cl -collect snb-bandwidth -app-working-dir /usr/local/hadoop -- /usr/local/hadoop/projwc

Did you have VTune Amplifier XE launch an app?  How did you specify the app?
I have written a simple C program that calls the Java application:

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    /* Launch the Hadoop wordcount example and wait for it to finish. */
    FILE *child = popen("bin/hadoop jar hadoop-*-examples.jar wordcount rand rand-wc10Gb", "r");
    if (child == NULL)
        return EXIT_FAILURE;
    pclose(child);  /* streams opened with popen() are closed with pclose(), not fclose() */
    return 0;
}

How long was your profiling run?
amplxe reports Elapsed Time: 3143.350 seconds (about 52 minutes).

You should try profiling for less time and see if that works.
I am using the wordcount example of Hadoop. If I generate 1GB of random data on HDFS, the profiler works fine. However, I get "amplxe: Warning: Cannot load data file `/usr/local/hadoop/r011bw/data.0/tbs646344590.tb6' (tbrw call "TBRW_dobind(tbrwFile->getHandle(), streamIndex)" failed: invalid string (97))." when I am working with 10GB of data.

Bernard
Valued Contributor I

>>> when I am working with 10GB of data.>>>

Maybe you are running out of process heap memory by allocating 10GB of data?

Try running tests with smaller data sets.
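A quick, non-VTune way to check that hypothesis is to watch free memory on the box while the collection runs, for example with standard Linux tools (the 5-second interval is arbitrary):

watch -n 5 free -m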

David_A_Intel1
Employee

Hi Maria:

You don't need a wrapper around the hadoop binary.  You should be able to specify /usr/local/hadoop/bin/hadoop as the application and then specify "jar hadoop-*-examples.jar wordcount rand rand-wc10Gb" as arguments (unless the '*' is being expanded by the command shell).  Is rand-wc10Gb a data file for the app?  Is there a smaller one? :\

Yes, that is a lot of data and time.  You should ensure that the "Duration time estimate" is set to "Over 15 minutes" (in the command line that is the "-target-duration-type" option set to "long", e.g., "-target-duration-type=long").
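Putting those two suggestions together, the collection command would look roughly like this (the amplxe-cl path, analysis type, and jar name are taken from the earlier posts; note that the '*' in the jar name is expanded by your shell, so you may want to spell out the exact jar filename instead):

/opt/intel/vtune_amplifier_xe_2013/bin64/amplxe-cl -collect snb-bandwidth \
    -target-duration-type=long \
    -app-working-dir /usr/local/hadoop \
    -- /usr/local/hadoop/bin/hadoop jar hadoop-*-examples.jar wordcount rand rand-wc10Gb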

Ayam
Beginner

@iliyapolak ... it worked. Thanks a lot.

@MrAnderson thanks for your input. I will experiment with your suggestions.

 

Bernard
Valued Contributor I

@Maria

You are welcome:)
