Hello,
I am running an application under Intel VTune Amplifier XE (vtune_amplifier_xe_2013) on a Linux system (Ubuntu). However, VTune is not loading the collected data into the result file.
amplxe: Collection stopped.
amplxe: Using result path `/usr/local/hadoop/r025ge'
amplxe: Executing actions 0 %
amplxe: Warning: The result contains a lot of raw data. Finalization may take a long time to complete.
amplxe: Executing actions 14 % Loading data files
amplxe: Warning: Cannot load data file `/usr/local/hadoop/r025ge/data.0/tbs1546692947.tb6' (tbrw call "TBRW_dobind(tbrwFile->getHandle(), streamIndex)" failed: invalid string (97)).
amplxe: Executing actions 50 % Generating a report
The result file r025ge is empty.
Thanks
Hi Maria:
The best thing is to submit an issue at Intel® Premier Support and provide the zipped-up results directory. We would need to examine the files to determine what is wrong. Also, please execute the command 'amplxe-feedback -create-bug-report <report>.zip', where <report> is the filename, including a writable directory path, and attach it to the issue.
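For example (the path and filename below are only placeholders; any writable location works):
amplxe-feedback -create-bug-report /tmp/vtune-bug-report.zip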
@MrAnderson I cannot access the Intel® Premier Support link: "Cannot connect to the real premier.intel.com".
Hmmm, try http instead of https?
Maria M. wrote:
@MrAnderson still the same SSL error.
What is the description of the error?
Have you tried different browsers?
I apologize for the delay. I got sidetracked. Thanks to ilyapolak for bringing this thread back into my sights. ;)
Actually, I see that Maria has a "student" license for the C++ Studio XE product. Student licenses do not receive Premier support, only forum support. As such, support is provided as the support team has available bandwidth.
Let me review this thread, again, and see if I can offer any suggestions.
Okay, Maria, we need more details about what you are trying to do. Are you using the GUI or the command line? I see you are profiling "hadoop". Did you have VTune Amplifier XE launch an app? How did you specify the app?
How long was your profiling run? Did you set the estimated duration in the project properties? There is a warning that there is a lot of data. You should try profiling for less time and see if that works.
>>>Thanks to ilyapolak for bringing this thread back into my sights. ;)>>>
You are welcome.
@MrAnderson, thanks.
Are you using the GUI or command line?
I am using the command line: "/opt/intel/vtune_amplifier_xe_2013/bin64/amplxe-cl -collect snb-bandwidth -app-working-dir /usr/local/hadoop -- /usr/local/hadoop/projwc"
Did you have VTune Amplifier XE launch an app? How did you specify the app?
I have written a simple C program that calls the Java application:
" #include <iostream> #include <stdio.h> #include <stdlib.h> #include <string.h>
int main(int argc, char* argv[])
{ float value;
FILE *child = popen("bin/hadoop jar hadoop-*-examples.jar wordcount rand rand-wc10Gb", "r");
fclose(child);
return 0;} "
How long was your profiling run?
amplxe is giving me Elapsed Time: 3143.350
You should try profiling for less time and see if that works.
I am using the wordcount example of Hadoop. If I generate 1 GB of random data on HDFS, the profiler works fine. However, I get "amplxe: Warning: Cannot load data file `/usr/local/hadoop/r011bw/data.0/tbs646344590.tb6' (tbrw call "TBRW_dobind(tbrwFile->getHandle(), streamIndex)" failed: invalid string (97))." when I work with 10 GB of data.
>>> when I am working with 10GB of data.>>>
Maybe you are running out of process heap by allocating 10 GB of data?
Try running the tests with smaller data sets.
Hi Maria:
You don't need a wrapper around the hadoop binary. You should be able to specify /usr/local/hadoop/bin/hadoop as the application and then specify "jar hadoop-*-examples.jar wordcount rand rand-wc10Gb" as arguments (unless the '*' is being expanded by the command shell). Is rand-wc10Gb a data file for the app? Is there a smaller one? :\
Yes, that is a lot of data and time. You should ensure that the "Duration time estimate" is set to "Over 15 minutes" (in the command line that is the "-target-duration-type" option set to "long", e.g., "-target-duration-type=long").
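Putting those two suggestions together, the command line might look something like the sketch below (it simply reuses the install path, analysis type, working directory, and job arguments from Maria's earlier post and adds the duration hint; the '*' may need quoting or expanding depending on how the shell handles it):
/opt/intel/vtune_amplifier_xe_2013/bin64/amplxe-cl -collect snb-bandwidth \
    -target-duration-type=long -app-working-dir /usr/local/hadoop \
    -- /usr/local/hadoop/bin/hadoop jar hadoop-*-examples.jar wordcount rand rand-wc10Gb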
@iliyapolak ... it worked. Thanks a lot!
@MrAnderson thanks for your input. I will experiment with your explanation.
@Maria
You are welcome:)