I've been doing some instrumentation in my application that logs an immediate counter value at a fairly high frequency. I've noticed something about how the data is rendered on the resulting timeline that I'm trying to understand.
For the purpose of discussion, assume a CSV with data points such as the following:
1126802252083641,3395040,16648,16648
1126802258614980,3281520,16648,16663
1126802346357386,3500640,16648,16648
1126802352506840,3322440,16648,16648
1126802359003661,3292080,16648,16663
1126802446447006,3508560,16648,16648
1126802459008063,3302640,16648,16663
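To make the suspected behavior concrete, here is a small Python sketch using the sample data above. It assumes (and this is only a guess) that the timestamps are nanoseconds and that at coarse zoom the viewer groups samples into fixed-width display buckets; the 100 ms bucket width below is hypothetical, chosen only so that more than one sample lands in each bucket. If the viewer sums the samples in a bucket rather than averaging them, a ~3.3M counter renders as 6.6M or more:

```python
# (timestamp, value) pairs taken from the CSV above; the last two CSV
# columns are omitted as they are not relevant to the rendering question.
samples = [
    (1126802252083641, 3395040),
    (1126802258614980, 3281520),
    (1126802346357386, 3500640),
    (1126802352506840, 3322440),
    (1126802359003661, 3292080),
    (1126802446447006, 3508560),
    (1126802459008063, 3302640),
]

BUCKET_NS = 100_000_000  # hypothetical 100 ms display bucket

# Group samples by display bucket.
buckets = {}
for ts, value in samples:
    buckets.setdefault(ts // BUCKET_NS, []).append(value)

for key in sorted(buckets):
    vals = buckets[key]
    print(f"bucket {key}: sum={sum(vals)} mean={sum(vals) // len(vals)}")
```

Running this, every per-bucket sum comes out between roughly 6.6M and 10.1M even though no raw sample exceeds ~3.5M, while the per-bucket mean stays around 3.3M, which matches what the timeline shows at the two zoom levels.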
At the default zoom level, these values actually render on the timeline at nearly 2x the values in the CSV (i.e., the graph shows 6.6 million instead of 3.3 million). If I zoom in further, I start to see more appropriate values (i.e., 3.3 million).
It seems like VTune is somehow accumulating the immediate values. Is there some minimum reporting interval, such that I'm not supposed to generate CSV lines that are too close together in time? For example, am I only supposed to report one value per millisecond, and if I report more often than that, will it add the values together?
The data as currently shown is highly misleading, and it would be great to understand how to avoid this. If the answer is something along the lines of "don't log more than once per X ns/us/ms," then it would be good to know what that threshold is.
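If such a threshold does exist, the fix on my side would presumably be to throttle the instrumentation. A minimal sketch of what I have in mind, assuming a hypothetical 1 ms minimum interval (the real threshold is exactly what I'm asking about):

```python
class ThrottledLogger:
    """Emit at most one CSV line per minimum interval, dropping the rest."""

    def __init__(self, min_interval_ns):
        self.min_interval_ns = min_interval_ns
        self.last_ts = None

    def log(self, ts, value):
        """Return a CSV line if enough time has passed, else None."""
        if self.last_ts is None or ts - self.last_ts >= self.min_interval_ns:
            self.last_ts = ts
            return f"{ts},{value}"
        return None


logger = ThrottledLogger(1_000_000)  # 1 ms -- placeholder threshold
print(logger.log(0, 3395040))        # emitted
print(logger.log(500_000, 3281520))  # dropped: within 1 ms of the last line
print(logger.log(1_200_000, 3500640))  # emitted
```

A variant that coalesces (e.g., averages) the dropped samples instead of discarding them might be more faithful, but that only makes sense once the actual aggregation behavior is known.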
Thanks in advance,
Devin Heitmueller