I've been doing some instrumentation in my application which logs an immediate value at a fairly high rate. I've noticed something about how the data is represented in the resulting timeline that I'm trying to understand.
For the purpose of discussion, assume a CSV with data points such as the following:
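(The original sample data didn't make it into this post; for illustration only, imagine something shaped like the following, with a sample roughly every 0.5 ms and a value near 3.3 million. The column names here are placeholders, not necessarily the exact schema VTune's import expects.)

```
timestamp_ns,value
1000000,3300000
1500000,3300000
2000000,3300000
2500000,3300000
```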
At the default zoom level, these values actually render on the timeline at nearly 2x the values in the CSV (i.e. the graph shows 6.6 million instead of 3.3 million). If I zoom in further, I start to see more appropriate values (i.e. 3.3 million).
It seems like VTune is somehow accumulating the immediate values. Is there some minimum reporting interval, such that I'm not supposed to generate CSV lines that are too close together in time? For example, am I only supposed to report one value per millisecond, and if I report more often than that, will it add the values together?
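To make the suspicion concrete, here's a toy sketch (plain Python, not VTune's actual implementation, and the 1 ms bucket width is just an assumption) of how summing all samples that fall into the same display bucket, instead of averaging them, would roughly double the plotted value when two samples land in one bucket:

```python
from collections import defaultdict

# Hypothetical samples: an immediate value of ~3.3 million logged
# every 0.5 ms (timestamps in ms). Placeholder data, not my real log.
samples = [(0.0, 3_300_000), (0.5, 3_300_000),
           (1.0, 3_300_000), (1.5, 3_300_000)]

def bucketize(samples, bucket_ms, reduce_fn):
    """Group samples into fixed-width time buckets and reduce each bucket."""
    buckets = defaultdict(list)
    for t, v in samples:
        buckets[int(t // bucket_ms)].append(v)
    return {b: reduce_fn(vs) for b, vs in sorted(buckets.items())}

# If the timeline SUMS the samples in each 1 ms bucket, every bucket
# shows 6.6 million -- the ~2x inflation I'm seeing:
print(bucketize(samples, 1.0, sum))
# If it AVERAGED instead, the expected 3.3 million would appear:
print(bucketize(samples, 1.0, lambda vs: sum(vs) / len(vs)))
```

Zooming in until each bucket holds at most one sample would also explain why the correct values reappear at higher zoom levels.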
The data as currently shown is highly misleading, and it would be great to better understand how to avoid this. If the answer is something along the lines of "don't log more than once per X ns/us/ms", then it would be good to know what that threshold is.