Forgive me; I understand this should belong on the MKL Forum, but I am really only interested in comments from this crowd.
I was testing some FFT code written in Fortran, not MKL. If an analog peak in the continuous data lies midway between the discrete frequency bins of an FFT, you obtain two peaks, one on either side of the "true" peak, each with a lower amplitude.
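This bin-splitting (often called scalloping loss) is easy to demonstrate with a short, self-contained sketch; the direct DFT below is purely illustrative and stands in for the Fortran/MKL code discussed here. A tone placed exactly on a bin gives one full-height peak, while a tone midway between two bins splits into two lower peaks:

```python
import cmath
import math

def dft_mag(x):
    # Direct O(N^2) DFT magnitude, normalised by N; fine for a small demo.
    N = len(x)
    return [abs(sum(x[n] * cmath.exp(-2j * math.pi * k * n / N)
                    for n in range(N))) / N
            for k in range(N)]

N = 64
# Tone exactly on bin 8: all the energy lands in one bin (0.5 for a unit sine).
on_bin = [math.sin(2 * math.pi * 8 * n / N) for n in range(N)]
# Tone at "bin 8.5", midway between bins 8 and 9: the energy splits.
off_bin = [math.sin(2 * math.pi * 8.5 * n / N) for n in range(N)]

mag_on = dft_mag(on_bin)
mag_off = dft_mag(off_bin)
# mag_on[8] is about 0.5; mag_off[8] and mag_off[9] are two nearly equal,
# lower peaks sitting on either side of the true frequency.
```

With no window applied, each of the split peaks comes out at roughly 2/pi of the on-bin height, which is the worst-case loss for a rectangular window.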
It was an interesting problem to play with recently. I was reminded of it as I prepare some notes for people on the use of the FFT. You need to explain all of the issues involved in understanding the FFT.
This morning I was looking at some FFT notes that listed all of the ways you have to adjust the MATLAB output to make the FFT "correct".
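The usual corrections are of this flavour: divide by N, then double every bin of the one-sided spectrum except DC and Nyquist. A hedged pure-Python sketch, with a direct DFT standing in for the library FFT call:

```python
import cmath
import math

def dft(x):
    # Direct O(N^2) DFT; a stand-in for the library FFT call.
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * math.pi * k * n / N) for n in range(N))
            for k in range(N)]

N = 128
A = 3.0                                   # true amplitude of the test tone
x = [A * math.cos(2 * math.pi * 5 * n / N) for n in range(N)]
X = dft(x)

raw = abs(X[5])                           # un-scaled output: N*A/2, not A
one_sided = [abs(X[k]) / N for k in range(N // 2 + 1)]
for k in range(1, N // 2):
    one_sided[k] *= 2.0                   # fold negative frequencies back in
# one_sided[5] now reads the true amplitude A
```

Without the two adjustments the bin reads N*A/2, which is the "wrong by a factor of N/2" surprise the notes were warning about.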
Anyway enough belly aching.
Not sure, but I actually created some samples in EXCEL and then sampled the data at different rates so that I did not pick up the exact peak. On the output of the FFT you would see two peaks close together; of course, if you run the FFT at finer steps, you might recover the true result.
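Zero-padding is one way to run the FFT "at finer steps": padding the record interpolates the spectrum, and the interpolated peak lands back on the true frequency. A small illustrative sketch, again with a direct DFT rather than a library call:

```python
import cmath
import math

def dft_mag(x):
    # Direct O(N^2) DFT magnitude; slow but dependency-free for a demo.
    N = len(x)
    return [abs(sum(x[n] * cmath.exp(-2j * math.pi * k * n / N)
                    for n in range(N)))
            for k in range(N)]

N, pad = 64, 4
true_freq = 8.5 / N                        # cycles per sample, between bins 8 and 9
x = [math.sin(2 * math.pi * 8.5 * n / N) for n in range(N)]

coarse = dft_mag(x)                        # split peak at bins 8 and 9
fine = dft_mag(x + [0.0] * (pad - 1) * N)  # zero-pad to 4x the length

k_coarse = max(range(N // 2), key=lambda k: coarse[k])
k_fine = max(range(pad * N // 2), key=lambda k: fine[k])
f_coarse = k_coarse / N                    # 8/64 or 9/64, both off the mark
f_fine = k_fine / (pad * N)                # 34/256, i.e. the true 8.5/64
```

Zero-padding only interpolates the existing spectrum; it does not add resolution, but it does put the displayed peak back where it belongs.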
The problem is balancing the length of the input file, the data input rate, and the time people are prepared to wait in the snow to collect data. More than 4 minutes and people start to call you nasty names, like "You only program in C and such nasty things." I like 8 minutes, but that is a long wait on a windy bridge when you are hungry and have ten spots still to do.
So I use 16384 time steps at 2000 steps per second, and usually graph 51 time sets for about 7 minutes; you can pick up a lot in 7 minutes. You get something like this:
This covers 62.5 Hz against 51 pseudo time steps. The size of the graphs is constrained by programming limits, not by a desire to make them bigger.
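The numbers hang together, assuming the 51 sets are taken back to back: 16384 samples at 2000 per second is 8.192 s per block, 51 blocks is just under 7 minutes, and plotting the first 512 of the usable bins spans exactly 62.5 Hz. As a quick check:

```python
n_steps = 16384                     # samples per FFT block
rate = 2000.0                       # samples per second
n_sets = 51                         # blocks per session

block_seconds = n_steps / rate                  # 8.192 s per block
total_minutes = n_sets * block_seconds / 60.0   # just under 7 minutes
bin_width_hz = rate / n_steps                   # ~0.122 Hz resolution
bins_plotted = 62.5 / bin_width_hz              # 512 bins span 62.5 Hz
```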
I am not sure what you mean by the 5 axis.
The chart is produced using the contouring package CONREC, first published in BYTE in the mid-80s and developed by an Australian programmer. I have translated it into three languages so I can use it with a lot of code; as long as the data is on a regular grid, it is super fast and easy.
The two versions can contour either the amplitude or the count; count is often better for finding a peak. It is also simple to contour the count raised to a power, which amplifies some elements. I like powers in the range of 0.5 to 3.
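Raising the counts to a power is a one-liner: a power below 1 flattens the field so weak bands stay visible, while a power above 1 sharpens the dominant peak. A small sketch with hypothetical counts:

```python
def power_scale(counts, p):
    # Contour the counts raised to a power p instead of the raw counts.
    return [c ** p for c in counts]

counts = [1, 2, 16, 4, 1]           # hypothetical counts across bins
flat = power_scale(counts, 0.5)     # peak-to-neighbour ratio drops from 4 to 2
sharp = power_scale(counts, 3)      # peak-to-neighbour ratio jumps from 4 to 64
```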
This was the first non-EXCEL graph and it gave us access to a world of visual data quickly.
Now, we seek changes in the bands you can see on the graph. We can measure changes in frequency with a slope of 1.0E-8 with ease, so now we can predict failure ahead of time.
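Tracking that drift reduces to fitting a straight line to peak frequency against time. The least-squares slope below, with hypothetical numbers (a 25 Hz band drifting at the 1.0E-8 figure, sampled once per 8.192 s block), shows the idea:

```python
def slope(ts, fs):
    # Ordinary least-squares slope of frequency against time.
    n = len(ts)
    tbar = sum(ts) / n
    fbar = sum(fs) / n
    num = sum((t - tbar) * (f - fbar) for t, f in zip(ts, fs))
    den = sum((t - tbar) ** 2 for t in ts)
    return num / den

# Hypothetical: a 25 Hz peak drifting at 1.0e-8 Hz per second.
ts = [i * 8.192 for i in range(51)]     # block start times, seconds
fs = [25.0 + 1.0e-8 * t for t in ts]    # tracked peak frequency per block
drift = slope(ts, fs)                   # recovers the 1.0e-8 slope
```

In practice the fitted frequencies are noisy, so the fit runs over many sessions rather than one 7-minute record.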
The critical issue is peak finding, as the data is a mixture of Gaussian, non-Gaussian, and base thermal noise all at once.
Lot of fun.
It represents all the ways you can map an FFT onto a time-based data presentation.
Milli-g is easy to use, as our work is generally in the range of 0.1 to about 250 mG, a nice set of numbers; handling 0.001 g is not as nice as 1 mG. The lowest anyone can get is about 0.1 mG for high-frequency thermal noise. We can measure with a delta step of 10 to 20 microG; there is some debate about this, as the theoretical limit is 20 but the practical seems to be on the side of 10.
So we know each point to about 100 ± 20 microG.
But if someone says they have results to 20 microG, they are lying; you cannot get through the thermal floor, any more than you can get below the background radiation.