Well, if we're going to be analyzing large sets of random data, we can't do it in MATLAB, because MATLAB is SUPER slow at this.
So I wrote an autocorrelation routine in Fortran. Fortran is terrible at a lot of modern programming conveniences, but it's fast. For example, it's a major pain to set a parameter from a command-line argument, or to get the length of a file. I'm no Fortran master, so maybe I'm just doing it wrong, but believe me, it seems like too much trouble.
So for now I set the autocorrelation dimension and the data-set size manually before compiling. But anyway, here it is:
PROGRAM bob
  ! USE OMP_lib
  IMPLICIT NONE
  INTEGER, PARAMETER :: P = 140067, M = 70000
  REAL(8) :: sum, u, VAR, P2
  REAL(8) :: out(M), data(P)
  INTEGER :: k, j

  P2 = P

  ! read the data set, one number per line
  open (unit = 2, file = "data.txt")
  do k = 1, P
     read (2,*) data(k)
  end do
  close (2)

  ! sample mean
  sum = 0
  do k = 1, P
     sum = sum + data(k)
  end do
  u = sum/P

  ! sample variance (P-1 normalization)
  sum = 0
  do k = 1, P
     sum = sum + (data(k) - u)*(data(k) - u)
  end do
  VAR = sum/(P - 1)

  write (*,*) "STANDARD ERROR:"
  write (*,*) 1.96/SQRT(P2)

  ! autocorrelation at lags 1..M
  do k = 1, M
     out(k) = 0
     do j = k + 1, P
        out(k) = out(k) + (data(j) - u)*(data(j-k) - u)
     end do
     out(k) = out(k)/(VAR*P)
  end do

  open (unit = 7, file = "CORR.txt")
  do k = 1, M
     write (7,*) out(k)
  end do
  close (7)
END PROGRAM
The parameter P is the number of lines in the file, which I find with wc -l. M is the autocorrelation dimension; it must be less than half the length of the data set. The program writes the autocorrelation out to a file so you can load it in MATLAB with data = load(' '). As of now, the input file is a list of numbers separated by newlines.
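If you want to sanity-check CORR.txt without firing up MATLAB, here's the same estimator sketched in Python/NumPy (my own cross-check, not part of the Fortran program; the synthetic white-noise series is just a stand-in, and for real data you'd swap in something like np.loadtxt("data.txt")):

```python
import numpy as np

def autocorr(x, m):
    # Same estimator as the Fortran code: for lag k = 1..M,
    # out(k) = sum_{j=k+1}^{P} (x_j - u)(x_{j-k} - u) / (VAR * P),
    # with VAR the sample variance (P-1 normalization).
    x = np.asarray(x, dtype=np.float64)
    p = x.size
    u = x.mean()
    var = x.var(ddof=1)              # sum((x-u)^2) / (P-1), as in the Fortran
    d = x - u
    out = np.empty(m)
    for k in range(1, m + 1):
        out[k - 1] = (d[k:] @ d[:-k]) / (var * p)   # the j = k+1..P loop, vectorized
    return out

# quick demo on synthetic white noise; the lags should mostly sit
# inside the 1.96/sqrt(P) band the Fortran prints
rng = np.random.default_rng(1)
x = rng.standard_normal(2000)
print(autocorr(x, 5))
```

This is a check, not a replacement: the pure-Python loop over lags is exactly the kind of thing that gets slow at M = 70000, which is the point of the Fortran version.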
It rips through the ~140,000-element set in a few seconds.
Future directions: make it read a binary file, figure out the data-set length automatically so I don't have to set the damn thing manually, and take the correlation dimension from the command line.
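On the binary-file direction, a quick Python sketch of the round trip (the file names and the raw little-endian float64 layout with no header are my assumptions about what the Fortran reader would consume, not what the program currently does):

```python
import numpy as np

# stand-in for the real series; normally x = np.loadtxt("data.txt")
x = np.arange(8, dtype=np.float64)
np.savetxt("data.txt", x)                      # the current newline-separated format

# convert text -> raw little-endian float64 binary
np.loadtxt("data.txt").astype("<f8").tofile("data.bin")

# reading it back is one call, and the element count comes for free
y = np.fromfile("data.bin", dtype="<f8")
print(y.size)                                  # no wc -l needed
```

A nice side effect: since every element is a fixed 8 bytes, the record count is just the file size divided by 8, so the "length of the file" problem disappears too.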