Intel® Fortran Compiler

Slow debug when using structures

eos_pengwern
Beginner
Hello,

In an effort to make my code more re-entrant and easier to parallelize, I have rewritten it to remove 'module variables' and instead place them in structures, a pointer to which can be passed to each module procedure when it is called. In effect, I've created a bunch of classes, in which the former module variables constitute the properties and the module procedures constitute the methods, in good Fortran 2003 fashion (or at least, I'd like to think so).
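For readers following the thread, here is a minimal sketch of the pattern being described, with purely hypothetical names (the real types and procedures aren't shown in this thread): the former module variables are gathered into a derived type, and each module procedure receives that type as an argument instead of touching module-level state.

[fortran]module solver_mod
    implicit none

    ! Hypothetical state type: former module variables become components.
    type :: solver_state
        real(8), allocatable :: x(:)
        real(8), allocatable :: y(:)
        real(8), allocatable :: A(:)
    end type solver_state

contains

    ! Former module procedure, now operating only on the state it is handed.
    subroutine step(state)
        type(solver_state), intent(inout) :: state
        state%A = state%x + state%y
    end subroutine step

end module solver_mod[/fortran]

Passing the state explicitly (whether as a plain dummy argument, as above, or via a pointer) is what makes the procedures re-entrant: nothing is shared behind the caller's back.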

However, the 'debug' version of the resulting code runs a full ten times slower than the old code, while the 'release' version is about the same (if anything a little faster, but there's not much in it). I've checked that all the compiler options are the same for each version. I can understand the debugger having a bit more work to do keeping track of structures and pointers, much of which the compiler probably optimizes away when preparing the release version, but 10X seems excessive and slows down the development process quite substantially.

Is this usual, and is there a way of mitigating it? By the way, I'm running IVF 11.0.074 on VS2005.

Stephen.


1 Solution
Steven_L_Intel1
Employee
You can debug a DLL by specifying the path to the EXE in the Debugging property page, but if the slowness exists without running under the debugger, then it isn't debug-related.

6 Replies
Steven_L_Intel1
Employee
I'm not familiar with this as a problem. Do you have any breakpoints set for the project? How large are these structures (how many fields)?
eos_pengwern
Beginner

I've had a few breakpoints set, but I haven't noticed any particular hotspot. The structures are quite large, each one containing a few dozen allocatable arrays with a few hundred thousand elements apiece - I'd guess each structure is ~5-10 MB in size. Certainly, if the debugger has to load whole structures where previously it only had to load individual arrays, it has a great deal more work to do.

Stephen.
Steven_L_Intel1
Employee
Do you get slowness when you do Start Without Debugging?

The debugger is not going to look at the actual data, just the symbol table information.
eos_pengwern
Beginner

Well, the Fortran code is in a DLL called from MATLAB, so I can't 'Start Without Debugging' from the menu, because if I want to view the variable values etc. I have to attach to the parent process before the DLL is called. However, I do see the slowness even if I don't attach to the process, or for that matter even if I don't have Visual Studio open when I run the code, which I guess is the equivalent.
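For context, here is a minimal sketch of the kind of exported entry point such a MATLAB-callable Fortran DLL might expose - the routine name and interface are purely illustrative assumptions, not taken from the original code:

[fortran]! Hypothetical DLL entry point; names and interface are assumptions.
subroutine compute(n, x, y) bind(C, name='compute')
    use iso_c_binding, only: c_int, c_double
    implicit none
    !DEC$ ATTRIBUTES DLLEXPORT :: compute
    integer(c_int), value       :: n
    real(c_double), intent(in)  :: x(n)
    real(c_double), intent(out) :: y(n)
    y = 2.0d0*x    ! placeholder work
end subroutine compute[/fortran]

Because the DLL is loaded by MATLAB rather than launched from Visual Studio, the usual debugging route is to attach to the MATLAB process, as described above.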

Stephen.
Steven_L_Intel1
Employee
You can debug a DLL by specifying the path to the EXE in the Debugging property page, but if the slowness exists without running under the debugger, then it isn't debug-related.
eos_pengwern
Beginner

D'oh! All this turned out to be due to an error in my own code...

Many layers down, in a routine called from every part of the application, I had left a loop of the form:
[fortran]do i = 1, n    ! n ~ 500
    struct%A(:,j,k) = struct%x(:,j,k) + struct%y(:,j,k)
end do[/fortran]
...which was an artefact of a change I had made to the looping architecture at the same time as introducing the structures. Since the loop index i never appears in the body, the same assignment was being executed ~500 times over - which naturally slowed things down to a huge degree without affecting the result. In the Release compilation, the compiler either spotted this and eliminated it, or else the vectorization worked so well that the time wasted was negligible.
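For completeness, the fix presumably amounts to removing the redundant outer loop so the assignment runs once (a sketch only; the j/k indexing is taken from the snippet above, and the rest of the routine is not shown in the thread):

[fortran]! Hypothetical corrected form: the redundant 'do i = 1, n' wrapper is gone,
! so the array assignment executes exactly once for the given j and k.
struct%A(:,j,k) = struct%x(:,j,k) + struct%y(:,j,k)[/fortran]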

So it was all my fault, not the debugger's. I'm sorry.

Stephen.