Hi All,
I have a strange bug in a large numerical simulation model built with the Windows Intel compiler (most recent release). I apologize that the details are so brief, but that is largely due to the fact that I am completely stumped about what is wrong. The code runs all example problems correctly in debug mode, and in release mode only when optimization is set to /O0 (not with /O2 or /O3).
I do have the Floating-Point Model set to Source (/fp:source) and Floating-Point Speculation set to Fast. I have tried toggling every optimization option on and off, but the only setting that works is /O0.
Any help would be greatly appreciated.
I have identified that it does work with /O3 when I set Inline Function Expansion to Disable.
Any comments regarding this would be greatly appreciated.
I apologize; I tried to delete this post but cannot. Shortly after writing it, I received an email saying that one subroutine without IMPLICIT NONE was being passed a wrong variable name. The program now runs fine.
It is still very strange that, with that error present, I would get a correct result in debug mode but not in release mode.
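For anyone who hits something similar, here is a minimal sketch (hypothetical names, not the original code) of the class of bug described above. Without IMPLICIT NONE, Fortran's implicit typing rules silently treat a misspelled name as a brand-new variable instead of raising a compile-time error:

```fortran
! Hypothetical illustration: without IMPLICIT NONE, the typo "detla"
! is implicitly declared as a new, uninitialized REAL variable
! rather than flagged as an undeclared name.
subroutine accumulate(total, delta)
  real :: total, delta
  ! Intended: total = total + delta
  total = total + detla   ! typo compiles cleanly; value is undefined
end subroutine accumulate
```

In a debug build, uninitialized stack memory often happens to contain zeros, so the typo can go unnoticed; under optimization the compiler is free to reuse registers and stack slots, so the undefined value changes and the error surfaces. Adding IMPLICIT NONE (or compiling with an undeclared-variable warning enabled) makes the compiler reject the misspelled name outright.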
I apologize for wasting anyone's time with this post.
Thanks for letting us know. Optimization can sometimes uncover programming errors that go unnoticed without it.
