We have a static C++ library that encapsulates a socket layer. We recently performed a compiler upgrade to ICC on the Linux64 platform.
This static library is built at optimization level -O1, since we had earlier faced some issues with -O0.
We have one shared object which uses this static library (optimized at -O1) and another shared object which is not explicitly optimized (default, -O3).
Now we are observing core dumps for our process, and the stack trace points to some functions of this static library.
Basically, we are unable to pinpoint the root cause of the core dumps because of the optimization.
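For context, the build we use looks roughly like the following sketch. The file and library names are hypothetical, and the exact flags are assumptions based on the description above:

```shell
# Static library built at -O1 (hypothetical file names)
icc -c -O1 capsule.cpp -o capsule.o
ar rcs libcapsule.a capsule.o

# Shared object that links the -O1 static library
icc -shared -fPIC -O1 user1.cpp -L. -lcapsule -o libuser1.so

# Second shared object built without an explicit optimization level
icc -shared -fPIC user2.cpp -o libuser2.so
```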
Can there be any issue with the coexistence of optimized and non-optimized libraries?
Please let us know if you have any idea regarding this.
Thanks in advance,
This seems confusing; for example, how did -O3 become a default?
-O1 may be a good option, as opposed to the default -O2, depending on your application, but I wouldn't count on it to avoid all interprocedural optimizations. If you want -O0, which may have been the default in a gcc build, note that icc defaults to -O0 only when it is implied by -g or -debug.
Are you referring to something which might be helped by "-debug inline-debug-info", which ought to allow you to identify locations in a backtrace even within inlined functions? You can combine that option with -O1 if you want reasonably compact code in your libraries, and you could invoke additional options to restrict interprocedural optimization.
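A minimal sketch of how those options might be combined when rebuilding the static library. The source and library names are hypothetical, and -no-ip (to restrict single-file interprocedural optimization) is one possible choice of additional option:

```shell
# Rebuild the static library at -O1 with inline debug info,
# so backtraces can resolve locations inside inlined functions.
# -no-ip restricts additional interprocedural optimization.
icc -c -O1 -g -debug inline-debug-info -no-ip socket_layer.cpp -o socket_layer.o
ar rcs libcapsule.a socket_layer.o
```

With debug info present, the core dump's stack trace should map back to source lines even in the optimized library.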