
Executable crashes when optimization is enabled during compilation


globos

Programmer
Nov 8, 2000
Hi,

I have a strange problem. When I compile my code with option -O1 or -O2 (which respectively create the smallest code and the fastest code) and then try to run it, it crashes. If the code is compiled with optimization disabled, execution is fine.
Has anyone ever met the same problem? Maybe there are some typical kinds of source code that are not handled correctly by the optimizer but should be.

--
Globos
 
Optimization will do that. To optimize, it needs to make some assumptions about how you're using various parts of the language. Sometimes those assumptions turn out to be false for your program.

There may be a way to provide "hints" to your compiler about how you intend certain things to be used.
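
One classic hint is volatile. A minimal sketch, assuming (hypothetically) that the misbehaviour involves a flag shared with a signal handler or another thread; the names here are made up:

#include <csignal>

// Without `volatile`, an optimizing compiler may assume `done` never
// changes inside the loop, cache it in a register, and turn the loop
// into an infinite spin in the optimized build only.
volatile std::sig_atomic_t done = 0;

extern "C" void on_signal(int) { done = 1; }

int main()
{
    std::signal(SIGINT, on_signal);
    while (!done) {
        // do some work; `done` is re-read on every iteration
    }
    return 0;
}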
 
It is not a strange problem (it's the normal way of operations ;). You must look at your optimized program as a new one: start an 'optimized' debug stage (write and run tests, etc.) before production runs.
Selectively turn optimization off module by module, check output at checkpoints, use assertions, and so on.
Any optimization changes the memory (and stack) map and changes the order of floating-point calculations. It's another program!..
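
To act on the 'selective opt off' advice with Microsoft's compiler, you can switch optimization off around one suspect function with #pragma optimize. A minimal sketch (the function itself is hypothetical):

// Everything between the two pragmas is compiled without optimization,
// regardless of the /O1 or /O2 switch on the command line.
#pragma optimize("", off)
double suspect_calculation(double x)
{
    return x * x - 2.0 * x + 1.0;
}
#pragma optimize("", on)   // restore the command-line optimization settings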
 
>Any optimization changes the memory (and stack) map and changes the order of floating-point calculations. It's another program!..

I've been using g++ for a while and never noticed such weird crashes due to optimization settings. I agree that an optimized program is another program, but its behaviour must not change. To me, whatever optimization is done, the compiler must at least ensure that the non-optimized and optimized versions of an application give the same results (not in terms of speed, memory use, etc.).
If optimization makes some programs crash, I don't see any benefit in enabling it.

Damned C++, damned Microsoft.

--
Globos
 
Another program == another behaviour.
Damned C++?.. It's our profession, that's all.
It's an absolutely ordinary situation, and not only in C++.
Don't worry: localize the offending module, separate that code and compile it without optimization. Keep it up!
Very often an optimization exposes code/algorithm mistakes that stay hidden in debug mode...
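
A typical example of such a hidden mistake is an uninitialized local: debug and optimized builds leave different garbage in it, so only one of them appears to work. A hypothetical sketch:

#include <cstdio>

// BUG: `sum` is never initialized. A debug build and an optimized build
// place it differently (stack fill pattern vs. register leftovers), so
// the results can differ or the program can misbehave in only one of them.
int total(const int* values, int count)
{
    int sum;                     // should be: int sum = 0;
    for (int i = 0; i < count; ++i)
        sum += values[i];
    return sum;
}

int main()
{
    int data[] = { 1, 2, 3 };
    std::printf("%d\n", total(data, 3));
    return 0;
}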
 
> Another program == another behaviour
Regardless of time and memory consumption, the optimized and non-optimized versions of an application must give the same behaviour, i.e. achieve the same results, follow the same control flow, and so on.

When I was talking about a "crash" I meant that the optimized application just stops at some point with an undefined error, which does not happen with the non-optimized one. That is just unacceptable; we can't say this is an absolutely ordinary situation. I don't see why it should be considered normal that optimization might be unsafe in certain obscure cases.

Anyway, as you said, I must have done something wrong in C++ and optimization just exposes a vicious bug. For the moment I can't see what the bug is. Microsoft's compiler is the perfect culprit for now :)

--
Globos
 
ALL optimizing compilers are guilty 'primordially'. By the way, optimized code does not have the same control flow by definition: it has a different timing profile, different instruction/data cache demands, and so on.

Any programming language standard leaves a number of issues 'undefined' or 'implementation-defined'. Small deviations may be masked in a straightforward debug-mode build but show themselves in the dramatically reordered control/data flows of an optimized one...
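
A small, concrete illustration of results that legitimately change when the order of floating-point calculations changes (for instance under reordering options such as /fp:fast or -ffast-math):

#include <cstdio>

int main()
{
    double a = 1e16, b = -1e16, c = 1.0;
    double left  = (a + b) + c;   // 0 + 1  -> 1.0
    double right = a + (b + c);   // c is absorbed by the huge b -> 0.0
    std::printf("%g %g\n", left, right);   // prints: 1 0
    return 0;
}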

Yes, compiler implementors make an effort to generate 'equivalent' code. But remember: the code-equivalence problem is algorithmically undecidable. Sometimes they use (more or less reliable) heuristics to reorder, move or eliminate code. In the end they are just programmers (like you and me). That's enough...

Problems with optimized programs are well known; it's a pity that you never heard about them. Any (real-world) project must include some time and effort to test and debug the 'optimized' product code and to evaluate the real effect (which may be zero, or worse) - this is very common practice.

Alas, this is philosophy. But your original post is philosophy too. It goes without saying: there is no specific class of 'bad' source code with a (documented ;) no-optimization idiosyncrasy...
 
Most compiler documentation provides plenty of help about what the optimisation actually does. It's well worth reading it.

Some of it is very easy: As a really silly example, when compilers optimise evaluation of boolean expressions they stop as soon as they know the answer (false and anything is false, so why find out what the "anything" is?). But many compilers allow you to evaluate the whole expression as an alternative option. This shouldn't be a problem, but if you're the sort of person who likes to change global variables in functions returning boolean values then you can get in a terrible mess because the "optimised" version doesn't call your function.
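
A small sketch of that trap (all names made up): the side effect buried in a boolean function is skipped whenever short-circuit evaluation already knows the answer:

#include <cstdio>

int g_calls = 0;

bool log_and_check(int x)
{
    ++g_calls;                 // side effect hidden in a boolean function
    return x > 0;
}

int main()
{
    bool ready = false;
    if (ready && log_and_check(42)) {   // never called: `ready` is already false
        std::puts("both true");
    }
    std::printf("calls = %d\n", g_calls);   // prints: calls = 0
    return 0;
}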

 
I have seen this problem a few times. Usually it is a bug in the code that is masked by luck when optimizations are turned off. Once, it was a CString that was optimized away completely along with a separate function call. Adding a line of code to give that CString a separate instantiation and initialization did the trick.

Regardless of the source of the problem, the way to fix it that I've seen work is to simply debug the app with optimizations turned on. Turn on the option to Stop Always for all exceptions, and see where the crash occurs. Use the debugger like you would with a normal debug build. That should help you find it or narrow it down a little.
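
For that to work you need debug information in the optimized build. Typical flags (the file name is hypothetical and the exact options depend on your project):

cl /O2 /Zi /EHsc app.cpp /link /DEBUG      (Microsoft compiler)
g++ -O2 -g app.cpp -o app                  (g++)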
 