Hello,
I am running a lightly randomized heuristic search on a problem, and I have encountered something quite peculiar.
When I run the program with no compiler optimization, I obtain results that are consistently worse than when I enable optimization (gcc's -O3 flag). By "results" I mean the actual solution found, the goal being the solution with the lowest possible cost. I am not talking about memory usage or CPU time, which I assumed were the only two things optimization could affect.
I run both versions of the search on the same machine, the same OS, the same everything, except for the optimization flag. I have run both versions several times, and the results are always consistently better for the optimized version.
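To make the setup concrete, here is a hypothetical minimal sketch of the kind of loop I am running; the names and the cost function are illustrative, not my actual code. With a fixed seed I would have expected the -O0 and -O3 builds to print identical output:

    /* Hypothetical minimal sketch of my search loop; names and the
       cost function are illustrative, not the real code. */
    #include <stdio.h>
    #include <stdlib.h>

    /* Stand-in for my real cost function (does floating-point math). */
    static double evaluate_cost(double x) {
        return (x - 3.14) * (x - 3.14) + 0.5 * x;
    }

    int main(void) {
        srand(42);  /* fixed seed, identical in both builds */
        double best_x = 0.0;
        double best_cost = evaluate_cost(best_x);

        for (int i = 0; i < 100000; i++) {
            /* random perturbation of the current best candidate */
            double candidate = best_x + ((double)rand() / RAND_MAX - 0.5);
            double cost = evaluate_cost(candidate);
            if (cost < best_cost) {  /* floating-point comparison steers the search */
                best_cost = cost;
                best_x = candidate;
            }
        }
        printf("best_x=%.17g best_cost=%.17g\n", best_x, best_cost);
        return 0;
    }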
I am wondering whether it makes sense for compiler optimization to change the solutions a program finds. I was under the impression that a program's results should be identical regardless of optimization level. Is it possible that optimization affects the random number generator?
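In case it matters, here is a sketch of the check I plan to run in both builds (it assumes the search uses the C library's rand()): if the -O0 and -O3 binaries print different numbers here, the generator itself would indeed be affected.

    /* Sketch: print the first few raw values from the generator,
       assuming the C library's rand(), and diff the two builds' output. */
    #include <stdio.h>
    #include <stdlib.h>

    int main(void) {
        srand(42);  /* same fixed seed as the search */
        for (int i = 0; i < 5; i++)
            printf("%d\n", rand());
        return 0;
    }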
Thank you for your help,
Marc