Hi,
I am just messing around in a pre-existing project, and I noticed that if I use something like:
Code:
double a = 1.354;
double b = 5617963.0 + a;
Stepping through in the debugger, I'm getting:
a = 1.3540000000000001
b = 5617964.5000000000
As if the addition of two doubles is only using single precision?
Yet if I create a dummy test project and copy in those exact same lines I get:
a = 1.3540000000000001
b = 5617964.3540000003
Which seems much more like how a double should perform. Both projects use Visual Studio 2008, so my question is: why is the former losing so much precision, and is there an option in VS that would cause it to do that?
Edit:
Even when I make sure both operands are doubles, I still get the same result:
Code:
double a = 1.354;
double b = 5617963.0;
double c = b + a;
Thanks.