
atof()
I use atof() in a function but I get strange values..
When converted, I get the following:
3.4 -> 3.39999999999999991118215802999
3.3 -> 3.29999999999999982236431605998
3.2 -> 3.20000000000000017763568394002
2.1 -> 2.10000000000000008881784197001
Code:
string temp_num = "";
char dot = '.';
// collect consecutive digits and the decimal point
while (isdigit(temp_string[n]) || temp_string[n] == dot)
    temp_num += temp_string[n++];
number_value = atof(temp_num.c_str());
What is wrong here? atof() itself?
I also tried atol(), but it wouldn't convert the decimal part?
I know someone else asked the same question, but using a correction factor won't solve the problem in my case.. :(

"...but i get stranges values"
I don't see any strange values. Where are they?
"what is wrong here?"
Where's here?
"I also tried atol() but it wouldnt convert to decimal?"
Maybe that's why they hit upon the function name "alpha to long"?
You aren't going to get an exact representation of most base-10 fractions in base 2, which is what computers use. A double is still accurate to roughly 15 significant decimal digits, though, which is why your printed values only diverge after that point.


Yep, binary floating point isn't exact -- and note that atof() already returns a double, so switching types won't change anything. You're either going to have to deal with it (round when printing, compare with a tolerance), or use a decimal/fixed-point representation.