how do you make an integer accept a decimal value?
I know i knew this, but now...and it's not in the faq
somebody please help
Use a float, not an int?
use a double

Quote:
how do you make an integer accept a decimal value?
e.g.:
double n = 6.0;
U R right, pretty simple, my friend
cj
u could cast it.
int variable;
int final;
final = double(variable);
errr... no. final is an int and will only ever hold an int value. The use of the implied cast is also incorrect, you meant this:

Quote:
Originally posted by Ride -or- Die
u could cast it.
int variable;
int final;
final = double(variable);
>final = (double)variable;
... but it wouldn't store the digits to the right of the decimal point, if there were any. Of course, there can't be any though, as variable is an int as well.
wow....
thanks for all the alternatives, but i think i'll stick with "float".
:D Sorry to trouble you with such a simple question.
I don't see anything wrong with double(variable).

Quote:
Originally posted by Hammer
>>final = double(variable);
The use of the implied cast is also incorrect, you meant this:
final = (double)variable;
It's a C++-style cast.
Nevertheless, the cast is useless.
I don't think we can cast with double(variable).
It's just like this, isn't it?

Code:
int x = 5;
double y = 9;
x = y;
>>It's a C++-style cast.
Oops, you're right, I didn't spot that :rolleyes:
>>Nevertheless, the cast is useless.
That was my main point too.