Hi! I'm stuck on a new problem here, and I would really appreciate some help.
Let's see... what I want to know is how many years it will take for the variable money to double itself at an interest rate of 5% per year. For example, if I start with money = 5, how many years will it take until money = 10?
OK, I wrote this code (the money2 and money3 variables are unnecessary, but I put them in just to make sure everything was OK):
#include <iostream>
using namespace std;

int main() {
    int x = 0; // years
    double money, money2, money3, rate = 0.05;
    cout << "Write money: ";
    cin >> money;
    money2 = money * 2; // the doubling target
    money3 = money;     // copy of the starting amount
    while (money <= money2) {
        money = money + (money * rate);
        x = x + 1; // count the year
    }
    cout << money3 << "\n" << money2 << "\n" << money << "\n" << x << "\n";
}
The input is 5, and the output looks like this:
5 (unnecessary)
10 (unnecessary)
10.395
In this case, WHY is the final value higher than 10? I wrote money <= money2 in the loop condition, so why does money end up above it?
And the other thing is: why is x (the years) always 15? If the input is 2, the output is still 15, and if I enter 1000 it's 15 again.
So those are the two questions I'm stuck on. What am I doing wrong?