Hi! I'm stuck with a new problem here, and I would really appreciate some help.
Let's see... what I want to know is how many years it takes for the variable money to double itself at an interest rate of 5% per year. For example, if I start with money = 5, how many years will it take until money = 10?
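Compounding yearly, that means the balance goes 5, then 5 * 1.05 = 5.25, then 5.25 * 1.05 = 5.5125, and so on; if I have the formula right, after n years it should be 5 * 1.05^n.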
Ok, I wrote this code:
Code:
#include <iostream>
#include <cstdlib>   // for system()
using namespace std;

int main()
{
    float money;    // starting amount, read from the user
    float money2;   // target: double the starting amount
    float money3;   // copy of the starting amount, kept so I can print it later
    double rate = 0.05;
    int x = 0;      // years

    cout << "Write money: ";
    cin >> money;

    money2 = money * 2;
    money3 = money;

    // add 5% interest once per year until the balance passes the target
    while (money <= money2)
    {
        money = money + (money * rate);
        x++;
    }

    cout << "\n";
    cout << x << "\n";       // years
    cout << money3 << "\n";  // starting amount
    cout << money2 << "\n";  // target
    cout << money << "\n";   // final balance
    system("pause");
    return 0;
}
The money2 and money3 variables are unnecessary, but I put them in just to make sure everything was OK.
The input is 5, and the output goes like this:
15
5 (unnecessary)
10 (unnecessary)
10.395
In this case, WHY is the last value higher than 10??? I mean, I put money <= money2 in the code, so why is it higher?
And the other thing is: why is x (the years) always 15? If I input 2, the output is the same: 15, and if I put in 1000 it's again 15.
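To try to see what's going on, I was thinking of printing the balance after each year inside the loop. Here is a minimal sketch of what I mean (same logic as above, I just hard-coded the start at 5 and renamed some variables; not tested):
Code:
#include <iostream>
using namespace std;

int main()
{
    double money = 5.0;        // hard-coded starting amount, just for this test
    double target = money * 2; // stop once the balance passes double the start
    double rate = 0.05;
    int year = 0;

    while (money <= target)
    {
        money = money + (money * rate);
        year++;
        cout << "year " << year << ": money = " << money << "\n";
    }
    return 0;
}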
So... those are the two questions I'm stuck with. What am I doing wrong?