I'm trying to calculate the sum of 1/1 + 1/2 + ... + 1/100.

Code:
#include <stdio.h>

int main() {
    int i;
    float res;

    res = 0;
    for (i = 1; i <= 100; i++) {
        res = res + 1/i;
    }
    printf("%f", res);
    getchar();
    return 0;
}
result: 1????????????????????
You're doing integer division, where the remainder is thrown away: 1/1 evaluates to 1, and every other term 1/i for i > 1 evaluates to 0.
So the answer is just 1.
Usually, it is best to make the types explicit in the code:
res = res + 1.0f/(float)i;
Strictly, only one of the two operands needs to be floating-point for the division to be performed in floating point, but writing both out makes the intent clear.