Hi,
I have an exam coming up and I need to complete this mock exam. If I post the code, can anyone tell me if my understanding is correct and where to look next?
What will this code output:
Here is what I understand so far. First, i and j are defined as integers, both with the value 1. Then there is a for loop: the initialisation sets i = 0, the loop continues while i is greater than zero, and 1 is subtracted from i on each pass.

Code:
int i = 1, j = 1;
for (i = 0; i > 0; i--) {
    j += i * (i - 1);
}
printf("%d", j + 1);
The problem I have up to this point: i is initially set to 1, but then the loop immediately sets i to 0. Which value applies inside the loop? Also, what does += mean?