I'm still not very skilled at finding time complexity, so please bear with me to the end. I have three examples here, and while I know how the analysis is supposed to go in theory, I still have doubts about how to actually find the complexity in practice.
First example:
Code:
void f(int n)
{
    int j, s;
    for (j = 0, s = 1; s < n; j++, s *= 2)
        printf("!");
    double values[j];
    for (int k = 0; k < j; k++)
        values[k] = 0;
    while (j--)
        for (int k = 1; k < j; k++)
            values[k] += 1.0 / k;
}
The time complexity for it should be O(log^2(n)).
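To show what I mean by "practically", here is an instrumented variant I wrote myself (the name count_ops is mine, and I replaced the printing and the array updates with a counter, which shouldn't change the growth rate): it counts how many times each loop body runs so I can compare the total against log2(n)^2. I'd like to know whether this kind of empirical check matches the formal reasoning:

```c
/* My instrumented variant of the first example (count_ops is my name):
 * counts loop-body executions instead of doing the real work.
 * This is only an empirical sanity check, not a derivation. */
long count_ops(int n)
{
    long ops = 0;
    int j, s;
    for (j = 0, s = 1; s < n; j++, s *= 2)
        ops++;                       /* doubling loop: ~log2(n) steps */
    for (int k = 0; k < j; k++)
        ops++;                       /* initialization: j ~ log2(n) steps */
    while (j--)
        for (int k = 1; k < j; k++)
            ops++;                   /* dominant part: ~log2(n)^2 / 2 steps */
    return ops;
}
```

For n = 1024 (so log2(n) = 10) this returns 56, which is on the order of 10^2 / 2, so the O(log^2(n)) answer seems plausible to me, but I'd still like the step-by-step argument.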
Second example:
Code:
void f(int n)
{
    for (int i = 1; i <= n; i++)
        for (int j = 1; j <= n * n / i; j += i)
            printf("*");
}
The time complexity for it should be O(n^2).
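Here too I tried an empirical check (count_stars is my own name, not from any reference): for a fixed i the inner loop runs roughly n*n/(i*i) times, and since the series of 1/i^2 converges to a constant (about 1.64), the total should stay within a constant factor of n^2. My instrumented version, if I did this right:

```c
/* My instrumented variant of the second example (count_stars is my name):
 * counts how many times printf("*") would execute. For each i the inner
 * loop takes about n*n/(i*i) steps; summing over i gives roughly
 * 1.64 * n^2, i.e. O(n^2). */
long count_stars(int n)
{
    long ops = 0;
    for (long i = 1; i <= n; i++)
        for (long j = 1; j <= (long)n * n / i; j += i)
            ops++;                   /* ~ n*n / (i*i) iterations for this i */
    return ops;
}
```

For n = 100 the count lands between 100^2 and 2 * 100^2, and doubling n roughly quadruples it, which at least agrees with O(n^2).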
Third example:
Code:
void g(int n, int i)
{
    if (i * i > n) return;
    g(n, i + 1);
    printf("#");
}
void f(int n) { g(n, 0); g(n, n / 2); }
The time complexity for it should be O(sqrt(n)).
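For this one I counted the recursive calls (g_count, f_count, and call_count are my names): g(n, 0) keeps recursing while i*i <= n, so it should go about sqrt(n) levels deep, while g(n, n/2) returns immediately once n > 4 because (n/2)^2 > n. Assuming I instrumented it correctly:

```c
/* My instrumented variant of the third example: counts how many times g
 * is entered. The counter and function names are mine. */
static long call_count = 0;

void g_count(int n, int i)
{
    call_count++;                    /* one unit of work per call */
    if ((long)i * i > n) return;     /* stops once i passes sqrt(n) */
    g_count(n, i + 1);
}

long f_count(int n)
{
    call_count = 0;
    g_count(n, 0);                   /* about sqrt(n) + 2 calls */
    g_count(n, n / 2);               /* a single call once n > 4 */
    return call_count;
}
```

For n = 10000 (sqrt(n) = 100) I count 103 calls in total, which looks like sqrt(n) plus a small constant, so O(sqrt(n)) seems right, but again I'd like the reasoning spelled out.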
I know there is a general method for calculating complexity, and I'm already aware of it, but how it actually plays out on real code is still not clear to me. If you can give me step-by-step hints as to why the answers to these three examples are correct, thanks in advance; it would be much appreciated.