A while ago I came across a piece of code with a for loop that I don't understand. Its purpose is to count the number of set bits in an integer.
Code:
#include <stdio.h>

int main(void)
{
    unsigned int i;
    unsigned int v = 14;

    for (i = 0; v; v >>= 1) {
        i += v & 1;
    }
    printf("# bits %u\n", i);
    return 0;
}
My question is: what exactly is happening here? Especially the third expression in the "for" statement (some shifting):

v >>= 1

and the line inside the loop body (is this ANDing one bit or the whole value?):

i += v & 1;

I have no problem understanding bits and bytes. The result here is of course 3 bits, since 14 is 1110 in binary.