Can you explain these bitwise operations?
I am working with a file that has a header where the size of the file is encoded into 4 bytes, described like this in the ID3v2 spec:

The ID3v2 tag size is encoded with four bytes where the most significant bit (bit 7) is set to zero in every byte, making a total of 28 bits. The zeroed bits are ignored, so a 257 bytes long tag is represented as $00 00 02 01.

I have found someone's code that puts this together into the integer value, but I don't understand why they do each step. Is there anyone here who can explain this code to me?
(I know the code's not C++, but I'm hoping you can explain what they're doing so I can convert it to C++.)
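(If I'm reading that example right, each byte carries only 7 data bits, so the size works out as (0x02 << 7) | 0x01 = 256 + 1 = 257.)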
//Read in the bytes (why do they read char instead of byte?)
char[] tagSize = br.ReadChars(4); // I use this to read the bytes in from the file
//Store the shifted bytes (why is it int, not byte?)
int[] bytes = new int[4]; // for bit shifting
UInt64 size = 0; // for the final number
/* Why are they combining these bytes in this way if they're
 * going to again combine them below (in the line setting "size")? */
//How do they know they only care about the rightmost bit of the 3rd byte?
//How do they know to shift it 7 to the left?
bytes[3] = tagSize[3] | ((tagSize[2] & 1) << 7);
//Why do they use 63 here (I know it's 111111 in binary)?
//How do they know they only want the 2 rightmost bits of the 2nd byte?
//And how do they know to shift them 6 to the left?
bytes[2] = ((tagSize[2] >> 1) & 63) | ((tagSize[1] & 3) << 6);
bytes[1] = ((tagSize[1] >> 2) & 31) | ((tagSize[0] & 7) << 5);
bytes[0] = ((tagSize[0] >> 3) & 15);
//How do they know to shift these bytes the amounts that they do to the left?
size = (UInt64)bytes[3] | ((UInt64)bytes[2] << 8) | ((UInt64)bytes[1] << 16) | ((UInt64)bytes[0] << 24);
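For reference, here is the C++ I'm hoping to end up with once I understand the steps above. It's just my own sketch of a direct 7-bit decode (the function name and the assumption that the four header bytes are already in a buffer are mine, not from the original code); I believe it produces the same number as the repacking code above, but please correct me if the two differ:

#include <cstdint>

// Each of the four bytes carries 7 data bits (bit 7 is always zero),
// so combine them with 7-bit shifts instead of repacking into octets.
std::uint32_t DecodeSyncsafeSize(const unsigned char b[4])
{
    return (static_cast<std::uint32_t>(b[0] & 0x7F) << 21) |
           (static_cast<std::uint32_t>(b[1] & 0x7F) << 14) |
           (static_cast<std::uint32_t>(b[2] & 0x7F) << 7)  |
            static_cast<std::uint32_t>(b[3] & 0x7F);
}

// Sanity check against the spec's example: {0x00, 0x00, 0x02, 0x01} -> 257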