I am reading 4 individual bytes via a serial port. The 4 bytes combined represent a number in decimal. How do I calculate the number in code? I can do it on paper...
For example, I read the 4 bytes 0x00, 0x00, 0x0E, 0x1C (most significant first).
Taken together, these represent the number 0x00000E1C (or 3612 in decimal).
I know that. How do I get the computer to calculate it for me, short of a long drawn-out separation of the hex digits using strings, etc.?