Hi guys, there's something about the size of data types that confuses me and doesn't make sense to me; I'd appreciate some elaboration on this point.
Let's assume I write in my code:
short x = 10000000000000000000;
If I then use printf to print x, the number that's shown isn't the value I assigned to the variable x, and that's because short has a limited size. Up to here, everything is fine for me.
What confuses me is how the PC prints a different value even though I assigned a specific value to x with
short x = 10000000000000000000;
I know the size of a short is limited, and I know the value I assigned is out of that limit, but what does that actually mean, and why does it affect the value that gets printed? Why won't the PC print x as-is, even though its value is outside the size limitation of the data type?
Thanks