Question on the Bits...
Okay, here goes:
So technically, most things are running on 32 bits, right? You're running a 32-bit OS and I'm running a 32-bit OS. (I think) And well... if 32 bits defines the range of, say, integers, what will happen when we get to 128-bit standards? Will the ranges increase? And if so, by how much? Like an int can hold some 2 billion on a 32-bit machine. How much could it hold on a 128-bit machine?
Or am I totally misguided on this? :)
On 16-bit machines an integer is typically 16 bits, and on 32-bit machines 32 bits; the exact size is compiler-dependent.
If you go from a 32-bit machine to a 128-bit machine and your compiler supports 128-bit integers, then variables of type integer can hold larger values than on 32-bit machines.
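You can actually check what size your compiler/platform picked for the C int type. A minimal sketch, assuming CPython with the standard ctypes module (which mirrors the platform C types):

```python
# Inspect the size, in bits, of the platform's C "int" and "long" types.
import ctypes

int_bits = ctypes.sizeof(ctypes.c_int) * 8    # bits in a C int
long_bits = ctypes.sizeof(ctypes.c_long) * 8  # bits in a C long
print(int_bits, long_bits)  # e.g. 32 64 on a typical 64-bit Linux box
```

On most modern platforms int stays 32 bits even on 64-bit hardware, which is exactly the "compiler dependent" point above.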
Max values can be calculated like this (for unsigned integers; a signed type gives up one bit for the sign, so its max is 2^(n-1) - 1):
32 bit -> max value is 2^32 - 1
128 bit -> max value is 2^128 - 1
So the difference is (2^128-1) - (2^32-1).
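Since Python integers have arbitrary precision, the arithmetic above can be done exactly, with no overflow and no scientific-notation rounding:

```python
# Exact unsigned maxima for 32-bit and 128-bit integers.
max32 = 2**32 - 1
max128 = 2**128 - 1
print(max32)           # 4294967295
print(max128)          # 340282366920938463463374607431768211455
print(max128 - max32)  # the difference computed above, exactly
```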
wow, unless I did my calculations wrong, that's a pretty big number. :P I got 3.4028236692093846346337460743177e+38. I hate scientific notation, lol. Sure, it serves its purpose, but it makes it impossible to get the real scope of a number. Anyway, it's big. Up in the trillions of trillions of trillions (literally), it appears.
( 340,282,366,920,938,463,463,374,607,431,768,211,455 exactly, for 2^128 - 1 ? )