Okay, here goes:
So technically, most things are running on 32 bits, right? You're running a 32-bit OS and I'm running a 32-bit OS (I think). And well... if 32-bit defines the range of various values, say integers, what will happen when we get to 128-bit standards? Will the ranges increase? And if so, by how much? Like, an int can hold around 2 billion on a 32-bit machine. How much could it hold on a 128-bit machine?
Or am I totally misguided on this?