I'm on a chapter in my C book about integer types. It talks a bit about short/long/signed/unsigned. For example, it says that long covers a certain range of numbers, and the same goes for short. But how am I supposed to know what this range is? The book says it's different for different computers.
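
While searching around I saw that there's a header called <limits.h> with constants for these limits, so I'm assuming I could just print them to see what my own machine uses. This is only my guess at how to check, a rough sketch:

#include <stdio.h>
#include <limits.h>

int main(void)
{
    /* these constants come from <limits.h> and describe this machine's ranges */
    printf("short: %d to %d\n", SHRT_MIN, SHRT_MAX);
    printf("int:   %d to %d\n", INT_MIN, INT_MAX);
    printf("long:  %ld to %ld\n", LONG_MIN, LONG_MAX);
    printf("unsigned int max: %u\n", UINT_MAX);
    return 0;
}

Is that the right way to find out, or is there a better way?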

Say I had an int, for example:

int dogs = 10000000;

Is this supposed to be just an int, or a long?

long int dogs = 10000000;
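
In case it matters, here's a sketch of how I'm assuming I could check on my own computer whether that number even fits in a plain int. The sizeof operator and INT_MAX are things I found while searching, not from the chapter itself, so I'm not sure this is the intended approach:

#include <stdio.h>
#include <limits.h>

int main(void)
{
    long int dogs = 10000000;   /* the same value as in my question */

    /* sizeof reports how many bytes each type takes on this machine */
    printf("sizeof(int)  = %zu bytes\n", sizeof(int));
    printf("sizeof(long) = %zu bytes\n", sizeof(long));

    /* the value only fits in a plain int if INT_MAX is at least that big */
    if (10000000 <= INT_MAX)
        printf("10000000 fits in an int on this machine\n");
    else
        printf("10000000 needs a long here\n");

    printf("dogs = %ld\n", dogs);
    return 0;
}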

A friend also told me not to worry about this since computers now have enough RAM, but I still want to understand it.