Hello,

I have a few questions about memory. I came across these lectures today while browsing the web:
Free Online Computer Science Course | Bits | Harvard Extension School

While listening to the lectures about entropy and bit storage, I couldn't help but feel like I was using way too much data over the course of my programs. In situations where I use built-in data types like int that never get anywhere near their maximum value, am I wasting a lot of data at runtime if the numbers in my program stay under 1000?
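To make that concrete, here is a small sketch of what I mean (I'll use C++ just as an example language; the variable names are made up):

    // The variable occupies the full fixed width of its type,
    // no matter how small the value stored in it actually is.
    #include <cstdint>
    #include <iostream>

    int main() {
        int32_t small_value = 42;   // the value itself needs only 6 bits
        int32_t big_value   = 900;  // this one needs about 10 bits
        // sizeof reports the width of the type, not of the value: 4 bytes each.
        std::cout << "sizeof(small_value) = " << sizeof(small_value) << " bytes\n";
        std::cout << "sizeof(big_value)   = " << sizeof(big_value)   << " bytes\n";
        return 0;
    }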

If a 32-bit int is used for a number that has a max value of 1000 (and is never negative), am I wasting 22 bits for every integer I declare in the program? If I had, say, 1000 classes active that all carried these integers, would I be wasting 22,000 bits, since each value could be represented in 10 bits (2^10 = 1024 covers 0 through 1000), or possibly fewer using frequency-based compression concepts, instead?
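If I understand right, something like a C++ bit-field could reserve only the bits I actually need. A rough sketch of the comparison I'm imagining (struct names are made up, and I know padding/alignment means the savings aren't exact):

    #include <cstdint>
    #include <iostream>

    struct PlainHolder {
        int32_t value;        // full 32 bits reserved for a value that never exceeds 1000
    };

    struct PackedHolder {
        uint16_t value : 10;  // 10-bit bit-field; 2^10 = 1024 is enough for 0..1000
    };

    int main() {
        // Typically 4 bytes vs 2 bytes per object (the compiler may still pad).
        std::cout << "sizeof(PlainHolder)  = " << sizeof(PlainHolder)  << " bytes\n";
        std::cout << "sizeof(PackedHolder) = " << sizeof(PackedHolder) << " bytes\n";
        // Scaled up to the 1000 instances from my question:
        std::cout << "1000 PlainHolders:  " << 1000 * sizeof(PlainHolder)  << " bytes\n";
        std::cout << "1000 PackedHolders: " << 1000 * sizeof(PackedHolder) << " bytes\n";
        return 0;
    }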

If these do reduce space, is it storage space (the size of the file) that shrinks, or the runtime memory, i.e., what you would see used in RAM?
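Here is how I currently picture the difference, again as a C++ sketch (the file name is just hypothetical): the in-memory size is fixed by the type, while the on-disk size depends on how the value is written out.

    #include <cstdint>
    #include <fstream>
    #include <iostream>

    int main() {
        int32_t value = 1000;  // occupies 4 bytes in RAM regardless of its value
        std::cout << "In RAM: " << sizeof(value) << " bytes\n";

        // The same value can be written to disk with a narrower 16-bit type,
        // so the file holds only 2 bytes for it.
        uint16_t narrow = static_cast<uint16_t>(value);
        std::ofstream out("value.bin", std::ios::binary);
        out.write(reinterpret_cast<const char*>(&narrow), sizeof(narrow));
        std::cout << "On disk (binary, 16-bit): " << sizeof(narrow) << " bytes\n";
        return 0;
    }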

Sorry if one question turned into five. I just became paranoid that I was throwing tons of bits in the trash (declaring and allocating them but never using them) in my programs.

I realize I'm a bit confused on the subject.