Letting go of 'old ways'
Am I the only one who still uses small variable types when I know for a fact that the application will only run on 32-bit systems? I know that most of this stuff is getting the hell padded out of it, yet it just seems horrible to occupy 32 bits of memory when I only need maybe 3 bits of it.
Likewise, I conserve memory everywhere else I come across the opportunity. I have on occasion pored over my classes searching for memory-saving possibilities. And yet, the average computer in the store today has 128 MB or so of RAM. It's not like shaving a few KB off is speeding allocation up to any noticeable extent. But it's just *drum roll*... good coding practice.
Which brings me to many other points. So many of the things that we (well, _some_ of us) think of as standard practice are often irrelevant on the systems they will be run on. For instance: I avoid many functions due to their slowness or lack of optimization, yet I continue to avoid them, using my own more complicated measures, even when writing up some useless little application whose need for speed and optimization is nil. I optimize and spend countless hours paring down functions which aren't even close to bottlenecks in the application.
Why? Are we just holding on to the days when these little things were so important?
(Note: I'm not talking about ignoring good memory management, or sloppy code resulting in slow performance; I just wonder about how tiddly I (and I'm sure others) can get with code that doesn't need that kind of attention. Just because we can, does that mean we should waste the time to do it?)
I'm hoping to hear from more of the people who have coded a while, rather than people who don't think there's anything wrong with using a 10x10 2D array when every other row uses only 7 elements. ;)
That's one of the things that slows down my coding: looking for a smaller, more elegant way of doing something. Getting something to work is only half the job; getting it to work better, more efficiently, and more simply is the other half.
I know I don't exactly fit the criteria of respondent you wanted, but I've often thought about this when trying to make something more efficient. So I waste 3 KB of memory; if my program isn't memory/time/efficiency critical, then why don't I just leave it and go and play football or something? I have 512 MB of RAM anyway, and only I'm going to be using my progs.
I'm an undergrad chemist and things that used to be the norm are always being replaced by uglier yet more productive methods.
Despite what I think, it just feels wrong when coding to leave something you know is ugly, so instead of playing football I managed to squeeze back my 3 KB so it can sit there empty instead.
Actually, I feel the same way, l@d.
I often sit and wonder about it...
I can't stand it; if I can squeeze a little bit more out of it, I won't be able to sleep well till I do.
Even on the stupidest little things!
Maybe it's a compulsive disorder?
IMHO, it's all down to your principles and "upbringing". I try my best to make everything as efficient and neat as possible, and will spend hours doing so. I do it because I pride myself on doing what I think is right, regardless of the fact that the run-time differences will be practically non-existent :rolleyes:
What does get me, though, is data storage methods like packed decimal, binary-coded decimal, etc. I work with people who still code new programs that use these data layouts, and why? To save a byte here and there?! I mean, it's not like we're short on disk space these days. I can understand it if the data is going to be transmitted across a network; at least then you'll be sending half the number of bytes. But when the data doesn't leave the host it's created on, what's the point?
Anyway, "compulsive disorder" .... yep, sounds about right. :)
>>maybe its a compulsive disorder?
Yup, I'd say so too. I don't see myself cutting these habits anytime in the future, but I have to wonder about my sanity. Just because nobody is going to see it doesn't mean it shouldn't be done right. ... Yet at the same time I don't like to waste time...
>>is data storage methods
I am proud (and ashamed, sometimes) to say that I have finally thrown away any cares about the actual size of my application. The way I see it, I'm developing games. Games contain soooo much additional media that it makes the size of the actual executable and DLLs negligible. Yup, I'm a horrible, horrible person, born from the ashes of my former bloatware-hating self into a world where people have too much damn HD space to waste.
I do it, too. I hate my programming teacher - "Why are you using THIS here? We haven't gone over ANY of this. Just use the code I gave you!" - "You mean the 800-element array of apstrings (stupid, stupid class our county gives us)?". I'm sorry, but I can't stand wasting memory, or slowing down my app because I can do something the easy way - that is, unless I'm testing the logic of something. Then I'll go back and put in my optimizations once I'm sure it'll work; nothing's worse than trying to figure out where your logic's wrong when you've got 900 lines of optimizations on a string comparison :)
I abstract as much as possible... but I end up micromanaging anyway. I can stand using ints for numbers, but not for true and false values... 31 bits wasted.
Yes, 32 bits for single-bit values is the worst. I don't use int, in favor of being specific: short or long. I really only use int when I'm trying not to confuse some poor newbie.
The more experienced you are with C/C++, the better you can judge practical overuse of memory in a situation where it might increase performance or compatibility. On some of the middleware platforms you know that an int, a short int, or a long int will be implemented with a definite number of bytes, but this isn't true with unmanaged compilers.
While we're on the subject, if anyone wants to comment on the efficiency of GLib, I'd appreciate it.
From a software management standpoint, as long as it fits the requirement, it should be fine. Spending extra time doing unnecessary optimizations costs money. Let's say you were managing a large team project and every day costs you $50K. Would you want your team spending extra time doing extra work?
However, programmers have a sense of pride about their work and they have their own quality standards they want to meet before releasing the code. Keeps them happy and productive. This in itself is a cost saving factor.
All of this is assuming you're coding for a company.
You say it's a waste to use an int when you only need a char. But have you heard about alignment and padding?
On 32-bit Intel systems, an int is the fastest thing possible, so everything is aligned! You may use a bool, but the compiler makes it an int, because an int is faster...
So why not use an int in the first place?
With memory allocation it's different. But I prefer allocating a bit too much over allocating a second time and copying...
So it always comes down to the question: which do you prefer, small size or speed?
But I think today the most important thing is to finish the coding quickly :(
I'm the same way, lightatdawn, but I've recently worked more for clarity than overall efficiency. For example, if I use more storage and the program is easier to follow than if I had skimped, I use more storage. I'll often go for a doubly linked list over a singly linked one, a double instead of a float, or shellsort over quicksort if the shellsort does the job sufficiently. Clarity is one of my favorite ranting subjects. ;)
Yes, ease of use is one of the few things that will get me un-skimping. Doubly linked lists over singly linked ones are also my preference.
>>But do you have heard about alignment or padding?
Yes, I mentioned it in my first post: "I know that most of this stuff is getting the hell padded out of it, yet it just seems horrible to occupy 32 bits of memory when I only need maybe 3 bits of it." I'm aware of the speed issues, and speed wins over small memory enhancements. Optimized code uses optimized memory allocation in every way possible. BOOL over bool, etcetera.
I agree with you in many ways, l@d, but you also must consider the size of the project.
Think about 3ds Max, for example. It already uses a good 60% (at least) of a computer's resources on a 128 MB machine. If the developers of 3ds Max did no optimizing, just think of how much it could take up. A 128 MB machine probably couldn't even run it if no optimizing had been done.
So for little stuff it doesn't matter, but for a large, major project, optimizing can be everything.