Hey guys,

I think this question boils down to dynamically allocating memory. Is it more efficient to allocate memory once, if I know the maximum size I will need (even if I'm likely to use only a small fraction of it)?

Let me explain my issue and my possible approaches:

Per problem, I have a set number of variables, say 500. As I iterate through them, I need to save and keep track of a few variables of interest in a subset list. I have no idea how many that will be until I've gone through all of them, so it can be anywhere from 0 to 500 (in practice it's usually about 0-10). Once that's done, I need to iterate through the newly created subset, and only then do I know how large it is. The problem is that I don't know whether I need a given variable until I examine it, so the subset size is very dynamic.

Does it make sense to realloc() the subset list by one element every time I find a variable of interest? Or would it simply be better to allocate a subset array of size 500 up front, fill only a small part of it, and never touch the parts I don't need? Here's a rough sketch of what I mean by both:
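
(Untested sketch; is_of_interest() and the int variables are just placeholders standing in for my real data.)

#include <stdio.h>
#include <stdlib.h>

#define NUM_VARS 500

/* placeholder predicate -- in my real code this depends on the variable */
static int is_of_interest(int v)
{
    return v % 50 == 0;
}

int main(void)
{
    int vars[NUM_VARS];
    for (int i = 0; i < NUM_VARS; i++)
        vars[i] = i;

    /* Approach 1: grow the subset with realloc() each time I find a match */
    int *subset = NULL;
    size_t count = 0;
    for (int i = 0; i < NUM_VARS; i++) {
        if (is_of_interest(vars[i])) {
            int *tmp = realloc(subset, (count + 1) * sizeof *subset);
            if (!tmp) { free(subset); return 1; }
            subset = tmp;
            subset[count++] = vars[i];
        }
    }
    printf("approach 1 found %zu vars\n", count);
    free(subset);

    /* Approach 2: allocate the worst case (500) up front, use only part of it */
    int subset2[NUM_VARS];   /* could also be malloc(NUM_VARS * sizeof(int)) */
    size_t count2 = 0;
    for (int i = 0; i < NUM_VARS; i++) {
        if (is_of_interest(vars[i]))
            subset2[count2++] = vars[i];
    }
    printf("approach 2 found %zu vars\n", count2);

    return 0;
}

The second option wastes memory when the subset turns out tiny, while the first can call realloc() up to 500 times, which is the part I'm unsure about.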

I imagine both approaches would work; the first seems prettier, but it's really efficiency I'm looking for, and I'm quite a n00b at C.

Any thoughts or alternative approaches are very welcome!