>> Simply "growing" to fit every situation is wasteful.
How so? When the program starts up, all buffers are empty. The user requests @(20), and the buffer grows to fit. If the user never indexes past 20 for the rest of the program's execution, the buffer never grows again, so no memory is wasted. It seems very efficient to me.