Type: Posts; User: santos03
For now, I solved the problem by figuring out the max size, using resize and then erase.
Indeed, if the data sets get larger (I don't think it'll be the case, but I am not certain - will check it...
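The workaround described above can be sketched as follows. This is a minimal illustration, not code from the thread: the `Record` struct (9 doubles, 1 int, 1 bool) and the sizes are assumptions based on the description, and the `reserve()` variant is an alternative the poster could use to keep `push_back` without triggering reallocations.

```cpp
#include <cstddef>
#include <vector>

// Hypothetical record matching the description in the thread:
// 9 doubles, 1 int, 1 bool.
struct Record {
    double values[9];
    int    id;
    bool   flag;
};

// Workaround from the post: resize() to the known maximum up front,
// fill by index, then erase() the unused tail. The vector never
// reallocates while filling, so memory use never spikes.
std::vector<Record> fill_resize_erase(std::size_t max_size, std::size_t used) {
    std::vector<Record> v;
    v.resize(max_size);
    for (std::size_t i = 0; i < used; ++i)
        v[i].id = static_cast<int>(i);
    v.erase(v.begin() + static_cast<std::ptrdiff_t>(used), v.end());
    return v;
}

// Simpler alternative: reserve() allocates capacity without constructing
// elements, so push_back() can still be used with no reallocation.
std::vector<Record> fill_reserve(std::size_t max_size, std::size_t used) {
    std::vector<Record> v;
    v.reserve(max_size);
    for (std::size_t i = 0; i < used; ++i) {
        Record r{};
        r.id = static_cast<int>(i);
        v.push_back(r);
    }
    return v;
}
```

Note that `erase()` shrinks the size but not the capacity, so the resize-then-erase version still holds memory for `max_size` elements until the vector is destroyed or swapped with a trimmed copy.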
Thank you guys. I was not aware of this temporary doubling of the memory. I guess I'll have to give up push_back in this case and do it some other way.
Thank you for your help.
Hi,
I have a large vector of structures (each contains 9 doubles, 1 int, 1 bool). When I "push_back" one more entry (roughly the 17,000,000th), I get a bad_alloc exception. I am attaching a small...
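A rough back-of-the-envelope sketch of why this fails, assuming the struct layout described above and a typical growth policy (the exact growth factor and padding are implementation-defined, so the numbers are estimates, not facts from the thread):

```cpp
#include <cstddef>

// Hypothetical record from the post's description.
struct Record {
    double values[9];  // 72 bytes
    int    id;         // 4 bytes
    bool   flag;       // 1 byte, then padding to alignof(double)
};

// During a reallocating push_back, the old buffer and the new, larger
// buffer (commonly 1.5x-2x the old capacity) must coexist briefly.
// This estimates that peak footprint for n elements.
std::size_t peak_bytes(std::size_t n, double growth_factor) {
    return static_cast<std::size_t>(
        static_cast<double>(n) * sizeof(Record) * (1.0 + growth_factor));
}
```

With `sizeof(Record)` around 80 bytes, 17,000,000 elements already occupy about 1.3 GB, and a doubling reallocation briefly needs roughly three times that, which can exhaust a 32-bit process's 2-3 GB address space even when total RAM is sufficient.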