I'm about to undertake a fairly large research project, and I'll be using a cluster of shared Linux machines that are rebooted maybe once a year. I do not have superuser privileges on these machines. My previous C++ work on these machines did not require me to create my own classes or use pointers to them. This time I will have to, and while I would like to write a flawless program on the first go, I know that is not going to happen: even if my programs compile without a hitch, I might get careless somewhere and forget to properly release allocated resources.
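To be concrete about the kind of slip I mean, here is a minimal sketch (the `Grid` class and the allocation size are invented for illustration, not taken from my actual code):

    #include <cstddef>
    #include <vector>

    // Hypothetical class standing in for whatever I'll actually write:
    struct Grid {
        std::vector<double> cells;
        explicit Grid(std::size_t n) : cells(n) {}
    };

    void step() {
        Grid* g = new Grid(1000000);  // heap allocation I'm responsible for
        // ... use *g ...
        // careless exit: no `delete g;`, so this leaks on every call
    }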
So, given that I can never reboot these machines, and thus can't count on a reboot to reclaim whatever resources my program hogs even after it has terminated, what would be the worst-case scenario in such a situation? Take into consideration that I might run the program again and again. Also, are there any preventive measures I can take (e.g. running it only inside a debugger?) or any open source tools designed to prevent this kind of mishap?