Thread: Concerning delete/delete[] at program exit

  1. #1
    C++ Witch laserlight's Avatar
    Join Date
    Oct 2003
    Location
    Singapore
    Posts
    28,413

    Concerning delete/delete[] at program exit

Stroustrup begins his answer to his FAQ entry on "How do I deal with memory leaks?" by saying:
    By writing code that doesn't have any. Clearly, if your code has new operations, delete operations, and pointer arithmetic all over the place, you are going to mess up somewhere and get leaks, stray pointers, etc. This is true independently of how conscientious you are with your allocations: eventually the complexity of the code will overcome the time and effort you can afford. It follows that successful techniques rely on hiding allocation and deallocation inside more manageable types. Good examples are the standard containers. They manage memory for their elements better than you could without disproportionate effort.
    Well and good. At the same time, I have heard the argument that since (most of the common) operating systems reclaim memory used by a program when it quits, it is not necessary to use delete/delete[] on objects created with new/new[] that last till the end of the program, unless their destruction does something that the operating system will not do.

    After reading through several examples in the FLTK documentation, I assumed that either FLTK handled the freeing of memory, or the examples were relying on the operating system to do the job. Some searching brought me to a thread (WTF new -> delete) where a user had the same concerns.

    The responses basically went over this issue: some took the position that deletes at program exit were neither necessary nor harmful, others stated that they were not necessary and potentially harmful, and the thread starter insisted that they were necessary.

    So, what is your take on this?
    Quote Originally Posted by Bjarne Stroustrup (2000-10-14)
    I get maybe two dozen requests for help with some sort of programming or design problem every day. Most have more sense than to send me hundreds of lines of code. If they do, I ask them to find the smallest example that exhibits the problem and send me that. Mostly, they then find the error themselves. "Finding the smallest program that demonstrates the error" is a powerful debugging tool.
    Look up a C++ Reference and learn How To Ask Questions The Smart Way

  2. #2
    and the hat of sweating
    Join Date
    Aug 2007
    Location
    Toronto, ON
    Posts
    3,545
    Well, what if the memory is allocated in shared memory?

    Also, if you run a tool that checks for memory leaks, it will probably find your intentional leak and cause some managers to assume the worst...

    If your code is modified several years down the road and your pointers' lifetimes change, the person making the changes might not notice the missing delete, and then you end up with a real leak...

  3. #3
    Algorithm Dissector iMalc's Avatar
    Join Date
    Dec 2005
    Location
    New Zealand
    Posts
    6,318
    I wouldn't buy any argument about it being "potentially harmful". That can only be nonsense.
    I think the best option is always to try to free everything upon exiting. If you use leak detection tools, it's much easier to spot a single new leak at the time you add it than to wade through 1000 leaks trying to find the one that is new. The chances of having a large number of one-off unimportant leaks are pretty small anyway.
    My homepage
    Advice: Take only as directed - If symptoms persist, please see your debugger

    Linus Torvalds: "But it clearly is the only right way. The fact that everybody else does it some other way only means that they are wrong"

  4. #4
    Registered User
    Join Date
    Apr 2006
    Posts
    2,149
    My take is that it is OK for non-beginners not to release memory at program termination, if there is a good reason, namely that it leads to much simpler code.

    In C, I can see an argument that it is much simpler to call exit(1) when a fatal error arises than to build an exception propagation system into every function to ensure that memory is freed. In C++, exception propagation is built in, and the use of smart pointers allows all memory to be freed easily when an error occurs. So in regular C++ I can think of no reason not to release memory on exit. Of course, my lack of creativity does not imply that no reason exists; I just can't think of one.
    It is too clear and so it is hard to see.
    A dunce once searched for fire with a lighted lantern.
    Had he known what fire was,
    He could have cooked his rice much sooner.

  5. #5
    and the hat of int overfl Salem's Avatar
    Join Date
    Aug 2001
    Location
    The edge of the known universe
    Posts
    39,659
    > The responses basically went over this issue, though some took the position that deletes
    > at program exit were neither necessary nor harmful, others stated that it was not necessary
    > and potentially harmful, and the thread starter insisted that it was necessary.
    There are two kinds of allocation which you need to watch out for:
    - allocations in response to some kind of event always need to be freed, otherwise they will erode the pool of available memory until there is no more and the program crashes.
    - allocations at start-up which are held for the life of the program. In most(*) cases, it is OK to let the OS catch these, though there is no need to be lazy about it IMO.

    I agree with iMalc, it's a lot easier to spot new (and potentially serious) leaks if you keep the program pretty clean to begin with.

    (*) Your average consumer-level desktop OS would be fine. An embedded RTOS without memory management would fare a lot worse.

    A lot of desktop programs seem to leak something or other if you run them for long enough. My guess is that the assumption is the user will quit the application or turn off the machine at some regular interval. But in say a data centre, or an embedded platform such as a mobile phone, people expect months of continuous "uptime", so there is no room for even the smallest leak.
    If you dance barefoot on the broken glass of undefined behaviour, you've got to expect the occasional cut.
    If at first you don't succeed, try writing your phone number on the exam paper.

  6. #6
    The larch
    Join Date
    May 2006
    Posts
    3,573
    I don't know about FLTK, but in wxWidgets controls are allocated with new, yet delete is rather rare: children are released by their parents, and the parent frame is freed with the Destroy member function.

    So this might be a library peculiarity.
    I might be wrong.

    Thank you, anon. You sure know how to recognize different types of trees from quite a long way away.
    Quoted more than 1000 times (I hope).

  7. #7
    C++まいる!Cをこわせ!
    Join Date
    Oct 2007
    Location
    Inside my computer
    Posts
    24,654
    Never let any memory you allocated go unfreed: that is my take on everything. Of course, it can't be helped if an error occurs and the program is killed implicitly (by a runtime error, for example), but you should nevertheless avoid such a thing.
    In C++, this argument is moot thanks to smart pointers and RAII; no resource should ever be leaked. If I find a memory leak in my program, whether it's due to not calling delete on memory at exit or a subtle leak somewhere else that I failed to free, I hunt it down and fix it.

    When you are in charge of memory allocation yourself, it's not the time to be lazy. All memory you allocate should be freed, even if it's automatically freed when the program exits. Anything else is just sloppy programming...
    Quote Originally Posted by Adak View Post
    io.h certainly IS included in some modern compilers. It is no longer part of the standard for C, but it is nevertheless, included in the very latest Pelles C versions.
    Quote Originally Posted by Salem View Post
    You mean it's included as a crutch to help ancient programmers limp along without them having to relearn too much.

    Outside of your DOS world, your header file is meaningless.

  8. #8
    C++ Witch laserlight's Avatar
    Join Date
    Oct 2003
    Location
    Singapore
    Posts
    28,413
    If your code is modified several years down the road and your pointers' lifetimes change, the person making the changes might not notice the missing delete, and then you end up with a real leak...
    Yes, this echoes Stroustrup's point, but with the added issue of long-term maintenance; unfortunately, the anonymous user who started that thread never mentioned it.

    I wouldn't buy any argument about it being "potentially harmful". That can only be just nonsense.
    Greg Ercolano claimed otherwise:
    But in actual practice, when you know the OS is working for you,
    you can actually save quite a bit of backflipping by leaving the OS
    to do the delete unless absolutely necessary.

    Write a few hundred programs, and you begin to realize the deletes
    aren't needed, and can sometimes cause program exit time to be
    so significant that it's actually better NOT to call all the destructors.
    Some programs are so pedantic about calling destructors that when you
    hit "File|Quit", they take minutes to exit. Whereas just exit()ing the
    program takes seconds, deallocating everything in one shot.
    Frankly, I have never experienced what his vague anecdotal evidence describes. I have used programs with slow startup time, not slow exit time, especially Java programs.

    I don't know about FLTK, but in wxWidgets controls are allocated with new but delete is rather rare: descendants will be released by parents, and the parent frame would be freed with the Destroy member function.

    So this might be a library peculiarity.
    Yes, that appears to be FLTK's behaviour. However, this is a separate point from delete at exit, since it concerns delete before exit.

    Never let any memory you allocated go unfreed, is my take on everything.
    Indeed, but these guys are saying that the OS is going to free the memory you allocated, so all is well. Note, of course, that the operating systems they have in mind are not some embedded RTOS without memory management, as Salem mentioned. If you raised this issue (as is my usual argument for always matching new/new[] with delete/delete[] respectively), I expect they would dismiss it with 'every operating system (that FLTK deals with) keeps track of all allocations and frees them without the need to call "delete"'.

    When you are in charge of memory allocation yourself, it's not the time to be lazy. All memory you allocate should be freed, even if it's automatically freed when the program exits. This is just so sloppy programming...
    How would you address Ian MacArthur's claim that:
    As has been explained elsewhere in this thread, any objects that are to be deleted on program termination will be automatically tidied away by the operating system when the program exits.

    Therefore, you only have to call the destructors for objects that are created and deleted dynamically as your program executes, or for objects whose destructor performs some action that the O/S can't or won't do for you on termination. (The standard fltk objects do not fall into that category.)

    So you can call them if you want to. And they will work.
    But if you do, your program will take a little longer to shut down, and it will make no difference at all to the O/S. There are no known resource leaks in 1.1.7.

    I don't know what you have been taught, or what you have read, but this actually does constitute correct and proper practice.
    Note that the emphasis is mine for the bold text.

  9. #9
    Kernel hacker
    Join Date
    Jul 2007
    Location
    Farncombe, Surrey, England
    Posts
    15,677
    I have never seen any evidence to suggest that normal memory allocations in an operating system like Linux or Windows would ever cause any problem by not being freed. The "exit()" function [or the system call that it results in] should take care of closing files, deallocating memory allocations, decrementing reference counts for shared resources and such, including device drivers.

    If this was not the case, all applications that crash would lead to memory leaks, resource leaks, etc.

    Unfortunately, there are drivers that don't "play nice", so sometimes, terminating a driver connection "the correct way" is necessary to prevent problems.

    But for common memory allocations and such, the only real problem with not deleting or freeing memory is that if you continually allocate memory that is never used again, you will eventually run out of it. This is more of a problem with long-running applications. Similarly, of course, if the creation of an object binds some system resource, you can't just allocate more and more of them, or the system will (probably) eventually run out of that type of resource.

    --
    Mats
    Compilers can produce warnings - make the compiler programmers happy: Use them!
    Please don't PM me for help - and no, I don't do help over instant messengers.

  10. #10
    C++まいる!Cをこわせ!
    Join Date
    Oct 2007
    Location
    Inside my computer
    Posts
    24,654
    Quote Originally Posted by laserlight View Post
    Greg Ercolano claimed otherwise:

    Frankly, I have never experienced what his vague anecdotal evidence describes. I have used programs with slow startup time, not slow exit time, especially Java programs.
    That's just nonsense.
    If your app takes a long time to shut down, why not just hide the GUI and let the program finish its work in the background? You could even set the thread priority to idle so that it goes unnoticed. I don't buy the argument that you shouldn't let the app handle cleanup just because it takes too long. That's just a sloppy argument.

    Indeed, but these guys are saying that the OS is going to free the memory you allocated, so all is well. Note, of course, that the operating systems they have in mind are not some embedded RTOS without memory management, as Salem mentioned. If you raised this issue (as is my usual argument for always matching new/new[] with delete/delete[] respectively), I expect they would dismiss it with 'every operating system (that FLTK deals with) keeps track of all allocations and frees them without the need to call "delete"'.
    POC to that. Who says the OS doesn't need to clean up what the application doesn't? Let's think about it for a while.
    If an application cleans up everything, it needs to free all the memory it owns, and so on. That may take some time.
    But if your app doesn't do this and just quits, then the OS must clean up everything the application left behind, which must take just about as much time. So there's no gain there, is there?

    How would you address Ian MacArthur's claim that:

    Note that the emphasis is mine for the bold text.
    I don't know who taught Ian that, but I don't consider it proper practice!
    It does make a difference to the OS, if you want to argue the point: if YOU don't clean up, IT has to, so in the end there's no benefit to having the OS do it.

    There are further bad practices involved in doing this as well. Leak detection tools will catch these leaks. You'll get bombarded with leak reports when using MFC, for example, since it automatically dumps any memory leaks when a debugging session finishes.
    Not to mention, what if there's a bug in the OS or it can't handle it very well?
    *cough* Windows 95 *cough*
    And what if this code is going to be ported to an OS which does not do this?
    Since this is a forum that teaches portability, I would further argue that you should clean up everything you use. No excuses such as "the OS does it for me." That sounds like a VB programmer who does not declare variables before using them, or who just uses variants for everything. Argh, I hate those.

  11. #11
    Cat without Hat CornedBee's Avatar
    Join Date
    Apr 2003
    Posts
    8,895
    But if your app doesn't do this and just quits, then the OS must clean up everything the application left behind, which must take just about as much time. So there's no gain there, is there?
    Not true. If, for example, you have memory allocated and you deallocate it, the allocator must maintain the integrity of its management data structures through every single deletion.
    If, on the other hand, the OS frees the memory, it simply marks all the memory pages used by your app as "free" again. Much faster.
    All the buzzt!
    CornedBee

    "There is not now, nor has there ever been, nor will there ever be, any programming language in which it is the least bit difficult to write bad code."
    - Flon's Law

  12. #12
    C++まいる!Cをこわせ!
    Join Date
    Oct 2007
    Location
    Inside my computer
    Posts
    24,654
    If the OS was designed that way, yes. But you can't be sure of that.
    And just leaving it to the OS is what I find sloppy, just like a VB programmer.

  13. #13
    Kernel hacker
    Join Date
    Jul 2007
    Location
    Farncombe, Surrey, England
    Posts
    15,677
    Quote Originally Posted by Elysia View Post
    If the OS was designed that way, yes. But you can't be sure of that.
    And just leaving it to the OS is what I find sloppy, just like a VB programmer.
    In the OS's view, memory exists in chunks of 4KB or multiples thereof - in fact, the Windows heap is created with "HeapCreate()", and its destruction can easily be done by just marking all of its pages as free - that's much quicker than walking all over the place calling destructors that delete member variables, and THEN freeing the entire heap anyway.


  14. #14
    C++まいる!Cをこわせ!
    Join Date
    Oct 2007
    Location
    Inside my computer
    Posts
    24,654
    Not calling destructors is very bad practice. This is another reason I'd give for explicitly deleting all data inside the program.
    Calling the destructors but not freeing the actual objects is also bad practice, since those objects will be left in a zombie state.

  15. #15
    Kernel hacker
    Join Date
    Jul 2007
    Location
    Farncombe, Surrey, England
    Posts
    15,677
    Quote Originally Posted by Elysia View Post
    Not calling destructors is very bad practice. This is another reason I'd give for explicitly deleting all data inside the program.
    Calling the destructors but not freeing the actual objects is also bad practice, since those objects will be left in a zombie state.
    Well, we're assuming that we have done the necessary work to "finish" the program. What, then, is the problem with just exiting? What "good" will calling all the destructors actually do? Unless the OS [or some component, such as a driver] is buggy, the OS should free anything that the application "holds" - there is no need to delete every single object.

    Note, I'm not saying this is how I would do things in my programs - in fact, at work, we have a heap-tracker that tracks every allocation, and REQUIRES that we free everything - this is tested in the test-code for each individual test-case.


