Thread: Concerning delete/delete[] at program exit

  1. #16
    C++ Witch laserlight's Avatar
    Join Date
    Oct 2003
    Location
    Singapore
    Posts
    28,413
    I don't know who has taught Ian that, but I don't consider it proper practice!
Has any expert, say, at work, considered it otherwise? Note that the general response in the thread I linked to considered it proper practice.

    And what if this code is going to be ported to an OS which does not do this?
    Since this is a forum that teaches portability, I would further argue that you clean up everything you use.
    That we are usually concerned with portability here does not matter. Consider that I might use the arguments in this thread and start a thread in the FLTK mailing list. I certainly cannot say "since one of FLTK's goals is portability" since they would point out that "FLTK is a cross-platform C++ GUI toolkit for UNIX/Linux (X11), Microsoft Windows, and MacOS X".

No excuses such as "the OS does it for me." That sounds like a VB programmer who does not declare variables before using them, or just uses variants for everything.
    First, you have to establish that it is an excuse. Some of these FLTK users are saying that it is deliberate as a kind of optimisation.

    Not calling destructors is very bad practice.
    How is not calling destructors very bad practice if they have no side effect other than to free memory that would otherwise be freed by the OS just afterwards?

    Calling the destructors but not freeing the actual objects is also bad programming practice since these objects will be left in a zombie state.
    How does this matter if the program has exited?
    Quote Originally Posted by Bjarne Stroustrup (2000-10-14)
    I get maybe two dozen requests for help with some sort of programming or design problem every day. Most have more sense than to send me hundreds of lines of code. If they do, I ask them to find the smallest example that exhibits the problem and send me that. Mostly, they then find the error themselves. "Finding the smallest program that demonstrates the error" is a powerful debugging tool.
    Look up a C++ Reference and learn How To Ask Questions The Smart Way

  2. #17
    C++まいる!Cをこわせ!
    Join Date
    Oct 2007
    Location
    Inside my computer
    Posts
    24,654
    Quote Originally Posted by laserlight View Post
Has any expert, say, at work, considered it otherwise? Note that the general response in the thread I linked to considered it proper practice.
Yes, they do seem to find it proper practice, and they're entitled to believe what they want, as am I, and I don't find it proper practice, at least not until someone can point to a real source that specifically says it's good programming practice.

    That we are usually concerned with portability here does not matter. Consider that I might use the arguments in this thread and start a thread in the FLTK mailing list. I certainly cannot say "since one of FLTK's goals is portability" since they would point out that "FLTK is a cross-platform C++ GUI toolkit for UNIX/Linux (X11), Microsoft Windows, and MacOS X".
    They obviously do not listen to anyone who says they should explicitly call delete, and frankly, it's their project, so they're entitled to do what they want and not listen to anyone else.
To tell the truth, I'm sick of their attitude when I read about them not explicitly calling delete and calling it good practice.
This may just be me, but I'll definitely stay away from both that forum and their project. Not that it does me any good or them any harm, but I don't like it and thus I won't use it. Simple.

    First, you have to establish that it is an excuse. Some of these FLTK users are saying that it is deliberate as a kind of optimisation.
It's an optimization only if the world were made of ice.
This reflects my own opinion, not theirs.

    How is not calling destructors very bad practice if they have no side effect other than to free memory that would otherwise be freed by the OS just afterwards?
Why do we have destructors?
Who says destructors are limited to freeing resources?
They can do so much more, such as flushing some memory to a file, for example.
So if the destructor isn't called... the file data is never written.
I find this a valid scenario, since file writing is slow and thus I'd like to keep all data in memory and flush it only when necessary, as the code might be time-critical.
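    That buffered-file scenario can be sketched in a few lines. This is a hypothetical example (the class and all names are invented for illustration): a string stands in for the file on disk, writes are buffered in memory, and the destructor is the only place the buffer is flushed.

    ```cpp
    #include <cassert>
    #include <string>

    std::string disk;   // stands in for a slow file on disk in this sketch

    // Invented class: buffers log lines in memory (fast) and flushes them
    // to "disk" only in the destructor (slow, so done once).
    class BufferedLog {
    public:
        void write(const std::string& line) { buffer_ += line + '\n'; }
        ~BufferedLog() { disk += buffer_; }   // the only place data reaches disk
    private:
        std::string buffer_;
    };

    int main() {
        BufferedLog* kept = new BufferedLog;
        kept->write("saved record");
        delete kept;                  // destructor runs: the record is flushed

        BufferedLog* leaked = new BufferedLog;
        leaked->write("lost record");
        // No delete: at exit the OS reclaims the memory, but the destructor
        // never runs, so "lost record" never reaches disk.

        assert(disk.find("saved record") != std::string::npos);
        assert(disk.find("lost record") == std::string::npos);
    }
    ```

    If the object is leaked at exit, the OS reclaims the buffer's memory, but nothing ever writes it out; the data is simply gone.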

Oh right, and what if the class is actually derived and work is performed in those destructors? Uh-oh...

The point is, instead of risking bugs and causing headaches for those who want to use and derive from that code, why not just save everyone the trouble and explicitly call delete? No worrying about bugs from destructors never being called!
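    The inheritance worry can be made concrete. A minimal sketch with invented class names: because the base destructor is virtual, delete through a base pointer runs the derived destructor too, while a leaked object runs neither.

    ```cpp
    #include <cassert>

    bool base_cleaned = false;
    bool derived_cleaned = false;

    // Invented names: a framework-style base class and a user's derived
    // class that performs extra teardown work in its destructor.
    struct Resource {
        virtual ~Resource() { base_cleaned = true; }  // virtual: derived cleanup runs too
    };

    struct LoggingResource : Resource {
        ~LoggingResource() override { derived_cleaned = true; }  // e.g. flush a log
    };

    int main() {
        Resource* r = new LoggingResource;
        delete r;                                // runs ~LoggingResource, then ~Resource
        assert(base_cleaned && derived_cleaned);

        Resource* leaked = new LoggingResource;
        (void)leaked;   // never deleted: neither destructor runs, so the derived
                        // class's teardown work is silently skipped at exit
    }
    ```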

    How does this matter if the program has exited?
The program does not exit when a destructor is called; it exits AFTER the destructor is called. And after that destructor, more code is possibly executed, and perhaps this "zombie state" object gets used somehow.
It's a possibility.

And by the way, I have an analogy for this whole OS clean-up argument. It may not be any good, but it sure annoys me.
If your job is to cut grass and you actually have to pick up all the litter people have thrown on that grass because they didn't do something as simple as toss it into a recycling bin, wouldn't you be annoyed too, especially since it isn't really part of your "job"?
I sure would be.
And that is why everyone should toss their trash into the recycling bin and not leave it for someone else to clean up. In other words, a program should delete any resources it acquires and not leave them to the OS to clean up.
    Last edited by Elysia; 01-03-2008 at 08:49 AM.
    Quote Originally Posted by Adak View Post
    io.h certainly IS included in some modern compilers. It is no longer part of the standard for C, but it is nevertheless, included in the very latest Pelles C versions.
    Quote Originally Posted by Salem View Post
    You mean it's included as a crutch to help ancient programmers limp along without them having to relearn too much.

    Outside of your DOS world, your header file is meaningless.

  3. #18
    The larch
    Join Date
    May 2006
    Posts
    3,573
Who says destructors are limited to freeing resources?
They can do so much more, such as flushing some memory to a file, for example.
So if the destructor isn't called... the file data is never written.
    If the destructor does something you need, surely you will delete that object.

The program does not exit when a destructor is called; it exits AFTER the destructor is called. And after that destructor, more code is possibly executed, and perhaps this "zombie state" object gets used somehow.
It's a possibility.
    So if freeing memory may create zombie objects, wouldn't it be better not to delete them?

If your job is to cut grass and you actually have to pick up all the litter people have thrown on that grass because they didn't do something as simple as toss it into a recycling bin, wouldn't you be annoyed too, especially since it isn't really part of your "job"?
    And if you know that a machine will come and clean away the grass after you have left?

    And again, I don't think anyone suggests optimizing away necessary operations.
    I might be wrong.

    Thank you, anon. You sure know how to recognize different types of trees from quite a long way away.
    Quoted more than 1000 times (I hope).

  4. #19
    C++まいる!Cをこわせ!
    Join Date
    Oct 2007
    Location
    Inside my computer
    Posts
    24,654
    Quote Originally Posted by anon View Post
    If the destructor does something you need, surely you will delete that object.

    So if freeing memory may create zombie objects, wouldn't it be better not to delete them?
That was my point: delete everything so you avoid subtle but nasty bugs.
Good programming practice.
And if the authors didn't delete the object because they didn't need to, what happens when someone else derives from that object without realizing the framework never deletes the base object? Nasty bug right there.

    And if you know that a machine will come and clean away the grass after you have left?
    And again, I don't think anyone suggests optimizing away necessary operations.
I know it will, but that is exactly why I always like to think that you are responsible for cleaning up that which you create or acquire.

  5. #20
    and the hat of sweating
    Join Date
    Aug 2007
    Location
    Toronto, ON
    Posts
    3,545
    I agree with Elysia.
Pointers aren't the only thing that destructors clean up. I've seen several WinAPI functions with big NOTES about calling the corresponding cleanup function when you're done to avoid bad things happening. I don't remember which functions they were, but they're probably not that hard to find.

    Also, even if a destructor only deletes memory today, tomorrow it might be updated to do other types of cleanup that aren't handled automatically by the OS when the program exits.
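    That maintenance hazard can be sketched as follows. Assume a hypothetical Session class whose destructor originally only freed memory and later grows a second responsibility; here a counter stands in for cleanup the OS will not do for you at process exit (e.g. removing a temporary file). All names are invented.

    ```cpp
    #include <cassert>

    // Counter standing in for a resource the OS does NOT reclaim the way
    // it reclaims heap memory (e.g. a temp file left on disk).
    int open_handles = 0;

    class Session {
    public:
        Session()  { buf_ = new char[256]; ++open_handles; }
        // Originally this destructor only freed memory; a later version also
        // releases the handle. Callers who skipped delete now leak the handle.
        ~Session() { delete[] buf_; --open_handles; }
    private:
        char* buf_;
    };

    int main() {
        Session* a = new Session;
        delete a;                    // disciplined caller: handle released
        assert(open_handles == 0);

        Session* b = new Session;
        (void)b;                     // caller who "let the OS clean up":
        assert(open_handles == 1);   // the destructor's new side effect never runs
    }
    ```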

  6. #21
    Registered User
    Join Date
    Apr 2006
    Posts
    2,149
I think the FLTK folks have a good point. Not freeing memory is an optimization. As an optimization, it should not be made prematurely: the coder needs to be sure that the destructor calls cause a potential slowdown and that there is no chance of a leak. But like all trade-offs, there's a time and place for either option.

Their attitude that not deleting is the default practice, however, seems questionable. If they are making an optimization, they should be conscious of it.
    It is too clear and so it is hard to see.
    A dunce once searched for fire with a lighted lantern.
    Had he known what fire was,
    He could have cooked his rice much sooner.

  7. #22
    Algorithm Dissector iMalc's Avatar
    Join Date
    Dec 2005
    Location
    New Zealand
    Posts
    6,318
    I had no idea this thread was going to be concerned with abnormal program termination.
    Sure, if something awful happens, I'm fine with a process terminating itself abruptly. This should not be the norm though.

If a process does take a while to run all its destructors when closing, then I'd say some profiling is in order. In all likelihood there will be a destructor for one type of object that takes extraordinarily long, and optimising it would speed the program up a lot in general. I don't think pure deallocation would ever take very long at all. Even if it did, it would have to be due to a massive number of small allocations, which should be going through a pool allocator anyway for a big general speed improvement.

I know Visual Studio often takes some time to close, but I think the IDE is poorly optimised and does stuff at shutdown that probably shouldn't be done.

    I suppose the memory freeing could be a release-build-only optimisation. The main thing is to catch all leaks in a debug build.
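    That release-build-only idea could look something like this sketch, assuming an invented global registry of allocations: debug builds free everything so leak detectors stay meaningful, while release builds (where NDEBUG is defined) skip the per-object deallocation.

    ```cpp
    #include <vector>

    // Sketch of a release-only shutdown optimisation; the registry and
    // names are invented for illustration.
    struct Node { int value; };

    std::vector<Node*> pool;   // every allocation is recorded here

    Node* make_node(int v) {
        Node* n = new Node{v};
        pool.push_back(n);
        return n;
    }

    void shutdown() {
    #ifndef NDEBUG
        // Debug build: free everything so leak checkers (Valgrind, the CRT
        // debug heap, ...) report a clean run and real leaks stay visible.
        for (Node* n : pool) delete n;
    #endif
        // Release build (NDEBUG defined): skip the per-object deletes and
        // let the OS reclaim the pages at exit. Only valid when no
        // destructor has side effects beyond freeing memory.
        pool.clear();
    }

    int main() {
        make_node(1);
        make_node(2);
        shutdown();
    }
    ```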
    My homepage
    Advice: Take only as directed - If symptoms persist, please see your debugger

    Linus Torvalds: "But it clearly is the only right way. The fact that everybody else does it some other way only means that they are wrong"

  8. #23
    C++まいる!Cをこわせ!
    Join Date
    Oct 2007
    Location
    Inside my computer
    Posts
    24,654
Visual Studio is .NET, which probably explains why it takes so long to close: the framework, bloated as it is, has to clean up and shut down.

  9. #24
    Officially An Architect brewbuck's Avatar
    Join Date
    Mar 2007
    Location
    Portland, OR
    Posts
    7,396
    Quote Originally Posted by Elysia View Post
    If the OS was designed that way, yes. But you can't be sure of that.
    An OS that doesn't clean up memory from terminated processes is a broken OS. The story is different, of course, on embedded systems, where the line between OS and application is blurred.

    (Having said that, I stand on the "always delete everything" side of the fence.)

  10. #25
    Registered User
    Join Date
    Oct 2001
    Posts
    2,129
I agree with Elysia, cpjust, and iMalc, except for the release-optimization thing. I expect a well-written program to clean up after itself.

I see the OS cleaning up after programs finish running as something of a patch. It was introduced because buggy programs didn't release all their memory and/or resources, and it was necessary for a stable OS platform. I don't think a patch should be relied on.

As for the destructors not being called, that's just stupid. Some of the reasons have been explained above. (Inheritance: why rely on inheritance not being used when the code allows it?)

As for not calling delete in the examples, that's stupid because the examples are meant to show how to use the code in a larger program, not how to make an optimized example program.

    If they do insist on doing that, then I would suggest putting it in compiler/OS specific code, and offer code that uses delete as well. I would prefer it if the code that used the delete were the default.

    Like Elysia, I don't think I'll be using it in my real world programs if it doesn't clean up after itself, or won't do so after some minor inheritance.

  11. #26
    C++まいる!Cをこわせ!
    Join Date
    Oct 2007
    Location
    Inside my computer
    Posts
    24,654
    Quote Originally Posted by brewbuck View Post
    An OS that doesn't clean up memory from terminated processes is a broken OS. The story is different, of course, on embedded systems, where the line between OS and application is blurred.
Indeed! But perhaps it is an embedded system, or something else that just won't clean up. It's better to explicitly release everything so you can be sure you aren't creating bugs.

  12. #27
    Officially An Architect brewbuck's Avatar
    Join Date
    Mar 2007
    Location
    Portland, OR
    Posts
    7,396
    Quote Originally Posted by robwhit View Post
    I see the OS cleaning up after programs are finished running is something of a patch. It was made because buggy programs didn't release all their memory and/or resources, and it was necessary to make a stable OS platform. I don't think a patch should be relied on.
    The stability of the OS should not depend on the behavior of user applications. That's the entire point of having an operating system. It's not a patch, it's a fundamental step forward.

  13. #28
    Registered User
    Join Date
    Apr 2006
    Posts
    2,149
    Quote Originally Posted by Elysia View Post
Indeed! But perhaps it is an embedded system, or something else that just won't clean up. It's better to explicitly release everything so you can be sure you aren't creating bugs.
    Yes, but if you're writing something that is not to be used on an embedded system and have determined that your program does not leak, would it not be safe to optimize program termination? Many programs I use take a few seconds to shut down, and I do not see that as a plus.

The optimization especially makes sense in an environment where you want the application to be invulnerable to a power outage or OS crash.

Of course, a big downside to sudden termination is that having exit(1) buried deep within your code makes the code hard to expand later to deal with those errors in other ways (such as by writing to a log), particularly if it is somebody else expanding your code.
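    This downside ties back to destructors as well: std::exit terminates without unwinding the stack, so destructors of automatic objects are skipped, whereas throwing an exception unwinds the stack and runs them, letting a caller log or recover. A small sketch with invented names:

    ```cpp
    #include <cassert>
    #include <stdexcept>

    bool log_flushed = false;

    // Invented name: a guard whose destructor does real work at scope exit.
    struct LogGuard {
        ~LogGuard() { log_flushed = true; }  // e.g. flush buffered log lines
    };

    void deep_work(bool fatal) {
        LogGuard guard;
        if (fatal)
            throw std::runtime_error("fatal");  // unwinds the stack: ~LogGuard runs.
            // A std::exit(1) here would terminate WITHOUT unwinding, skipping it.
    }

    int main() {
        try {
            deep_work(true);
        } catch (const std::exception&) {
            // The caller can log, retry, or rethrow; cleanup already happened.
        }
        assert(log_flushed);
    }
    ```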
    Last edited by King Mir; 01-04-2008 at 11:37 PM.

  14. #29
    C++まいる!Cをこわせ!
    Join Date
    Oct 2007
    Location
    Inside my computer
    Posts
    24,654
    Quote Originally Posted by brewbuck View Post
    The stability of the OS should not depend on the behavior of user applications. That's the entire point of having an operating system. It's not a patch, it's a fundamental step forward.
It's a patch! To keep the OS stable even if an app crashes or does weird things.
Of course the OS needs to do that, or we'd be back in the Windows 95 days.

    Quote Originally Posted by King Mir View Post
    Yes, but if you're writing something that is not to be used on an embedded system and have determined that your program does not leak, would it not be safe to optimize program termination? Many programs I use take a few seconds to shut down, and I do not see that as a plus.
    No, I disagree. Even if programs take time to close down, they can do so in the background.

The optimization especially makes sense in an environment where you want the application to be invulnerable to a power outage or OS crash.
    How so?
I don't see how it matters whether the OS or the app cleans up memory in case of a crash; it's the same.
(And here's another side argument for a later time: a lot of applications save settings only upon exit, making you lose everything if they crash, which I find very annoying; I, on the other hand, save settings as soon as they're made!)

Of course, a big downside to sudden termination is that having exit(1) buried deep within your code makes the code hard to expand later to deal with those errors in other ways (such as by writing to a log), particularly if it is somebody else expanding your code.
And those are big downsides. There is no gain in the "optimization" of letting the OS do the work (the app is exiting, so the user doesn't have to interact with it anymore!), and it's more error-prone too.
And hey... what if the program DOES require interaction when it quits? Then so much for letting the OS patch things up instead of the program.

  15. #30
    Officially An Architect brewbuck's Avatar
    Join Date
    Mar 2007
    Location
    Portland, OR
    Posts
    7,396
    Quote Originally Posted by Elysia View Post
It's a patch! To keep the OS stable even if an app crashes or does weird things.
Of course the OS needs to do that, or we'd be back in the Windows 95 days.
    This opinion is frankly ridiculous. There are two sides to this question:

    1) Should the OS reclaim user program memory when programs terminate? YES.
    2) Should the user program itself rely on this behavior? NO.

    The answer to question 2 is not "no" because we aren't sure the OS will reclaim the memory -- of course it will. The answer is "no" because it's good design. Those objects presumably have destructors, and those destructors should run.
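    In today's C++, side 2 of the question is usually answered mechanically rather than by discipline: a smart pointer such as std::unique_ptr deletes its object when it goes out of scope, so the destructor runs without anyone remembering to call delete. A minimal sketch:

    ```cpp
    #include <cassert>
    #include <memory>

    bool destroyed = false;
    struct Widget {
        ~Widget() { destroyed = true; }
    };

    int main() {
        {
            // unique_ptr owns the Widget; no explicit delete anywhere.
            std::unique_ptr<Widget> w = std::make_unique<Widget>();
        }   // ~Widget runs here automatically when w goes out of scope
        assert(destroyed);
    }
    ```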

    Calling the difference between DOS (a system without memory protection) and any modern OS a "patch" is freaking nuts. The only reason DOS itself didn't have memory protection is because the home computer hardware at the time didn't support it. This idea was not invented in 1995.
