Pointers/memory issues

This is a discussion on Pointers/memory issues within the C++ Programming forums, part of the General Programming Boards category.

  1. #1
    Registered User
    Join Date
    Oct 2005
    Posts
    271

    Pointers/memory issues

    Hi all,

I'm about to undertake a fairly large research project, and I'll be using a cluster of shared Linux machines that are rebooted maybe once a year. I do not have superuser privileges on these machines. My previous work in C++ on these machines did not require that I create my own classes or use pointers to them. This time, however, I will have to do that. While I would like to be able to make a flawless program on the first go, I know this is not going to happen, and even if my programs compile without a hitch, I might get careless somewhere and forget to properly destroy allocated resources.

    So, given I can never ever reboot these machines and thus obviate any worries about my program hogging up resources even after it's terminated, what would be a worst case scenario in such a situation? Take into consideration that I might run it again and again and again. Also, are there any preventive measures I can take (e.g. run it only inside a debugger?) or any open source tools designed to prevent this kind of mishap?

  2. #2
    C++まいる!Cをこわせ!
    Join Date
    Oct 2007
    Posts
    23,027
Run inside a debugger, e.g. your IDE. When the IDE closes, all that memory will usually be freed.
    And take care to use some memory management, like shared_ptr with reference counting, to properly destroy what your pointers own.

  3. #3
    Registered User
    Join Date
    Oct 2005
    Posts
    271
    Yes, I'm guessing gdb will keep the memory safe. But when I get going and start to run it on my full data set, I'm going to need it to be fully optimized (the g++ -O4 option), and I won't want to run that in a debugger environment.
As for shared_ptr, it's a Boost class, and I can't install Boost since I don't have root privileges. So the alternative would be auto_ptr, but when I investigated it, it seems tricky to create arrays of custom classes with it (auto_ptr doesn't work with STL containers), so I'm kind of stuck here.

  4. #4
    C++まいる!Cをこわせ!
    Join Date
    Oct 2007
    Posts
    23,027
I can lend you my own memory manager if need be. I wrote one myself; it imitates a reference-counted shared_ptr.
    Even when the app is release-worthy, you will still run it inside the debugger until you deem it has no bugs severe enough to prevent a release.
    But even so, when a program exits (or crashes), all its resources are freed, so there's really no worry.

  5. #5
    Registered User
    Join Date
    Jan 2005
    Posts
    7,344
    I believe you can download and "install" boost without actually installing anything. You just #include the headers from it.

    No offense intended to Elysia's implementation, but I would see if I could use the almost-standardized smart pointer over a home-grown one.

    Also, pass by value a lot, since you won't have to worry about memory leaks in that case.

    There are programs to help identify memory leaks (Valgrind maybe?) but you might have to install them.

  6. #6
    C++まいる!Cをこわせ!
    Join Date
    Oct 2007
    Posts
    23,027
    Quote Originally Posted by Daved View Post
    No offense intended to Elysia's implementation, but I would see if I could use the almost-standardized smart pointer over a home-grown one.
Yes indeed, I mentioned it just in case

  7. #7
    Registered User
    Join Date
    Oct 2005
    Posts
    271
    Daved: Thanks for the pointer (no pun intended) about boost

    Elysia: Thanks for offering to share your code.

    But my real question (I really am curious): what's the worst thing that can happen if I get careless about memory management and just run my programs on and on and on without rebooting the machine? Will I be killing the machine for my lab mates as well?

  8. #8
    C++まいる!Cをこわせ!
    Join Date
    Oct 2007
    Posts
    23,027
I doubt it will go that far, but if you notice memory getting low, try closing your debugger or shutting down your app and restarting it. That will free all the memory they've been hogging.
    I tend to run my machine without reboots (since I use hibernation), and I never encounter such problems. Granted, I have a lot of RAM (3 GB), but it worked fine before, even when I had less.

  9. #9
    Registered User
    Join Date
    Apr 2006
    Posts
    2,053
    Quote Originally Posted by cunnus88 View Post
    what's the worst thing that can happen if I get careless about memory management and just run my programs on and on and on without rebooting the machine? Will I be killing the machine for my lab mates as well?
Assuming the OS has proper access checks on your program to make sure it doesn't try to access memory that it shouldn't, and assuming that it properly releases the memory when the program closes, there will be no lasting effects from memory problems, besides the code not working.

    That is to say, a good secure OS should prevent code with regular permissions from overstepping its bounds.
    Last edited by King Mir; 10-25-2007 at 08:34 PM.
    It is too clear and so it is hard to see.
    A dunce once searched for fire with a lighted lantern.
    Had he known what fire was,
    He could have cooked his rice much sooner.

  10. #10
    Registered User
    Join Date
    Oct 2005
    Posts
    271
    Quote Originally Posted by King Mir View Post
    Assuming the OS has proper access checks on your program to make sure it doesn't try to access memory that it shouldn't, and assuming that it properly releases the memory when the program closes, there will be no lasting effects of a memory problems, besides the code not working.
Uh oh, are you saying that, in practice, I can be as careless as I want with memory management?

  11. #11
    Registered User
    Join Date
    Apr 2006
    Posts
    2,053
    Quote Originally Posted by cunnus88 View Post
    Uh oh, are you saying, that in practice, I can be as careless as I want with memory management?
    Without fear of harming your computer, yes.

    Now getting your program to do what you want it to do is another story.

  12. #12
    Kernel hacker
    Join Date
    Jul 2007
    Location
    Farncombe, Surrey, England
    Posts
    15,677
Certainly, a Linux app that exits will have all its memory released.

As long as your process doesn't use so much memory that others start to "notice", you can actually get away with never freeing your memory at all. That's not a good plan for a program that allocates a large amount of memory, or for a long-running server process, and it's obviously not "good style" to leak, but the OS reclaims everything at exit.

    --
    Mats
    Compilers can produce warnings - make the compiler programmers happy: Use them!
    Please don't PM me for help - and no, I don't do help over instant messengers.

  13. #13
    Malum in se abachler's Avatar
    Join Date
    Apr 2007
    Posts
    3,189
Ick, even a tiny resource leak will eventually crash your program. I'm currently debugging an application that is leaking handles. It only leaks a few per second, but that's enough to make it stop functioning after about 15 minutes. With memory, you might get away with it longer, but eventually it will crash. Incidentally, it doesn't matter how much physical RAM you have, as the program will function right up until it hits the per-process limit. The leaked memory, while still allocated, will almost certainly be paged out. The performance hit comes if the memory is highly fragmented, such that large portions of it can't be paged out because they share pages with in-use blocks.
    Until you can build a working general purpose reprogrammable computer out of basic components from radio shack, you are not fit to call yourself a programmer in my presence. This is cwhizard, signing off.

  14. #14
    Registered User
    Join Date
    Oct 2005
    Posts
    271
    Thanks guys. For the moment, it's enough for me to know that (almost) no matter what I do, I won't be maiming the machine for good. My program is not supposed to be a perpetual process, so I can kill it if things get iffy.

  15. #15
    Registered User
    Join Date
    Oct 2005
    Posts
    271
    I had a sudden, "Huh? Come to think of it..." moment.

    If memory is supposed to be completely freed after a program terminates, no matter how badly written it was, what's with all the memory leaks in Windows programs? I mean, you start windows, look at memory consumption, about 80 megs being used, all nice and fine, then you run a few programs, kill all of them, and you check your memory consumption, and it's hovering around 130 megs. Then what's using that 50 megs (or 100 or whatever) when there are no active programs (other than background services)?

