running out of memory (I think) - how do you use virtual memory?
I have a program that loads images (very large ones, about 25 MB apiece, roughly 180 MB total), gathers a bunch of information from them, then creates three new images (about 15 MB apiece) and writes them out to disk.
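To give a sense of the memory footprint, the allocation pattern is roughly this (a simplified sketch; the sizes match my data, but the names are made up and it's not my actual code):

    #include <vector>

    int main()
    {
        const size_t kSourceBytes = 25 * 1024 * 1024;  // ~25 MB per source image
        const size_t kOutputBytes = 15 * 1024 * 1024;  // ~15 MB per output image

        // About seven source images (~180 MB) held in memory at once
        std::vector<std::vector<unsigned char> > sources;
        for (int i = 0; i < 7; ++i)
            sources.push_back(std::vector<unsigned char>(kSourceBytes));

        // Three output buffers on top of that
        std::vector<std::vector<unsigned char> > outputs;
        for (int i = 0; i < 3; ++i)
            outputs.push_back(std::vector<unsigned char>(kOutputBytes));

        // ... gather information, fill the outputs, write them to disk ...
        return 0;
    }

So at peak the process is holding something like 225 MB of image data, plus whatever overhead the libraries add.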
As I said, it's designed to create three images. If I run it on machine A, it dies partway through; on machine B, it runs to completion and writes out all three. If I change the program to create only one image, it works on both machines.
Machine A has 512 MB of RAM; machine B has 1 GB. Because of that, my guess is that machine A simply doesn't have enough RAM, but shouldn't the program just start using virtual memory when it runs out of physical memory? Do I have to tell it to do that explicitly? I thought the OS took care of things like that. If I do have to do it explicitly, how would I go about it? (I'm fairly new to Windows programming; almost all my experience is on Linux/IRIX.)
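One thing I could try is checking what's actually available on each machine and whether a big allocation really fails. Something like this, I assume (a plain console sketch using GlobalMemoryStatusEx; in the real MFC program, operator new would throw CMemoryException* rather than std::bad_alloc, as far as I understand):

    #include <windows.h>
    #include <new>
    #include <cstdio>

    int main()
    {
        // Compare physical memory, page file, and address space on both machines
        MEMORYSTATUSEX status = {0};
        status.dwLength = sizeof(status);
        if (GlobalMemoryStatusEx(&status))
        {
            printf("physical:      %I64u MB free of %I64u MB\n",
                   status.ullAvailPhys / (1024 * 1024),
                   status.ullTotalPhys / (1024 * 1024));
            printf("page file:     %I64u MB free of %I64u MB\n",
                   status.ullAvailPageFile / (1024 * 1024),
                   status.ullTotalPageFile / (1024 * 1024));
            printf("address space: %I64u MB free of %I64u MB\n",
                   status.ullAvailVirtual / (1024 * 1024),
                   status.ullTotalVirtual / (1024 * 1024));
        }

        // See whether a single image-sized allocation actually fails
        try
        {
            char* p = new char[25 * 1024 * 1024];  // one ~25 MB buffer
            delete[] p;
            printf("25 MB allocation succeeded\n");
        }
        catch (std::bad_alloc&)
        {
            printf("25 MB allocation threw bad_alloc\n");
        }
        return 0;
    }

Would that tell me anything useful, or is there a better way to see where the memory is going?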
I'm using MSVC .NET on Win XP Pro SP1 (both machines). The program is MFC-based.