> I've malloced gigabyte size buffers many times with no problems
If you wanted this data in memory for a long time, then that would be fine.
But copying a 1GB file by allocating one large block of memory is basically a big waste of resources (and possibly even slower than you imagine).
For one thing, few operating systems will commit 1GB of REAL memory to a process which has written to each page only once. Which means a large chunk of that memory you just loaded with file data is going straight back out to the swap file.
Then it comes back in from the swap file and out to the place where you wanted to save it.
It's also prone to random failure. One day you might be able to allocate enough space for a particular file. But maybe tomorrow, there is an extra process on the OS, or the file is just a little bit larger, and your attempt to allocate one stupidly large buffer returns NULL instead.
What's your error message to the user at this point?
"Sorry, unable to copy file due to ..."
Try this
Code:
unsigned char buf[BUFSIZ];
size_t n;
while ( (n = fread(buf, 1, sizeof(buf), in)) > 0 ) {
    size_t r = fwrite( buf, 1, n, out );
    if ( r != n ) {
        // some error message, probably no space left
        // use ferror
        break;
    }
}
No malloc/free issues, no swap file thrashing, works for files of any size.
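Wrapped up as a complete function, with the error checks spelled out (a sketch -- `copy_file` and the return convention are my own choices, not anything standard):

```c
#include <stdio.h>

/* Copy src to dst using a small fixed-size stack buffer.
   Returns 0 on success, nonzero on failure. */
int copy_file(const char *src, const char *dst)
{
    FILE *in = fopen(src, "rb");
    if (in == NULL) return 1;

    FILE *out = fopen(dst, "wb");
    if (out == NULL) { fclose(in); return 1; }

    unsigned char buf[BUFSIZ];
    size_t n;
    int status = 0;
    while ((n = fread(buf, 1, sizeof(buf), in)) > 0) {
        if (fwrite(buf, 1, n, out) != n) {
            status = 1;             /* probably no space left; check ferror(out) */
            break;
        }
    }
    if (ferror(in)) status = 1;     /* a read error, not just EOF */

    fclose(in);
    if (fclose(out) != 0) status = 1;  /* final flush can fail too */
    return status;
}
```

Note the `fclose(out)` check at the end -- the last buffered write happens there, so "disk full" can show up at close time as well.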
And don't play that "It's only a small MP3 file" crap, because you know some doofus will just look at that code and think "Hey, it worked for MP3, what about AVI files?".