-
Large Binary File Copy
I am setting up a dll that automatically creates a date/time-stamped copy of my Database Backup before I overwrite it with a new backup. As the DB backup has got bigger, the copy process has started to monopolise my processor for the ten minutes or so it takes to run (the backup is now around 2 GB).
Is there a way I can change the code below to run as a 'background' process, or a way to restrict the amount of PC resource it can grab?
I am currently using a very simple:
Code:
do
{
    dwRead = sourceFile.Read(buffer, blocksize);
    if (dwRead > 0)
        destFile.Write(buffer, dwRead);
}
while (dwRead > 0);
-
You could try copying the file in contiguous, but separate, blocks with a small sleep between each block copy. I would recommend a block size of around 32 KB, but you may need to experiment with this.
Your whole copy process will then take considerably longer than 10 mins, but it shouldn't hammer your CPU as much.
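Something along these lines, say. This is just a sketch using standard C++ streams rather than whatever file classes your dll is using, and the 32 KB / 10 ms figures are starting points to tune, not magic numbers:

```cpp
#include <chrono>
#include <fstream>
#include <thread>
#include <vector>

// Copy 'src' to 'dst' in fixed-size blocks, sleeping briefly between
// blocks so the copy yields CPU and disk time to other work.
// blockSize and sleepMs are the tuning knobs discussed above.
void throttledCopy(const char* src, const char* dst,
                   std::size_t blockSize = 32 * 1024,
                   unsigned sleepMs = 10)
{
    std::ifstream in(src, std::ios::binary);
    std::ofstream out(dst, std::ios::binary);
    std::vector<char> buffer(blockSize);

    // read() may return a short count on the final block; gcount()
    // tells us how many bytes were actually read.
    while (in.read(buffer.data(), buffer.size()) || in.gcount() > 0)
    {
        out.write(buffer.data(), in.gcount());
        std::this_thread::sleep_for(std::chrono::milliseconds(sleepMs));
    }
}
```

With a 2 GB file and 32 KB blocks that is roughly 65,000 sleeps, so even a 10 ms pause adds about ten minutes of wall-clock time on its own. That is the trade-off: longer elapsed time, much lower sustained load.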
-
Thanks
Thanks Davos. I originally had a sleep but thought that was 'lazy' coding - I'll take another look at it now.
Is there any good reason for using a particular block size? You suggest 32k. Is there a reason for that size (or is it just experience)?
-
If you are reading the contents of a file into memory in order to write it to another file, a block copy is advantageous because you do not load the whole contents into memory at once. If you tried to hold the whole 2 GB file in memory, your OS would start using virtual memory (and paging things out to the swap file itself).
Your OS will be implementing its own read-ahead block operations, so it could be beneficial to choose your block size to match the one the OS uses. Normally this is around 64 KB on most operating systems, but I would experiment a bit.
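If you want to experiment systematically, you could time one un-throttled copy pass per candidate block size and compare. A rough sketch (standard C++, with placeholder file names - substitute your own backup paths):

```cpp
#include <chrono>
#include <cstdio>
#include <fstream>
#include <vector>

// Time a single block-copy pass at the given block size and return
// the elapsed time in seconds. Run this over a few candidate sizes
// (16 KB, 32 KB, 64 KB, ...) to see what suits your OS and disk.
double timeCopy(const char* src, const char* dst, std::size_t blockSize)
{
    auto start = std::chrono::steady_clock::now();

    std::ifstream in(src, std::ios::binary);
    std::ofstream out(dst, std::ios::binary);
    std::vector<char> buffer(blockSize);
    while (in.read(buffer.data(), buffer.size()) || in.gcount() > 0)
        out.write(buffer.data(), in.gcount());

    std::chrono::duration<double> elapsed =
        std::chrono::steady_clock::now() - start;
    return elapsed.count();
}
```

Then something like `for (std::size_t kb : {16, 32, 64, 128}) std::printf("%zu KB: %.2f s\n", kb, timeCopy("backup.bak", "copy.bak", kb * 1024));` gives you a quick comparison. Bear in mind the OS file cache will make the second and later runs faster, so run each size a couple of times and compare like with like.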