Copying large (possibly open) files kills my processor
I want to automate a simple backup of large binary files (copy changed files to my spare drive), but copying the larger binary backups (DB dumps) absolutely hammers my processor. It is quite obvious to me what is happening: the program has a job to do and uses all of the processor to do it.
What I want is a way to limit how much of my processor the copy process can utilise.
Does anyone have a way of initialising the copy on its own thread and then 'pushing' that thread to the background?
My current code looks like:
long filesize = sourceFile.GetLength();
UINT dwRead;
do {
    dwRead = sourceFile.Read(buffer, blocksize);
    // ... write the block to the destination file here ...
} while (dwRead > 0);
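Roughly what I'm imagining, as a portable sketch in standard C++ (the name `copyThrottled` and the pause length are my own inventions, not from any library): copy in fixed-size blocks and sleep between blocks so the copy never saturates a core. The same loop could run on its own `std::thread`, and on Windows that thread's priority could then be lowered with `SetThreadPriority`.

```cpp
#include <chrono>
#include <fstream>
#include <string>
#include <thread>
#include <vector>

// Hypothetical sketch: copy src to dst in fixed-size blocks,
// pausing between blocks to throttle CPU/disk usage.
void copyThrottled(const std::string& src, const std::string& dst,
                   std::size_t blocksize, std::chrono::milliseconds pause)
{
    std::ifstream in(src, std::ios::binary);
    std::ofstream out(dst, std::ios::binary);
    std::vector<char> buffer(blocksize);

    while (in) {
        in.read(buffer.data(), static_cast<std::streamsize>(buffer.size()));
        std::streamsize got = in.gcount();   // bytes actually read (last block may be short)
        if (got <= 0) break;
        out.write(buffer.data(), got);
        std::this_thread::sleep_for(pause);  // yield the CPU between blocks
    }
}
```

Would something along these lines, run on a background thread, be the right approach, or is there a cleaner way to cap the copy's processor usage?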