Optimizing large file reading
Ok, so I have created a program that takes user input, checks some data files, and does stuff.
My only problem is that my data file is huge: it's nearly a 1 MB text file!
So I split it up into multiple files, so the program doesn't have to check every part of every file. Good.
But some of them are still pretty large, and the way the program works it might have to check the files multiple times. All of that works fine… but it takes so long.
I am using a standard ifstream. But every time it gets a new line to check, it is actually accessing the file. Is there a way to put the entire file into memory and then get each line from memory, as I imagine that would be way, way faster?