I want to know how to handle large input files that are around 500 MB in size.
The usual schoolbook approach of reading everything into memory won't work well,
and it sometimes fails outright when the library can't allocate that much.
How is this done in industry? What is the efficient way to do it?

Example:
Write a program that scans a text file of ~500 MB containing a sequence of numbers, sorts them (say, with quicksort), and writes the sorted result back to the same file. "Efficient" here means it shouldn't get too heavy on memory.
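The standard industry answer to this is an external merge sort: read the file in fixed-size chunks, sort each chunk in memory (quicksort, or whatever the language's sort uses), write each sorted chunk out as a temporary "run", then do a k-way merge of the runs. Memory use is bounded by the chunk size plus one buffered line per run, regardless of the file size. Here is a minimal Python sketch of that idea (`external_sort`, `_flush_run`, and the chunk size are my own illustrative names and defaults, not from any particular library):

```python
import heapq
import os
import tempfile

def _flush_run(chunk, run_paths):
    # Sort one in-memory chunk and write it out as a sorted "run" file.
    chunk.sort()
    fd, path = tempfile.mkstemp(suffix=".run")
    with os.fdopen(fd, "w") as f:
        f.writelines(f"{n}\n" for n in chunk)
    run_paths.append(path)

def external_sort(in_path, out_path, chunk_size=100_000):
    """Sort a text file of whitespace-separated integers using bounded
    memory: at most chunk_size numbers are held in RAM at once."""
    run_paths, chunk = [], []
    with open(in_path) as f:
        for line in f:                       # stream; never load the whole file
            for token in line.split():
                chunk.append(int(token))
                if len(chunk) >= chunk_size:
                    _flush_run(chunk, run_paths)
                    chunk = []
    if chunk:
        _flush_run(chunk, run_paths)

    # k-way merge of the sorted runs; heapq.merge keeps only one
    # pending value per run in memory at any time.
    files = [open(p) for p in run_paths]
    try:
        streams = [(int(line) for line in f) for f in files]
        with open(out_path, "w") as out:
            out.writelines(f"{n}\n" for n in heapq.merge(*streams))
    finally:
        for f in files:
            f.close()
        for p in run_paths:
            os.remove(p)
```

Because the input is fully read before the output is opened, `out_path` may be the same path as `in_path`, matching the "reads and writes to the same file" requirement.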
Can we force our program's paging according to some criteria we define?
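You can't dictate paging policy outright, but you can get close by memory-mapping the file: the OS then pages blocks in on demand and evicts them under memory pressure, so resident memory stays small even for huge files, and (on platforms that support it) `madvise` lets you hint the access pattern. A small sketch using Python's `mmap` module (the `count_newlines` helper is just an illustrative example):

```python
import mmap

def count_newlines(path):
    # Memory-map the file read-only: the kernel pages data in on demand
    # instead of the program reading the whole file into a buffer.
    with open(path, "rb") as f:
        with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as mm:
            count = 0
            pos = mm.find(b"\n")
            while pos != -1:
                count += 1
                pos = mm.find(b"\n", pos + 1)
            return count
```

On Python 3.8+ you could additionally call `mm.madvise(mmap.MADV_SEQUENTIAL)` on POSIX systems to hint that the scan is sequential, which is about as much control over paging as a user-space program normally gets.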
And one last thing: what if the file is not plain text but a compressed file with its own spec, where you can't get anything meaningful out of the middle of the file without first decompressing everything from the start up to that point?
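That sequential-only property is exactly how common stream compressors like gzip/DEFLATE behave: there is no index, so random access means decompressing from the beginning. The usual answer is to not seek at all and instead process the decompressed stream incrementally, which keeps memory flat no matter how large the decompressed data is. A hedged sketch using Python's `gzip` module (the generator name and the integers-per-line format are my assumptions):

```python
import gzip

def stream_numbers(path):
    # gzip is a sequential format: reaching offset N requires decompressing
    # everything before N. Iterating line by line over the decompressed
    # stream avoids ever materializing the full contents in memory.
    with gzip.open(path, "rt") as f:
        for line in f:
            for token in line.split():
                yield int(token)
```

If you genuinely need random access into compressed data, formats that compress in independently indexed blocks (e.g. bgzip, or zip archives with per-member entries) are normally used instead, at some cost in compression ratio.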