File I/O on a large data file
I'm not asking for code, but I'm currently stuck on one section of my work and need an algorithm or approach to get past it.
I need to read a file of records, which I've already done with the basic file I/O functions, but now I need to read at least 1.5 million records in order to sort the data. I'm unfamiliar with memory management, and 1.5 million records seems huge. I defined a struct to hold one record each and stored them in a linked list, but once the time to read, write, and sort is taken into account, it would take days to run. I also tried declaring array[1500000] to give that a go, but it ends in a core dump.
So I'm asking for an algorithm, pseudocode, or any ideas for overcoming this. Code would be welcome too, but I'm not asking for a full solution, since I know it's wrong to expect someone to do someone else's work for free.