I've been getting into memory allocation lately. I wrote a couple of my own (basic) memory allocators and really enjoyed it.
One topic that interests me is handling large files. Many text/code editors, and plenty of other apps, have lackluster memory management: opening a 500MB text file, for example, can make an editor hang or crash outright. Outside of editors, Slack recently rewrote their chat app to use FIFTY PERCENT LESS MEMORY with the exact same feature set. On the other hand, some editors, such as 010 Editor, are designed specifically for large files.
My question is: what considerations does a C programmer weigh when designing a program to handle large file input? Say I wanted to make a text editor that could open files up to 10GB with a smooth, responsive UI and a working search feature. What would I need to do differently versus designing an editor that only has to support 1-5MB files?
My first thought for large files is using mmap(), but I assume the buffer size, chunk size, and so on heavily influence performance when moving large amounts of data. Also, malloc() itself falls back to mmap() for large allocations anyway, so I wonder whether plain malloc() would suffice, since on 64-bit systems it can technically reserve an enormous amount of virtual address space (on the order of exabytes) anyway.