Handling Large Amounts of Data
I was hoping to get some people's opinions on something. I am not new to C++/C programming, but by no means am I an expert; I am a physics student who uses it in his research.

I have an extremely large data set which I want to visualize. The graphics portion of the code is done; I just need a way to handle reading all the data into the program. The data is organized into 100 different files (each file essentially represents a frame), and each file contains 1000 rows and 11 columns.

Reading this data in, normalizing it, and displaying it file by file is not working (maybe I am just implementing it inefficiently), and reading it all in at the beginning seems counterproductive and slow. Does anyone know of an efficient way to handle this much information within a program?
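For reference, here is a minimal sketch of the kind of per-file loading I mean. The `frameN.dat` naming and the whitespace-separated layout are assumptions for illustration, not my actual format:

```cpp
#include <fstream>
#include <sstream>
#include <string>
#include <vector>

// Hypothetical file naming scheme -- adjust to the real one.
std::string frameFilename(int i) {
    std::ostringstream name;
    name << "frame" << i << ".dat";
    return name.str();
}

// Load one whitespace-separated file of rows*cols doubles into `out`
// (row-major). Returns false on an I/O error or a size mismatch.
bool loadFrame(const std::string& path, std::size_t rows, std::size_t cols,
               std::vector<double>& out) {
    std::ifstream in(path.c_str());
    if (!in) return false;
    out.clear();
    out.reserve(rows * cols); // avoid reallocations while reading
    double v;
    while (in >> v) out.push_back(v);
    return out.size() == rows * cols;
}
```

Each frame would then be loaded with something like `loadFrame(frameFilename(42), 1000, 11, data)` just before it is normalized and drawn.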