Well, in conclusion: I tried a buffered version, and over the network it is effectively the same speed, maybe a little quicker. I ended up using the char buffer directly rather than going through an istringstream, as the istringstream route seemed to tax my PC's memory and go a bit bonkers.
The data was a .csv file: 494 MB, 1.7 million rows.
Once the data is in the buffer, the row count comes out in 2 or 3 seconds, but the load beforehand takes about 56 seconds, so all told I think this is about the same as my original version, which just looped until ifstream EOF with getline reading straight from the ifstream. Obviously the buffered version is much better if I were to go on and do some parsing, or needed to re-read the data, etc.
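For comparison, the original getline-straight-from-ifstream approach mentioned above would look roughly like this; `count_rows_getline` is a hypothetical helper name, not from the original code:

```cpp
#include <fstream>
#include <string>

// Count rows by streaming the file one line at a time. No big buffer is
// allocated, but the data is gone after the pass, so parsing or a
// re-read means hitting the file (or the network) again.
long count_rows_getline(const char* path)
{
    std::ifstream in(path);
    std::string line;
    long count = 0;
    while (std::getline(in, line))
        ++count;
    return count;
}
```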
Here is the relevant bit of code I decided to use:
Code:
//opened file ok....
//..
cout << "\n\nLoading file..\n";

// find the file size: seek to the end, read the position, seek back
inFile.seekg(0, ios::end);
streamoff length = inFile.tellg();  // streamoff, not int: safe for files > 2 GB
inFile.seekg(0, ios::beg);

// slurp the whole file into one heap buffer
char* buffer = new char[length];
inFile.read(buffer, length);
inFile.close();

cout << "\n\nCounting rows..\n";

// single pass over the buffer, counting newline characters
long count = 0;
for (streamoff i = 0; i < length; i++)
{
    if (buffer[i] == '\n')
        count++;
}
cout << "\n\nTotal Rows: " << count << "\n\n";
delete[] buffer;
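As a variation on the above, the same buffered count can be written with std::vector instead of raw new/delete, so the buffer frees itself even if something throws partway through. This is just a sketch of that idea; `count_rows_buffered` is a made-up name for illustration:

```cpp
#include <algorithm>
#include <fstream>
#include <vector>

// Same technique as the hand-written loop: slurp the whole file, then
// count '\n' in one pass. std::vector owns the buffer, so there is no
// delete[] to forget, and std::count replaces the manual loop.
long count_rows_buffered(const char* path)
{
    std::ifstream in(path, std::ios::binary);

    // measure the file by seeking to the end and reading the position
    in.seekg(0, std::ios::end);
    std::streamoff length = in.tellg();
    in.seekg(0, std::ios::beg);

    std::vector<char> buffer(static_cast<std::size_t>(length));
    in.read(buffer.data(), length);

    return std::count(buffer.begin(), buffer.end(), '\n');
}
```

The buffer sticks around inside the function here; to keep it for later parsing or re-reads, you would return or pass out the vector instead.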