Last edited by _Mike; 09-02-2011 at 08:40 AM. Reason: misspelled count as could :/
Actually we're both wrong, lol. A megabit (1,000,000 bits) is 125,000 bytes, which is about 122 kB (using a 1024-byte kB).
And that difference will probably be compounded on a network, presuming each read requires some kind of in-protocol packet exchange as overhead, which costs a number of bytes per request.
I.e., if you read one byte at a time and there are 9 bytes of packet overhead per request, only 1 byte in 10 on the wire is payload, so your 100 Mbit/sec connection is effectively 10 Mbit. If you read a line at a time, the lines are ~40 bytes, and the overhead is 8 bytes, you will get at most about 83% (40/48) of the network maximum.
If the files are not huge, stat() them to get the size and read the entire contents in one call. Do not use getline().
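A minimal sketch of that, assuming POSIX stat()/open()/read() (the function name is mine, and a production version would loop on short reads):

```cpp
#include <sys/stat.h>
#include <fcntl.h>
#include <unistd.h>
#include <string>
#include <stdexcept>

// stat() the file for its size, then pull the whole thing in with one
// read() request instead of many small line-sized ones.
std::string slurp(const char *path) {
    struct stat st;
    if (stat(path, &st) != 0)
        throw std::runtime_error("stat failed");

    std::string buf(static_cast<size_t>(st.st_size), '\0');
    int fd = open(path, O_RDONLY);
    if (fd < 0)
        throw std::runtime_error("open failed");

    ssize_t got = read(fd, &buf[0], buf.size());  // one big request
    close(fd);
    if (got < 0)
        throw std::runtime_error("read failed");
    buf.resize(static_cast<size_t>(got));  // tolerate a short read in this sketch
    return buf;
}
```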
Last edited by MK27; 09-02-2011 at 08:24 AM.
C programming resources:
GNU C Function and Macro Index -- glibc reference manual
The C Book -- nice online learner guide
Current ISO draft standard
CCAN -- new CPAN like open source library repository
3 (different) GNU debugger tutorials: #1 -- #2 -- #3
cpwiki -- our wiki on sourceforge
Yeah, and more than once -- I did that on a calculator too. I need a less mathematically impaired brain. :/
std::getline() works on stringstreams too. Read the file into a string buffer, wrap it in a stringstream, and all your current parsing code can work with that.
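That suggestion might look like this (file name and helper names are placeholders; the per-line loop stands in for your existing parsing code):

```cpp
#include <fstream>
#include <sstream>
#include <string>
#include <vector>

// One bulk read of the whole file; everything after this is in memory.
std::string read_whole_file(const char *path) {
    std::ifstream f(path, std::ios::binary);
    std::ostringstream buf;
    buf << f.rdbuf();               // single streamed read
    return buf.str();
}

// Same std::getline() interface the file-based loop used, but against
// an in-memory buffer instead of the network filesystem.
std::vector<std::string> split_lines(const std::string &buf) {
    std::istringstream in(buf);
    std::vector<std::string> out;
    std::string line;
    while (std::getline(in, line))
        out.push_back(line);        // existing parsing code would go here
    return out;
}
```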
To be honest though, I doubt that will make much difference in terms of getting the network speed to better match the speed of a hard drive -- but it is probably a more polite use of the network & equipment. Imagine if you had three computers reading three different files off the same hard drive at the same time, one line at a time.
Last edited by MK27; 09-02-2011 at 09:07 AM.