Read a full file into a string in one go
As a casual C++ user I recently stumbled upon a problem I didn't expect to be one at all: something as trivial as reading a file in one go turned out to be not so trivial in C++.
In C I'd simply use fread and be done with it. In C++, however, there doesn't seem to be such a thing. I had a look around here and searched for "read file", as well as across the net. I found a bunch of approaches, most of which are summarized here -> Insane Coding: How to read in a file in C++.
The interesting ones are:
Code:
std::string get_file_contents(const char *filename)
{
    std::ifstream in(filename, std::ios::in | std::ios::binary);
    if (in)
    {
        std::string contents;
        in.seekg(0, std::ios::end);
        contents.resize(in.tellg());
        in.seekg(0, std::ios::beg);
        in.read(&contents[0], contents.size());
        in.close();
        return(contents);
    }
    throw(errno);
}
and
Code:
std::string get_file_contents(const char *filename)
{
    std::ifstream in(filename, std::ios::in | std::ios::binary);
    if (in)
    {
        std::ostringstream contents;
        contents << in.rdbuf();
        in.close();
        return(contents.str());
    }
    throw(errno);
}
The first one looked good at first. "in.read(&contents[0], contents.size());" works, but only for the raw content: in my tests the string's internal counters were not updated, so size(), for example, reported 0 and many member functions such as find() didn't work. So that one was off the table.
The second one (and in fact all the others I found) seems to require a copy into a string, and that's where the catch is: my files are text but quite large, as in several gigabytes, so a copy that doubles the RAM usage is simply not an option. The same goes for any line-by-line approach (getline and the like), of course.
And now I'm wondering: is there really no direct way of getting a text file into a string via the standard library? I remember even RogueWave's RWCString had a readFile member function back in the 90s.
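For now the closest I've got is a chunked workaround (just a sketch; it assumes the file fits in memory and its size doesn't change between tellg() and the reads, and the function name is my own): reserve() the full capacity up front, then append() fixed-size chunks, so the string's bookkeeping stays correct and there is never a second full-size copy.

```cpp
#include <cerrno>
#include <cstddef>
#include <fstream>
#include <string>

// sketch: one reserve() up front, then append() chunks so that size(),
// find() and friends stay consistent, without a second full-size buffer
std::string read_in_chunks(const char *filename)
{
    std::ifstream in(filename, std::ios::in | std::ios::binary);
    if (!in)
        throw errno;
    in.seekg(0, std::ios::end);
    std::string contents;
    contents.reserve(static_cast<std::size_t>(in.tellg())); // single allocation
    in.seekg(0, std::ios::beg);
    char buf[64 * 1024];
    // a failed read() at EOF still leaves the partial chunk in gcount()
    while (in.read(buf, sizeof buf) || in.gcount() > 0)
        contents.append(buf, static_cast<std::size_t>(in.gcount()));
    return contents;
}
```

It's hardly the one-liner I was hoping for, but it only ever holds the file once plus one small buffer.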