Quote Originally Posted by anon
#include just copies and pastes code for the compiler. In this case it will work, but in a real application where you might want to use HelloWorld() in multiple source files, you'll get multiple-definition errors and possibly others.

Anyway, including .cpp files would also mean that the order in which, and where, the file is included matters. So far I have managed to write almost all my projects so that this doesn't matter.
Now I see your point. I have slightly changed my example. Normally, one would put all three files in the same directory and compile:

g++ -Wall hellomain.cpp helloextra.cpp -o helloworld.exe

(helloworld.cpp wouldn't be needed on the command line, since it's already #included in hellomain.cpp)

But you're right. I get a multiple definition error if I do that.
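
To show what goes wrong, here is a stripped-down sketch of the kind of layout I mean (the names and contents are just illustrative): HelloWorld() is defined in helloworld.cpp, and that file gets #included from both of the other two.

// helloworld.cpp - the definition of HelloWorld()
#include <iostream>

void HelloWorld()
{
    std::cout << "Hello, world!" << std::endl;
}

// helloextra.cpp - wants HelloWorld(), so it #includes the whole .cpp
#include "helloworld.cpp"

void HelloExtra()
{
    HelloWorld();
}

// hellomain.cpp - also #includes helloworld.cpp
#include "helloworld.cpp"

void HelloExtra();   // prototype for the function defined in helloextra.cpp

int main()
{
    HelloWorld();
    HelloExtra();
    return 0;
}

After the preprocessor runs, both hellomain.cpp and helloextra.cpp contain their own copy of the HelloWorld() definition, so the two object files clash when g++ tries to link them.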

You see, the reason I asked this is mainly because of compilers such as gcc and g++. The conventional way requires you to pass all the .c or .cpp files to gcc or g++ on the command line, like so:

g++ -Wall hellomain.cpp helloextra.cpp helloworld.cpp -o helloworld.exe

where the three .cpp files would contain only the function implementations, not the prototypes. For the prototypes, one would (possibly) have to create three additional files - hellomain.hpp, helloextra.hpp and helloworld.hpp - containing them. But then I thought: "Why do we programmers have to bother with separating function prototypes from the actual function implementations? Wouldn't it be more convenient to keep everything in a single file per module rather than two files each?"
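
For comparison, the conventional split for just one of those pairs would look something like this (again, only a sketch with illustrative names):

// helloworld.hpp - prototype only, with an include guard
#ifndef HELLOWORLD_HPP
#define HELLOWORLD_HPP

void HelloWorld();

#endif

// helloworld.cpp - the one and only definition
#include "helloworld.hpp"
#include <iostream>

void HelloWorld()
{
    std::cout << "Hello, world!" << std::endl;
}

// hellomain.cpp - includes only the header, so no duplicate definition
#include "helloworld.hpp"

int main()
{
    HelloWorld();
    return 0;
}

Each .cpp file is compiled separately, and the definition of HelloWorld() ends up in exactly one object file, which is why the three-file command above links without complaint.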

Really, it kind of sucks. My approach would definitely work if g++ discarded multiple definitions of functions, but I guess I'll just have to be content with what I've got...