Hi, everyone.
It occurs to me that when I am programming on, say, OSX with gcc, the operating system designers and the gcc writers between them have given me certain means of input and output. For example, I can use printf() to print a message to the screen.
But when I am programming an AVR microcontroller, printf() doesn't have much meaning. There is no screen, and no way that the AVR can "print", unless I implement it myself.
Is this what is meant by the difference between a "hosted" implementation and a "freestanding" one? As in, a hosted implementation gives me certain standard ways of communicating with the environment, and a freestanding one is presumably going to have to rely on something architecture-specific?
Richard