Another, related issue that has confused me a bit: what happens if the string entered into the my_word[20] variable ends up being longer than the 20 characters defined when the variable was declared? What happens to the overflowed characters?
As I understand it (and I'm almost certain that I misunderstand it), the extra characters would get loaded into memory but would be outside the bounds of the defined variable (just after it, in fact). That seemed logical enough to me, and it was the theory I started with when I set out to test it.
I wrote a small program that intentionally overran the size of a variable. I created a variable 10 characters long (my_word[10]) and entered a 12-character word into it ("ABCDEFGHIJKL"), then printed it out to the screen. (In case it makes a difference, I used scanf() to read the 12-character word from user input and printf() to print it.)
Now, I expected the last 3 characters to simply go missing or, at the very least, come back as garbage, but to my surprise they all printed back nicely.
This resulted in some very colourful language on my part, because I thought I had understood how it all worked up to this point and was now just as confused as ever.
So can anyone suggest a plausible reason why it worked? Surely the variable doesn't allow you to enter more characters than it is defined for. Frankly, I'm stumped.
Thanks,
TV