What does the integer 0 correspond to in ASCII and Unicode?
Can I use the integer 0 to "terminate" my char[] input buffers?
Check an ASCII table: the integer 0 is the NUL control character, and Unicode's first 128 code points are the same as ASCII, so it's NUL there too.
Not sure about your second question, though.
You can terminate with 0 (i.e. '\0') as long as you write your own functions that handle it correctly.
Some functions also accept an explicit delimiter character (istream::getline in C++ stream I/O, for example).
Basically, it depends on how a function "interprets" a character. If it interprets it the way you expect, then it's fine. Problems arise when there's a difference between what the function does and what you expect it to do.
Note that you don't need to terminate arrays in general.
But all string functions in the C library expect strings to be terminated by a '\0' character.
0 is an int. '0' is the char zero, i.e. the digit character, value 48 in ASCII (not to be confused with capital Oh, which I can seldom tell apart visually). '\0' is the char representation of the null character, value 0. Therefore 0 and '\0' can be used interchangeably, but '0' cannot.