Originally Posted by GokhanK
I am trying to fully understand details about constants.
1. What is a multibyte character and what is a wide character?
Multibyte characters use one or more bytes per character; the common single-byte case is plain ASCII plus extended code pages for accent marks, but encodings such as UTF-8 can take several bytes for one character.
Wide characters (wchar_t), usually used to hold Unicode, are typically 2 bytes, allowing up to 65,535 different values (on some platforms wchar_t is 4 bytes).
What is the difference between 'A' and L'A'?
'A' is a single-byte (narrow) character constant.
L'A' is a wide character constant of type wchar_t, typically used for Unicode.
(The same applies to strings, btw: "A" versus L"A".)
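If you want to see this on your own compiler, here is a small C sketch (the exact size of wchar_t varies: typically 2 bytes on Windows and 4 on most Unix-like systems):
Code:
#include <stdio.h>
#include <wchar.h>   /* wchar_t */

int main(void)
{
    char    c = 'A';    /* narrow character constant */
    wchar_t w = L'A';   /* wide character constant   */

    printf("char:    %zu byte(s)\n", sizeof c);
    printf("wchar_t: %zu byte(s)\n", sizeof w);

    /* Same idea for string literals: "A" is an array of char,
       L"A" is an array of wchar_t (both include a terminating 0). */
    printf("\"A\":  %zu bytes\n", sizeof "A");
    printf("L\"A\": %zu bytes\n", sizeof L"A");
    return 0;
}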
2. The book I have says:
By default, the compiler fits a numeric constant into the smallest compatible data type that will hold it. Therefore, assuming 16-bit integers, 10 is int by default, but 103,000 is a long int.
What should I understand from that? When does the compiler fit the numeric constant? Won't I always define the data type?
Basically you should always declare the types of your variables; most compilers will error out if you don't. The book is talking about the constant itself: a literal like 10 or 103,000 gets its own type based on its size, independent of whatever variable you assign it to.
What you should take from this is that smaller numbers fit into larger variables, but not the other way around.
Code:
int x;      /* typically 16 or 32 bits, depending on the compiler */
char y;     /* 1 byte */

y = 100;
x = y;      /* no problem: a char value always fits in an int      */

x = 1000;
y = x;      /* 1000 does not fit in a char; the value is truncated */
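As for when the compiler picks a type for a constant: that happens for the constant on its own, before any assignment. Here is a quick sketch you can run to see what your own compiler does (the output depends on how wide int is on your platform; the book assumes 16-bit int):
Code:
#include <stdio.h>

int main(void)
{
    /* With 16-bit int (the book's assumption) 103000 does not fit in an int,
       so the compiler treats it as a long. With a typical 32-bit int, both
       constants are plain int and the first two sizes come out the same.   */
    printf("sizeof 10      = %zu\n", sizeof 10);
    printf("sizeof 103000  = %zu\n", sizeof 103000);
    printf("sizeof 103000L = %zu\n", sizeof 103000L);
    return 0;
}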
3. When should I use suffixes F,L and U?
I use them as very convenient type specifiers for literal numbers: float x = 1000.0F; unsigned long n = 5000UL; etc. The suffix tells the compiler the type of the constant itself, so you don't rely on an implicit conversion (or a cast) from the default type.
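A few examples of how the suffixes read in practice (the exact sizes are compiler-dependent):
Code:
#include <stdio.h>

int main(void)
{
    unsigned int  u  = 10U;          /* U: unsigned constant                     */
    long          l  = 103000L;      /* L: long constant                         */
    unsigned long ul = 4000000000UL; /* U and L combined                         */
    float         f  = 1000.0F;      /* F: float; without it 1000.0 is double    */
    double        d  = 1000.0;       /* no suffix: a floating constant is double */

    printf("%u %ld %lu %f %f\n", u, l, ul, f, d);
    return 0;
}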