Multibyte characters are usually a single byte each... essentially ASCII, with multi-byte extensions for accented and other non-ASCII characters.
Originally Posted by GokhanK
Wide characters, often used to hold Unicode, are usually 2 bytes, allowing up to 65,535 different characters.
'A' is a single-byte character.
What is the difference between 'A' and L'A'?
L'A' is a wide (Unicode) character constant.
(The same applies to strings, btw.)
Basically you should always be explicit about your types; most compilers will throw an error if you don't.
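If it helps, here's a minimal sketch you can compile to see the difference; the sizes are platform-dependent (wchar_t is 2 bytes on Windows but typically 4 on Linux):

#include <stdio.h>
#include <wchar.h>

int main(void)
{
    char    c = 'A';    /* narrow character constant, stored in 1 byte  */
    wchar_t w = L'A';   /* wide character constant, stored in a wchar_t */

    printf("c = %c, sizeof(char)    = %zu\n", c, sizeof(char));          /* always 1 */
    printf("w = %c, sizeof(wchar_t) = %zu\n", (char)w, sizeof(wchar_t)); /* 2 or 4, implementation-defined */

    /* Same idea for strings: "hi" is an array of char, L"hi" is an array of wchar_t. */
    return 0;
}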
2. The book I have says:
By default, the compiler fits a numeric constant into the smallest compatible data type that will hold it. Therefore, assuming 16-bit integers, 10 is int by default, but 103,000 is a long int.
What should I understand from that? When does the compiler fit the numeric constant, and won't I always define the data type?
What you should take from this is that smaller numbers fit into larger variables, but not the other way around.
int  x;
char y;

y = 100;
x = y; // no problem: a char value always fits in an int
x = 1000;
y = x; // overflow: 1000 does not fit in a char
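As for the constants themselves, you normally don't notice their type because the value gets converted to whatever you assign it to; it only shows up in things like sizeof or suffixes. Here's a small sketch you can compile yourself (assuming a C99 compiler; the constant values are just examples, and modern compilers use 32-bit ints rather than the 16-bit ones the book assumes):

#include <stdio.h>

int main(void)
{
    /* The compiler gives each constant the smallest standard type that can hold it. */
    printf("sizeof(10)         = %zu\n", sizeof(10));         /* int                                       */
    printf("sizeof(10L)        = %zu\n", sizeof(10L));        /* long, because of the L suffix             */
    printf("sizeof(3000000000) = %zu\n", sizeof(3000000000)); /* too big for a 32-bit int, so it is given  */
                                                               /* a larger type (long or long long)        */
    return 0;
}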
3. When should I use the suffixes F, L and U?
I use them as very convenient type markers for literal numbers: float x = 1000.0f; etc.
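For instance, a rough sketch of the typical uses (the variable names and values are just illustrations):

#include <stdio.h>

int main(void)
{
    float        f = 1000.0f;      /* F: a float constant instead of the default double */
    long         l = 100000L;      /* L: a long constant                                */
    unsigned int u = 4000000000U;  /* U: an unsigned constant                           */

    printf("%f %ld %u\n", f, l, u);
    return 0;
}

Without the suffixes, 1000.0 would be a double, 100000 an int (or a long on a 16-bit compiler), and 4000000000 would not fit in a signed int at all.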