john.c is right and it is easy to demonstrate:
- Your UCHAR_MAX symbol could be defined as:
Code:
#define __UCHAR_MAX__ ((unsigned char)-1)
ORing with 0 can be discarded by the optimizer.
- You can force, by casting, __SCHAR_MAX__ and __SCHAR_MIN__ to be signed (no ORing needed)...
- I believe you can't implement a preprocessor macro that computes the size of a char in bits, like the standard CHAR_BIT, but you can compute it programmatically.
- sizeof '\1' is always the same as sizeof(int) -- in fact, 6.4.4.4 §10 of ISO/IEC 9899:1999 says explicitly: "An integer character constant has type int..."
Code:
$ gcc -xc -include stdio.h - <<< "int main(void){printf(\"%zu\\n\",sizeof('\1')); return 0;}"; ./a.out
4
The wchar_t type is usually the same size as the int type as well:
Code:
$ gcc -xc -include stdio.h -include wchar.h - \
<<< 'int main(void){printf("%zu\n",sizeof(wchar_t)); return 0;}'; ./a.out
4
But you can make it shorter with the GCC option -fshort-wchar (other good compilers offer a similar option):
Code:
$ gcc -xc -include stdio.h -include wchar.h -fshort-wchar - \
<<< 'int main(void){printf("%zu\n",sizeof(wchar_t)); return 0;}'; ./a.out
2