Is it really a good idea to use TCHAR instead of char? Petzold uses it, but "Programming Windows 98 from the GROUND UP" doesn't. What do you do?
1978 Silver Anniversary Corvette
Depends on what you're programming for, but I can't see any reason not to. It's only a few extra characters to type.
Petzold explains why he uses TCHAR, i.e. Unicode support. Using char is just fine since it's native to Win9x anyway. WinNT/2k/XP uses 16 bits to represent characters (Unicode); the ANSI API entry points just convert char strings to wchar_t internally anyway. So it's simplest to use char.
If you use TCHAR you should use one of the text macros, _T or TEXT, to ensure string literals are represented properly. For Win9x, where UNICODE is not #defined, they end up as plain char strings. When compiling for WinNT/2k/XP you should, ideally, #define UNICODE, and then these macros produce wide-character (wchar_t) strings. In summary:
#ifdef UNICODE
typedef wchar_t TCHAR;
#else
typedef char TCHAR;
#endif
(or something like that)
Personally, I use TCHAR, which took a little getting used to at first.
Just #define UNICODE before #include <windows.h> for winNT/2k/XP.
Hope that helps some.
Yes, yes, I understand why it's used, but I just wanted to know if you use it. I do use it (I learned from Petzold, and you follow the "ways" of your teacher). I recently bought from the GROUND UP, and its author uses char.
Yeah, I know it will convert it to TCHAR anyway, but that's time, and us programmers don't have that...
>but that's time, and us programmers don't have that...
You had time to post the question .
Correction: execution time.

Originally posted by Sorensen:
>>but that's time, and us programmers don't have that...
>You had time to post the question.
And WinNT/2k/XP would have to convert char strings to wchar_t internally, which also takes time.
I thought all these macros and defines were handled by the preprocessor so that the hit is experienced at compile time and not at run-time?
I only ask because that's how I thought of it, but I'm not too sure.
This is how it goes:
When you compile your code, the UNICODE #define decides what TCHAR expands to. If UNICODE is defined, TCHAR becomes wchar_t (the Unicode character type); if not, it stays plain char. That substitution is done at compile time and costs nothing at run time. But if you just use "char" in your code and call the ANSI API functions on an NT-based system, the compiler won't change anything -- the OS has to convert those strings to wide characters during execution. Therefore, taking up time. This is my understanding...
Interesting...
But having thought about it a bit more and remembered terms like 'macro expansion', I think I'll stick with my original view.
I think when you have #define XXX, then wherever XXX appears in your code, the whole thing is substituted, so the compiler only ever sees the result of, e.g., the TEXT macro. I think that's why B. Stroustrup recommends function inlining as a possible substitute for macros, as the effect is more or less the same (but error detection is a lot easier with inline functions). So your compiled code ends up with either wchar_t (Unicode) strings or char (ANSI) strings.
I could be wrong, though. Maybe you have a link to a reference I could read on the subject?
Or perhaps one of the pros might be able to clarify this.
Interesting topic though...
edit: I just re-read your last post, Garfield, and realised that's what you were saying but in a different way. Sorry about that. I should go sleep now.
Last edited by Ken Fitlike; 02-15-2002 at 07:27 PM.
>> I should go sleep now. <<
LOL Nighty-night...
Use TCHAR. The preprocessor makes the necessary substitutions at compile time. If you're reading in an ASCII file, make that buffer char; otherwise use TCHAR.