To use TCHAR or not to...

A discussion on whether to use TCHAR instead of char, from the Windows Programming forum (Platform Specific Boards).

  1. #1
    the Corvetter
    Join Date
    Sep 2001
    Posts
    1,584

    To use TCHAR or not to...

    Is it really a good idea to use TCHAR instead of char? Petzold uses it, but "Programming Windows 98 from the GROUND UP" doesn't. What do you do?
    1978 Silver Anniversary Corvette

  2. #2
Senior Member
    Join Date
    Jan 2002
    Posts
    982
    Depends what you're programming for, but I can't see any reason not to. It's only an extra character to type.

  3. #3
    erstwhile
    Join Date
    Jan 2002
    Posts
    2,227
Petzold explains why he uses TCHAR, i.e. UNICODE. Using char is just fine as it's native to win9x anyway. WinNT/2k/XP uses 16 bits to represent characters, i.e. UNICODE; it would just convert char to wchar anyway. So it's simplest to use char.

If you use TCHAR you should use one of the text macros, _T or TEXT, to ensure strings are properly represented. For win9x, where UNICODE is not #defined, they end up as char type. When compiling for winNT/2k/XP you should, ideally, #define UNICODE, and these macros will then produce wide-character strings (wchar). In summary:

    #ifdef UNICODE
    typedef wchar_t TCHAR;
    #else
    typedef char TCHAR;
    #endif

    (or something like that)

    Personally, I use TCHAR, which took a little getting used to at first.
    Just #define UNICODE before #include <windows.h> for winNT/2k/XP.
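    To make that concrete, here's a minimal, self-contained sketch of the mechanism. The TCHAR/TEXT definitions below are stand-ins I've written to mimic what the SDK headers do (they are not the real <tchar.h>/<winnt.h> contents), so it builds without windows.h:

    ```c
    #include <stdio.h>
    #include <wchar.h>

    /* Stand-in for what the SDK headers provide -- a sketch, not the
       real definitions. UNICODE picks the wide flavour. */
    #ifdef UNICODE
    typedef wchar_t TCHAR;
    #define TEXT(s) L##s
    #else
    typedef char TCHAR;
    #define TEXT(s) s
    #endif

    int main(void)
    {
        TCHAR msg[] = TEXT("hello");
        /* Without UNICODE defined, each element is 1 byte (plain char);
           with -DUNICODE it becomes sizeof(wchar_t). */
        printf("element size: %zu\n", sizeof msg[0]);
        return 0;
    }
    ```

    Build it twice, once plain and once with -DUNICODE, and the same source line produces two different string types. That's the whole point of TCHAR.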

    Hope that helps some.

  4. #4
    the Corvetter
    Join Date
    Sep 2001
    Posts
    1,584
Yes, yes, I understand why it's used, but I just wanted to know if you use it. I do use it (I learned from Petzold, and you use the "ways" of your teacher). I recently bought "from the GROUND UP", and he uses char.

    Yeah, I know it will convert it to TCHAR anyway, but that's time, and us programmers don't have that...
    1978 Silver Anniversary Corvette

  5. #5
Senior Member
    Join Date
    Jan 2002
    Posts
    982
    >but that's time, and us programmers don't have that...

You had time to post the question.

  6. #6
    the Corvetter
    Join Date
    Sep 2001
    Posts
    1,584
    Originally posted by Sorensen
    >but that's time, and us programmers don't have that...

You had time to post the question.
    Correction: Execution time
    1978 Silver Anniversary Corvette

  7. #7
    erstwhile
    Join Date
    Jan 2002
    Posts
    2,227
And winNT/2k/XP would have to convert char to wchar internally, which also takes time.

I thought all these macros and defines were handled by the preprocessor, so that the hit is experienced at compile time and not at run time?

I only ask because that's how I thought of it, but I'm not too sure.

  8. #8
    the Corvetter
    Join Date
    Sep 2001
    Posts
    1,584
This is how it goes:

When you compile your code, it decides whether the build is Unicode or not. If it is, then it'll use wchar (is that the Unicode char type?); if not, it'll use just char. But if you just use "char" in your code, compile time won't change it to wchar, and during execution the OS will have to convert it, therefore taking up time. This is my understanding...
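    Roughly what that run-time conversion looks like can be sketched in plain C. ShowMessageA/ShowMessageW here are made-up stand-ins for the real ...A/...W API pairs (e.g. MessageBoxA/MessageBoxW); the point is that the ANSI entry pays a convert-and-allocate step on every call:

    ```c
    #include <stdio.h>
    #include <stdlib.h>
    #include <wchar.h>

    /* Stand-in for a wide-character API function (hypothetical). */
    static void ShowMessageW(const wchar_t *text)
    {
        wprintf(L"wide: %ls\n", text);
    }

    /* ANSI wrapper: converts char -> wchar_t, then calls the wide
       version. NT-based Windows does something like this internally. */
    static void ShowMessageA(const char *text)
    {
        size_t len = mbstowcs(NULL, text, 0);          /* measure */
        wchar_t *wide = malloc((len + 1) * sizeof(wchar_t));
        if (wide == NULL)
            return;
        mbstowcs(wide, text, len + 1);                 /* convert */
        ShowMessageW(wide);
        free(wide);
    }

    int main(void)
    {
        ShowMessageA("hello");   /* char goes in, wchar_t does the work */
        return 0;
    }
    ```

    Call the W function directly (which is what a UNICODE build of TCHAR code does) and the conversion simply never happens.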
    1978 Silver Anniversary Corvette

  9. #9
    erstwhile
    Join Date
    Jan 2002
    Posts
    2,227
    Interesting...

    But having thought about it a bit more and remembered terms like 'macro expansion', I think I'll stick with my original view.

    I think when you have #define XXX, then wherever XXX appears in your code, the whole thing is substituted, so the compiler would see the result of, e.g., the TEXT macro. I think that's why B. Stroustrup recommends function inlining as a possible substitute for macros, as the effect is more or less the same (but error detection is a lot easier with inline functions). So your code would end up with either wchar (UNICODE) strings or char (ANSI) strings.
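    That substitution view can be checked with a quick sketch. The TEXT macro below is my own stand-in for the SDK's (written as if UNICODE were defined, so it pastes an L prefix); after preprocessing there is literally nothing left for the program to do at run time:

    ```c
    #include <stdio.h>
    #include <wchar.h>

    /* Stand-in for the SDK's TEXT macro, as if UNICODE were defined. */
    #define TEXT(s) L##s

    int main(void)
    {
        /* After preprocessing, the next line reads exactly:
           const wchar_t *s = TEXT expanded, i.e. L"compile-time";
           the compiler only ever sees a wide string literal. */
        const wchar_t *s = TEXT("compile-time");
        (void)s;

        /* The macro result and a hand-written L"..." literal are the
           same object size -- proof the expansion happened already. */
        printf("%s\n", sizeof(TEXT("hi")) == sizeof(L"hi")
                           ? "identical" : "different");
        return 0;
    }
    ```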

    I could be wrong, though. Maybe you have a link to a reference I could read on the subject?

    Or perhaps one of the pros might be able to clarify this.

    Interesting topic though...

    edit: I just re-read your last post, Garfield, and realised that's what you were saying but in a different way. Sorry about that. I should go sleep now.
    Last edited by Ken Fitlike; 02-15-2002 at 07:27 PM.

  10. #10
    the Corvetter
    Join Date
    Sep 2001
    Posts
    1,584
    >> I should go sleep now. <<

    LOL Nighty-night...
    1978 Silver Anniversary Corvette

  11. #11
    Registered User Italia's Avatar
    Join Date
    Feb 2002
    Posts
    13
    Use TCHAR. The compiler will make the necessary conversions at compile time. If you're reading in an ASCII file, then make that variable char; otherwise use TCHAR.


