Yeah... me too... I'm gonna stay with ASCII.
what does signature stand for?
You're not staying with ASCII; your strings are being converted to Unicode internally by the OS. This is overhead, and it degrades performance. If you're on an NT-based operating system, you need to use Unicode!
If it's better, then I will learn to use it. Is there a way to take code you already have and convert it outside of the compiler? Probably not, but more than 90% of the tutorials I have seen don't use Unicode.
Unless you're writing code using the int values of chars, or need other characters for something, and you're coding in English, does it matter? Maybe there's a hit on NTFS, but is it big enough to worry about? I don't know, but this would be a minor consideration to me, since I'm not coding in Chinese or Urdu.
We have a couple Russian programmers at work and they have keyboards with equivalent Russian symbols underneath the standard US keyboard symbols.
Truth is a malleable commodity - Dick Cheney
Originally posted by Eibro
You're not staying with ASCII; your strings are being converted to Unicode internally by the OS. This is overhead, and it degrades performance. If you're on an NT-based operating system, you need to use Unicode!

I just did a little program using ASCII and ran it on Win2k, everything was fine...
Originally posted by Ruski
I just did a little program using ASCII and ran it on Win2k, everything was fine...

Of course it was fine! But Win2k converted your ASCII chars into UNICODE chars behind the scenes.
UNICODE is the future. If you write professional software, you have no choice. If you are just a hobby programmer, you can still use ASCII (and let the OS do the converting).
And using UNICODE brings a minor performance benefit, so why not use it? There are generic types and functions, so you just compile the program twice: once with UNICODE disabled (for older Windows systems) and once with UNICODE enabled (for newer systems).
Read the UNICODE chapter of Petzold - it will give you all the information you need!
Hope you don't mind my bad English, I'm Austrian!
But if ASCII is converted on Win2k & XP... why not just use ASCII?
>>But if ASCII is converted on win2k & xp.. why not use ASCII?
:: Clenches fist...grits teeth ::
Try reading what's above....
Win95/98/ME use ASCII... ok...
Win NT/2k/XP still allow the use of ASCII (to ensure compatibility), but they then convert each char into a UNICODE WCHAR (wide char - a 16-bit word) for their inner workings... so when you use ASCII on these platforms, there's a little bit of overhead for the conversion...
You can use a UNICODE build of your app on Win NT/2k/XP to get rid of that overhead, but then it won't work very well on 95/98/ME, as they expect ASCII...
And if you target any of the ANSI systems, you may want to take a look at the Microsoft Layer for Unicode, which (allegedly) permits you to run UNICODE apps on Win9x etc. systems; presumably the 'layer' does an internal conversion to ANSI.
As an example of getting used to UNICODE, consider gdiplus, the apparent replacement for gdi. It comes as standard with WinXP and can be downloaded from MS for Win9x/2k. It uses UNICODE exclusively, so if you want to use it on Win9x you have to do a lot of fiddling around to convert ANSI strings to UNICODE strings.