I know I've seen this somewhere, but I can't find it again. How do you change the binary value of a char? For instance, the binary value of 'a' + 1 to get 'b'. (I know the example isn't very useful unless you want to list the alphabet.)
The binary value is awkward to work with directly - if you want, I can look for a piece of code and post it, but I don't remember it off-hand *gg*
But if you want to list the alphabet, you're better off using the ASCII codes of the characters.
for example
for (char c = 'a'; c <= 'z'; ++c)
    cout << c << "\n";
Let me put it differently: the ASCII value of a character, for instance 'a', is 97, or 61 in hexadecimal. But how do you say "the ASCII value of the character, minus 20"?
There's a way to print out its value in binary form. I can do it in C, but I'm not sure about C++. There has to be a way in C++, though.
I don't know if this is what you are looking for, but if you are trying to find the ASCII value of hex 0x-20, there isn't one. I don't believe there is - ASCII and hex values are only positive.
Ryan
(I find that you get the most help from zen)
I believe the '-' sign shouldn't be there, right?
char a = 20; // a now holds the value 20 (a non-printing control character)
cout << a; // this prints the character with code 20
There are no negative ASCII characters. If you are able to view any character by doing, say, cout << (char)-20;, it'd be compiler-dependent, and would only give you the normal ASCII character whose code is the remainder of -20 mod 256.
Try an explicit conversion. The syntax:
new_variable = (new_type)old_variable;
I've tried something like this before and it didn't work, but it's a start and may work (after all, I was bad at debugging back when I tried it). Try it.
-20 as an unsigned byte is (256 - 20) = 236. 236 is one of the extended characters: . Cool! It's the infinity symbol!
(EDIT) Ohh, it converted to Unicode. Crap (/EDIT)
Thanks, sean reminded me.
The code was
Code:
(int)'a'
how do you change the binary value of a char
It looks to me like the answer you wanted was very different from the question you asked!
int x = (int)'a';
is very different to 'changing' a binary value!
U.
How would you recommend asking the question? :rolleyes:
I probably would have said:
"How do I store the ASCII value of a char in an integer variable?"
that would have made it perfectly clear ;)
But hey, as long as you have what you want... who cares?! :)
U.