What should I use to convert character variables into integer variables? I need to do this for a thing I'm making. Any help will be appreciated.
Child who knows C++
Using Borland C/C++ Compiler 5.5 (Command Line Version)
Care to expand on why this is not working for you? atoi not converting chars to ints?
For some real fun you could write a function that does this yourself: count the decimal places, multiply by ten for each place, and multiply that by the digit in the character variable (use the ASCII values to convert the characters to actual numbers). Just add all those numbers up and you'll get the answer.
Or if you're referring to converting the binary number stored in a char variable to an integer, just use typecasting.
Why is atoi() not working for you? You could also try strtol().
I assume your problem is that you have a single char instead of a string of chars, is that right? If so, a quick and dirty solution is to subtract '0' from your char, the result being the int.
If that's the case, do typecasting. That's exactly what it was designed for and makes for more understandable code.
Do you even need to convert chars to ints? Sometimes a char can be used like an int. For example, the following compiles with the output as indicated using MSVC 6.0:
Code:
char ch = 'A';
cout << ch + 10 << endl;
for(int i = 0; i < 3; ++i)
    cout << ch++ << endl;
Output:
75
A
B
C
Originally Posted by sean_mackrory:
> If that's the case, do typecasting.

If you were referring to my post, then that wouldn't work. A typecast would merely give the character's ASCII value, but subtracting '0' would convert it to the number it represents (assuming the character is a digit).
Oh, yeah?
Example program (not the program this is going to be used in):
Code:
int main()
{
    char digit = '3';
    int dgt = digit - 0;
    cout << dgt << endl << flush;
    cin.get();
    return 0;
}
Output:
Code:
51
What I want to do is get the integer 3. atoi does this only for strings, not for single characters. Hmm... Sean, would you mind explaining your first post a bit more?
try
digit - '0'
rather than
digit - 0
Thanks! It worked and now I'm happy! Thy program I shalt make shalt work and shalt work fluently. LOL, old english.
Originally Posted by elad:
> try digit - '0' rather than digit - 0

I know what you're getting at there; essentially, you're subtracting the value of the char '0'.
Yup, just demonstrating what jlou had indicated in an earlier post.
Ahh... subtracting the character '0'. I see, that makes sense. I thought you were referring to subtracting 0 from the char, since the typecast would happen automatically in the background and the resulting integer value would be placed in the destination variable.
> Ahh.... subtracting the character '0'.
Yeah, it looks like I confused several people with that point. I'll be more clear next time.
> Oh, yeah?
yeah!