-
You can use the bitset member to_ulong to make the conversion.
Code:
#include <iostream>
#include <string>
#include <vector>
#include <bitset>
using namespace std;
int main()
{
    string pizza("CHEESE");
    vector< bitset<8> > b;

    // each character becomes one 8-bit bitset
    for (size_t i = 0; i < pizza.length(); i++)
        b.push_back(bitset<8>(pizza[i]));

    for (size_t i = 0; i < b.size(); i++)
        cout << b[i] << endl;

    // to_ulong() gives the numeric value back; cast it to char
    string pizza_name;
    for (size_t i = 0; i < b.size(); i++)
    {
        char c = (char) b[i].to_ulong();
        pizza_name = pizza_name + c;
    }
    cout << pizza_name << endl;
}
-
My knowledge of C++ only goes back a single day (that's how long I've been doing C++, ouch), so don't blame me if I'm wrong, but can you typecast, or use a pointer?
-
Well, I said in about three other posts that I can't typecast; it gives me an error. And now that I have a string...... Game Maker gives me a stupid error. One problem after the other.
Also, if I just return it as (float) b.to_ulong, it gives me the ASCII code for the letter. Why is it giving me the ASCII code and not binary?
-
And I think an LPSTR is basically a char array. Once you have a string, you can copy it into a char array:
Code:
char carray[100];                     // must be large enough for the string plus '\0'
strcpy( carray, pizza_name.c_str() ); // c_str() gives a null-terminated char array
Or if your LPSTR already has memory allocated for it, then use strcpy() to copy to the LPSTR.
EDIT: added c_str()
-
>(float) b.to_ulong
You don't want float, you want char:
(char) b.to_ulong()
That will give you one char, which you could store straight to a char array, if you wish.
-
Alright, I think all the problems on the C++ side are fixed, so the rest is up to me now.
Thanks a lot!
-
Code:
#include <bitset>
#include <iostream>
#include <string>
#include <sstream>
using namespace std;
int main()
{
    string binary;
    stringstream ss;
    bitset<8> b('a'); // 'a' is 97, so the bits are 01100001
    ss << b;          // streaming a bitset writes its binary representation
    binary = ss.str();
    cout << binary << endl; // prints 01100001
    cin.get();
    return 0;
}
Is this the kind of string you wanted, the string of 1s and 0s?