# Thread: Decimal To Binary Algorithm

1. ## Decimal To Binary Algorithm

I'll just jump right into it. This is from the challenges page on the site (not the code, but the challenge itself: convert a decimal number to binary).

I was wondering if you could give me your thoughts on this solution. I know that using the STL stack isn't the best choice in this situation, but it helps to display the bits in the correct order. My problem is that the time complexity of this function is O(N), and I try to stay away from O(N) time whenever possible (although it's the norm for a lot of algorithms). Does anyone have a better solution for displaying the bits (not getting the bits, just displaying them) in less than O(N)?

Code:
```
#include <iostream>
#include <stack>

void g_DecToBinary(unsigned int dec_number) {
    std::stack<unsigned int> binary_number;
    unsigned int old_dec_number = dec_number,
                 bit;

    while(dec_number != 0) {
        bit = dec_number % 2;

        dec_number /= 2;

        if(bit)
            binary_number.push(1);
        else
            binary_number.push(0);
    }

    // Handle zero explicitly, since the loop above never runs for 0
    if(binary_number.empty())
        binary_number.push(0);

    std::cout << "The decimal number " << old_dec_number << " converted to binary is: ";

    while(!binary_number.empty()) {
        std::cout << binary_number.top();

        binary_number.pop();
    }

    std::cout << std::endl;
}
```

2. The easiest way to convert a number to binary is with the bitwise operators.

3. Specifically, >> (or >>= or <<) and & should work.

Search the board, this has come up many times before.

5. Interesting thread; I thought I'd post what came to mind here.

Regards,
Brian
Code:
```
#include <iostream>
#include <bitset>
#include <cstdlib>  // for std::atoi

using std::cin;
using std::cout;
using std::bitset;

int main(int argc, char *argv[])
{
    if(argc == 2)
    {
        int i = std::atoi(argv[1]);
        bitset<sizeof(i) * 8> b = i;
        cout << b << "\n";
    }
    else
        cout << "Usage: progName intValue\n";

    cout << "Press 'Enter' to continue . . . ";
    cin.sync();
    cin.ignore();
    return(0);
}
```