Decimal To Binary Algorithm
I'll jump right into it. This is from the challenges page on the site (not my code, but the challenge itself: Convert decimal number to binary).
I was wondering if you could give me your thoughts on this solution. I know that using the STL stack isn't the best thing in this situation, but it helps to display the bits in the correct order. My problem is that the time complexity of this function is O(N) (where N is the number of bits), and I try to stay away from O(N) time whenever possible (although it's the norm for a lot of algorithms). I was wondering if anyone has a better solution for displaying the bits (not getting the bits, just displaying them) in less than O(N)?
Code:
#include <iostream>
#include <stack>

void g_DecToBinary(unsigned int dec_number) {
    std::stack<unsigned int> binary_number;
    unsigned int old_dec_number = dec_number;
    while (dec_number != 0) {
        binary_number.push(dec_number % 2);  // remainder is the next bit
        dec_number /= 2;
    }
    std::cout << "The decimal number " << old_dec_number
              << " converted to binary is: ";
    if (binary_number.empty())  // input was 0, so the loop pushed nothing
        std::cout << 0;
    while (!binary_number.empty()) {
        std::cout << binary_number.top();
        binary_number.pop();
    }
    std::cout << std::endl;
}