# Thread: Trouble with program converting decimal to binary

1. ## Trouble with program converting decimal to binary

Hi everyone!

I've just started programming, and I've been trying to write a program that converts a decimal number input by the user to a binary number.

2. Ok, stop and think ... how are numbers stored in the computer? Yep, in binary, as groups of high and low voltages that we interpret as 1s and 0s...
So an integer is already just a group of 1s and 0s underneath all that nifty decimal notation we use to display it...

So let's think about the process for a minute...
Is there some way to know how many bits are in an int?
Is there some way to know if the least significant bit of a binary number is a 1 or a 0?
Is there some way to shift each bit over one place so that each bit in turn appears as the least significant bit?

(Hint: It's got nothing to do with exponents or powers, and it can be done without bitwise operators...)

Honest, this is beyond simple...
I have a working version here... once we get yours working, I'll show you mine...

3. Originally Posted by simplyxsweet
Below is the code and the errors I've been getting. Thank you!