# Thread: binary to decimal

1. ## binary to decimal

ok I have an array of 8 characters, each either '0' or '1'. for example, if you print all its contents you get 01110100. the characters are assigned randomly though, so the next time i run the program the array may contain 11010110

even though they are just characters, they are supposed to represent binary digits. how can I convert the binary representation into a decimal number?

so for example say the array contains 10000011, how can I convert that to 131?

i know how to convert it on paper by the way.

2. Do you mean you have something like char binary[8]? Where each char is either '1' or '0'? Then take what you have on paper and type it into your program, remembering that the first digit is binary[0] and the last is binary[7].

3. (binary[0] - '0') x 2 ^ 0 + (binary[1] - '0') x 2 ^ 1 and so on through all your digits (subtract '0' to turn each char into an actual number first). This works for any base if you exchange the 2 for the base...for converting to base 10 that is.
example:
1010101
(1 * 2 ^ 0) + (0 * 2 ^ 1) + (1 * 2 ^ 2) + (0 * 2 ^ 3) + (1 * 2 ^ 4) + (0 * 2 ^ 5) + (1 * 2 ^ 6)
= 85

4. word i got it working

edit: thanks valaris, i did it a different way but thanks for your time.
