Thread: How does a computer do base conversion (binary <-> decimal)?

  1. #1
    Registered User
    Join Date
    Sep 2009
    Posts
    14

    Question How does a computer do base conversion (binary <-> decimal)?

    So, I know how to convert between bases, but I was wondering how the computer implements base conversion for I/O (I assume this is done by the OS?).

    I can see that for relatively small numbers the conversion is fairly simple, but when I give the computer 18446744073709551615 as numeric input, it knows to store it as 1111111111111111111111111111111111111111111111111111111111111111 (64 one bits), and then to display the decimal again when asked to interpret those bits as an integer. The only algorithms I know would get really slow with such large numbers, so how does the computer do this?
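    For a value that still fits in 64 bits, the only method I know is repeated division by 10, collecting the remainders as digits. Here's a minimal sketch of what I mean (my own illustration, not how any particular compiler or libc actually does it; u64_to_decimal is just a name I made up):

    Code:
    #include <stdio.h>

    /* Naive 64-bit value -> decimal string by repeated division by 10. */
    static void u64_to_decimal(unsigned long long n, char *out)
    {
        char tmp[21];               /* 2^64 - 1 has 20 decimal digits */
        int  i = 0;

        do {
            tmp[i++] = '0' + (char)(n % 10);   /* least significant digit first */
            n /= 10;
        } while (n != 0);

        while (i > 0)               /* reverse into the caller's buffer */
            *out++ = tmp[--i];
        *out = '\0';
    }

    int main(void)
    {
        char buf[21];
        u64_to_decimal(18446744073709551615ULL, buf);
        printf("%s\n", buf);        /* prints 18446744073709551615 */
        return 0;
    }

    That's obviously cheap for a single machine word; my worry is what happens once the number no longer fits in a word.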

    I started to think about this because I was wondering how to implement a program that would read a binary file and print it as a huge decimal number (just for giggles, I guess).
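    To show what I mean by "really slow": the obvious approach I can come up with is to slurp the file's bytes as one big-endian integer and then divide the whole array by 10 over and over, which is quadratic in the file size. Rough sketch below (the divmod10 and is_zero helpers are my own made-up names, and this is just my naive idea, not what a real bignum library necessarily does):

    Code:
    #include <stdio.h>
    #include <stdlib.h>

    /* Divide a big-endian byte array in place by 10; return the remainder. */
    static int divmod10(unsigned char *num, size_t len)
    {
        int rem = 0;
        for (size_t i = 0; i < len; i++) {
            int cur = rem * 256 + num[i];
            num[i] = (unsigned char)(cur / 10);
            rem = cur % 10;
        }
        return rem;
    }

    static int is_zero(const unsigned char *num, size_t len)
    {
        for (size_t i = 0; i < len; i++)
            if (num[i]) return 0;
        return 1;
    }

    int main(int argc, char **argv)
    {
        if (argc != 2) { fprintf(stderr, "usage: %s file\n", argv[0]); return 1; }

        FILE *f = fopen(argv[1], "rb");
        if (!f) { perror("fopen"); return 1; }

        /* Read the whole file; treat its bytes as one big-endian integer. */
        fseek(f, 0, SEEK_END);
        long size = ftell(f);
        rewind(f);
        unsigned char *num = malloc((size_t)size);
        size_t got = fread(num, 1, (size_t)size, f);
        fclose(f);
        if (got != (size_t)size) { fprintf(stderr, "short read\n"); return 1; }

        /* Collect decimal digits least-significant first, then print reversed. */
        char *digits = malloc((size_t)size * 3 + 2);   /* generous upper bound */
        size_t ndig = 0;
        do {
            digits[ndig++] = (char)('0' + divmod10(num, (size_t)size));
        } while (!is_zero(num, (size_t)size));

        while (ndig > 0)
            putchar(digits[--ndig]);
        putchar('\n');

        free(num);
        free(digits);
        return 0;
    }

    Each call to divmod10 walks the whole number, and you need roughly one call per output digit, so this blows up for big files. Is there a smarter way, or is this basically what happens under the hood?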

    Thanks for any insight.
    Last edited by Muster Mark; 01-22-2011 at 04:14 PM. Reason: typos

