Thread: How long can a 64 bit unsigned integer take to increment from 0 to 2.4 quintillion?

  1. #1
    Registered User
    Join Date
    Nov 2012
    Location
    Some rock floating in space...
    Posts
    32

    Question How long can a 64 bit unsigned integer take to increment from 0 to 2.4 quintillion?

    There's a theorem called the "Infinite Monkey Theorem" which states that a monkey hitting typewriter keys at random for long enough (approaching infinity) will eventually have re-created the complete works of Shakespeare.
    Infinite monkey theorem - Wikipedia, the free encyclopedia

    So I had an idea for my own variation on this sort of theorem, which goes like this.

    Any file or data which is, or can be, expressed as a series of 8-bit bytes can be treated as a little-endian integer of arbitrary length (first byte least significant, last byte most significant). Using extended-precision arithmetic, if a second arbitrary-precision value is initialized to zero and incremented one by one, growing a byte at a time in precision as needed, it can and will eventually match the first value (the data).

    In other words, a file can be thought of as a little-endian integer whose length is the file size in bytes. The first byte of the file is the least significant and the last byte is the most significant. As such, it represents one specific value.

    So if an arbitrary-precision integer is initialized to zero and incremented one by one in an iterative loop, then given long enough (given infinity), it will eventually have reproduced every bit of data on every computer in the Universe. Not only that, but it will have reproduced the Human Genome, for example, given enough time.
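
    To make the idea concrete, here is a rough sketch of the arbitrary-precision version (just an illustration of the idea; the actual test program I ran is further down and uses fixed-size integers). The target is any byte buffer, and the work value is a little-endian counter that is incremented with carry and grows by one byte whenever it overflows.
    Code:
    /* A rough sketch of the idea: the target is any byte buffer, and the
     * "work" value is a little-endian arbitrary-precision counter stored in
     * its own byte buffer. increment() adds 1 with carry and grows the
     * counter by one byte when it overflows. (Assumes the target's last,
     * most significant, byte is nonzero.) */
    #include <stdio.h>
    #include <string.h>
    
    #define MAX_LEN 16
    
    /* Add 1 to a little-endian byte counter; return its (possibly grown) length. */
    static size_t increment(unsigned char *num, size_t len)
    {
        for (size_t i = 0; i < len; ++i) {
            if (++num[i] != 0)        /* no carry out of this byte: done */
                return len;
        }
        num[len] = 1;                 /* carry rippled off the end: grow */
        return len + 1;
    }
    
    int main(void)
    {
        const unsigned char target[] = { 'H', 'I', '!' };   /* a tiny 3-byte "file" */
        size_t target_len = sizeof target;
    
        unsigned char work[MAX_LEN] = { 0 };
        size_t work_len = 1;
    
        /* Count upward until the counter's bytes match the target's bytes. */
        while (work_len != target_len || memcmp(work, target, target_len) != 0)
            work_len = increment(work, work_len);
    
        printf("Reproduced: %.*s\n", (int)target_len, (const char *)work);
        return 0;
    }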

    So I wrote the simplest of programs to test my little thought experiment. I created two strings: one 4 characters in length, "APES", and a second 8 characters in length, "MONKEYS!".

    The idea is that I copy each string into a 4-byte unsigned 32-bit integer and an 8-byte unsigned 64-bit integer respectively, in which case they equal the following values:

    "APES" equals DEC: 1,397,051,457 HEX: 0x53455041

    "MONKEYS!" equals DEC: 2,401,361,180,469,645,133 HEX: 0x215359454b4e4f4d

    Ok. In the first case, "APES", I created a second unsigned 32-bit integer which is initialized to zero and incremented by one in a loop until it eventually reproduces the 4 characters of the string "APES". This worked perfectly and reproduced the string "APES" in 3 seconds (I timed it).

    Ok. The second test is coded exactly the same way, except that I substituted an 8-character string for the 4-character string and used 64-bit unsigned integers to store the target value and the work value, which is incremented one by one until it matches the value of the string "MONKEYS!" when that string is treated as an integer.

    Ok. Here's the problem: the first part of the test finished in almost exactly 3 seconds on my computer with perfect results. However, the second, 64-bit portion of the code ran for over 4 and a half hours without results.

    So here are some of my calculations based on the results of the first 32-bit test:

    "APES" : 1,397,051,457 was reached from 0 incrementing to 1.3 billion in 3 seconds
    1,397,051,457 / 3 == 465,683,819 increments per second

    Ok, with that in mind, here are my projections for the amount of time the second, 64-bit portion of my test code will take to complete. Like I said, I ran the program for over 4 hours without results...

    "MONKEYS!" : 2,401,361,180,469,645,133 / 465,683,819 == 5,156,634,356 seconds to complete

    5,156,634,356 / 3600 == 1,432,398 hours to complete

    1,432,398 / 24 == 59,683 days to complete

    59,683 / 365 == 163 years to complete
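
    For what it's worth, the projection itself can be reproduced with a few lines of C (a sketch that just re-does the arithmetic above using my measured rate, so the numbers are only estimates):
    Code:
    /* Sketch: project the 64-bit search time from the measured 32-bit rate. */
    #include <stdio.h>
    
    int main(void)
    {
        double apes_value    = 1397051457.0;            /* "APES" as an integer     */
        double apes_seconds  = 3.0;                     /* measured wall time       */
        double monkeys_value = 2401361180469645133.0;   /* "MONKEYS!" as an integer */
    
        double rate    = apes_value / apes_seconds;     /* increments per second    */
        double seconds = monkeys_value / rate;
    
        printf("rate    : %.0f increments per second\n", rate);
        printf("seconds : %.0f\n", seconds);
        printf("hours   : %.0f\n", seconds / 3600);
        printf("days    : %.0f\n", seconds / 86400);
        printf("years   : %.1f\n", seconds / 86400 / 365);
        return 0;
    }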


    Are my calculations correct? Will my test code below really take 163 years to complete?

    Any input will be appreciated!!!!

    Code:
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include <locale.h>
    #include <time.h>
    
    /* NOTE: on LP64 systems (e.g. 64-bit Linux) unsigned long is 64 bits,
     * not 32; uint32_t/uint64_t from <stdint.h> would pin the widths down.
     * The typedefs are kept as-is so the printf formats below still match. */
    typedef unsigned long DWORD;
    typedef unsigned long long QWORD;
    
    char apestr[5] = "APES";
    char monstr[9] = "MONKEYS!";
    
    DWORD apes;        /* target: "APES" viewed as a little-endian integer     */
    QWORD monkeys;     /* target: "MONKEYS!" viewed as a little-endian integer */
    
    DWORD work_dw;     /* counter incremented until it equals apes    */
    QWORD work_qw;     /* counter incremented until it equals monkeys */
    
    int main(void)
    {
        clock_t start, stop;
        double duration;
    
        setlocale(LC_NUMERIC, "");          /* enable the %' thousands grouping  */
        memcpy(&apes, apestr, 4);           /* reinterpret the bytes as integers */
        memcpy(&monkeys, monstr, 8);
    
        printf("APES : DEC: %'lu HEX: %#lx\n", apes, apes);
        printf("MONKEYS! : DEC: %'llu HEX: %#llx\n", monkeys, monkeys);
    
        printf("First, let's find APES\n");
        work_dw = 0;
        start = clock();                    /* clock() measures CPU time, which is
                                               fine here: the loop is pure computation */
        while (work_dw != apes) { ++work_dw; }
        stop = clock();
        /* cast before dividing, otherwise this is integer division and the
         * fractional part of the elapsed time is thrown away */
        duration = (double)(stop - start) / CLOCKS_PER_SEC;
        printf("APES found!!!!!!!!!!\n");
        printf("Found in %f seconds\n", duration);
    
        printf("Now let's find MONKEYS!\n");
        work_qw = 0;
        start = clock();
        while (work_qw != monkeys) { ++work_qw; }
        stop = clock();
        duration = (double)(stop - start) / CLOCKS_PER_SEC;
        printf("MONKEYS! found!!!!!!!!!!!!\n");
        /* report one quantity in two units rather than mixing totals */
        printf("Found in %f seconds (%f hours)\n", duration, duration / 3600);
    
        return 0;
    }
    And please note that I do know I can make this program run in considerably less time by using optimization flags on the compiler. However, the optimization flags produce code which doesn't actually perform the number of increments necessary: the compiler optimizes the loop away and simply assigns the target value instead of generating the loop, which is sort of like cheating.

    For example, if I compiled the code like this: gcc poc.c -o poc -O2

    It produces code which completes in a fraction of a second with perfect results, save for the fact that it hasn't actually done the number of increments necessary to prove my silly theory.

    With flags for speed optimization, i.e. -O2, it simply looks at the loops in my code, simple as they are, decides they are less than optimal, and replaces them with simple assignments of values rather than performing the iterative increments in my code...

    e.g.
    Code:
    while(work_dw != apes) { ++work_dw; }
    becomes
    Code:
    work_dw = apes;
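    In case it helps anyone trying the same experiment, one way to keep the loop while still compiling with -O2 (just a sketch of the volatile approach, not something I have benchmarked) is to declare the work counter volatile, which forces the compiler to perform every increment:
    Code:
    /* Sketch: marking the counter volatile forces the compiler to perform
     * every load and store, so even -O2 cannot replace the loop with a
     * single assignment. */
    #include <stdio.h>
    #include <string.h>
    
    int main(void)
    {
        char apestr[5] = "APES";
        unsigned int apes;                 /* 32 bits on common platforms       */
        volatile unsigned int work = 0;    /* volatile: every increment is real */
    
        memcpy(&apes, apestr, 4);
    
        while (work != apes)
            ++work;
    
        printf("APES found: %u\n", work);
        return 0;
    }
    The volatile accesses make the loop slower than the plain unoptimized build, so it only proves the increments actually happen; it doesn't speed anything up.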
    Any thoughts on how long this program will take to finish without optimizations?

    Thanks in advance

    Twiki

  2. #2
    Registered User
    Join Date
    Sep 2012
    Posts
    357
    Yes. Your calculations are correct.
    It will take 160 years (give or take a couple years) to complete the 2nd program.

    Edit
    Notice that according to Moore's Law, if you wait 2 years to start your program, it will need 80+2 years instead
    Last edited by qny; 11-14-2012 at 03:48 PM.

  3. #3
    Registered User
    Join Date
    Nov 2012
    Location
    Some rock floating in space...
    Posts
    32
    Ok. Thank you! I thought my calculations were off. I fully expected that it could take hours to finish, but I never thought for a second that it would take 163 years to complete... Oh well... Thanks again!

  4. #4
    Registered User
    Join Date
    Sep 2006
    Posts
    8,868
    Quote Originally Posted by twiki View Post
    Ok. Thank you! I thought my calculations were off. I fully expected that it could take hours to finish, but I never thought for a second that it would take 163 years to complete... Oh well... Thanks again!
    That's why breaking up the job into work units that can be done in parallel is so important. It's still a huge job, but entirely manageable.

    Folding@Home, for instance, is one of the most powerful computer "centers" in the world. I'm not sure it could now, but before the recession, F@Home could stomp any supercomputer in any well-parallelized job like this one. That's the key: turning these kinds of jobs into parallel jobs and using a lot of cores!
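
    Just to sketch what I mean (my own rough illustration, assuming POSIX threads; it is not code from this thread), here is one way the 0-to-MONKEYS! range could be carved into independent slices:
    Code:
    /* Sketch of splitting the brute-force count into independent work units:
     * each thread scans its own slice of the range [0, target] and reports
     * whether the target fell inside it. Assumes POSIX threads; compile
     * with -pthread. */
    #include <pthread.h>
    #include <stdio.h>
    #include <string.h>
    
    #define NUM_THREADS 4
    
    typedef unsigned long long QWORD;
    
    static QWORD target;                /* "MONKEYS!" viewed as an integer */
    
    struct slice {
        QWORD begin;                    /* first value this thread checks  */
        QWORD end;                      /* one past the last value checked */
        int   found;                    /* set if target was in this slice */
    };
    
    static void *scan(void *arg)
    {
        struct slice *s = arg;
        for (QWORD v = s->begin; v != s->end; ++v) {
            if (v == target) {          /* same increment-and-compare idea */
                s->found = 1;
                break;
            }
        }
        return NULL;
    }
    
    int main(void)
    {
        char monstr[9] = "MONKEYS!";
        pthread_t tid[NUM_THREADS];
        struct slice work[NUM_THREADS];
    
        memcpy(&target, monstr, 8);
    
        /* Carve [0, target] into NUM_THREADS roughly equal slices. */
        QWORD chunk = target / NUM_THREADS + 1;
        for (int i = 0; i < NUM_THREADS; ++i) {
            work[i].begin = chunk * (QWORD)i;
            work[i].end   = (i == NUM_THREADS - 1) ? target + 1 : chunk * (QWORD)(i + 1);
            work[i].found = 0;
            pthread_create(&tid[i], NULL, scan, &work[i]);
        }
    
        for (int i = 0; i < NUM_THREADS; ++i) {
            pthread_join(tid[i], NULL);
            if (work[i].found)
                printf("Thread %d's slice contained MONKEYS!\n", i);
        }
        return 0;
    }
    Each slice is completely independent of the others, which is exactly what makes this kind of brute-force counting embarrassingly parallel.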

    Brrrrrrrtttttt!

  5. #5
    Registered User
    Join Date
    May 2009
    Posts
    4,183
    Quote Originally Posted by qny View Post
    Yes. Your calculations are correct.
    It will take 160 years (give or take a couple years) to complete the 2nd program.

    Edit
    Notice that according to Moore's Law, if you wait 2 years to start your program, it will need 80+2 years instead
    80+2
    40+2+2
    20+2+2+2
    10+2+2+2+2
    5+2+2+2+2+2
    2.5+2+2+2+2+2+2

    So, if the computer speed doubles every two years (one way of viewing Moore's Law), then you should wait 12 years and the program should only take 2.5 years to run.
    This is if my math is right; my thinking has NOT been very good lately.
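
    A quick back-of-the-envelope check of that series (just a sketch that assumes a clean doubling every two years and a 160-year job on today's machine; compile with -lm):
    Code:
    /* Sketch: total time (waiting + running) if machine speed doubles every
     * two years and the job takes 160 years on today's hardware. */
    #include <stdio.h>
    #include <math.h>
    
    int main(void)
    {
        double job = 160.0;     /* years on today's machine */
    
        for (int wait = 0; wait <= 20; wait += 2) {
            double run = job / pow(2.0, wait / 2.0);
            printf("wait %2d years, run %6.2f years, total %6.2f years\n",
                   wait, run, wait + run);
        }
        return 0;
    }
    Waiting 12 years gives 160 / 64 = 2.5 years of run time, 14.5 years total, which matches the last line of the series above.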

    Tim S.
    "...a computer is a stupid machine with the ability to do incredibly smart things, while computer programmers are smart people with the ability to do incredibly stupid things. They are,in short, a perfect match.." Bill Bryson
