I've been wanting to ask this for a while, but had trouble phrasing it as a non-math question. What follows admittedly isn't really a programming problem; it's about proving something very important about Turing machines and complexity.
Presumably a lot of you are in college, and at some point you have studied complexity and "Big O" notation. The following is pretty basic, but it is a fundamental truth that many proofs depend on.
In the following code, what is the average complexity of increment(char * binary_string) with respect to the string's length?
Code:
#include <stdio.h>
#include <string.h>
#include <assert.h>

/**
 * Advance binary_string to the 'next' value.
 *
 * binary_string  An arbitrary string of '0' and '1'.
 *                binary_string is N characters long.
 *                binary_string is not all '1's (no overflow).
 *
 * What is the expected number of comparisons of this function?
 */
void increment (char * binary_string)
{
    char * curr_char;

    // Scan until the first '0', flipping off all the '1' bits.
    for (curr_char = binary_string; *curr_char == '1'; ++curr_char)
        *curr_char = '0';

    // Flip the 0 bit and return.
    assert (*curr_char == '0');
    *curr_char = '1';
    return;
}

int main (void)
{
    char foo[5];

    for (strcpy (foo, "0000"); strcmp (foo, "1111"); increment (foo))
        printf ("%s\n", foo);

    printf ("%s\n", foo);
    return 0;
}
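For anyone who wants to try it out: note that the string is treated with the least-significant bit at index 0, so if you compile and run the program above as-is, the counting sequence looks like this (first few lines of output shown):

Code:
0000
1000
0100
1100
0010
...
1111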