Thread: Computation theory question

  1. #1
    The Earth is not flat. Clyde's Avatar
    Join Date
    Mar 2002
    Posts
    1,403

    Computation theory question

    A universal Turing machine can simulate any Turing machine. Right, got that. But is it always possible to determine the exact nature of a given Turing machine?

    Suppose I hand you some kind of hardwired word-processor type thing, one of those odd electronic typewriters. Now pretend it wasn't programmed in any language you know (and the machine code is unknown too). All you have is the device, but you have available to you all the physical information - so you can watch the voltages changing across every chip in the circuit as you use the word processor.

    Purely from that physical data, is it feasible to work out how to simulate the machine on a computer? If it's not feasible, how simple do we need to make the machine before it becomes feasible, and what kind of scaling in terms of difficulty are we talking about?
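
    To make it concrete, the crudest thing I can imagine is just recording every (input, output) pair you observe and replaying them from a table. A throwaway sketch (the entries are made up, and it obviously only covers behaviour you've actually seen):

    Code:
    #include <iostream>
    #include <map>

    int main()
    {
        // Observations gathered by watching the real device: keystroke -> printed character.
        std::map<char, char> observed;
        observed['a'] = 'a';    // pressing 'a' printed 'a'
        observed['q'] = 'Q';    // suppose this device capitalises 'q' for some reason

        // "Simulating" the device is then just a table lookup.
        char key = 'q';
        if (observed.count(key))
            std::cout << "device prints: " << observed[key] << '\n';
        else
            std::cout << "never observed this input - behaviour unknown\n";
    }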
    Last edited by Clyde; 06-29-2006 at 02:54 AM.
    Entia non sunt multiplicanda praeter necessitatem

  2. #2
    Registered User VirtualAce's Avatar
    Join Date
    Aug 2001
    Posts
    9,607
    What?

  3. #3
    Devil's Advocate SlyMaelstrom's Avatar
    Join Date
    May 2004
    Location
    Out of scope
    Posts
    4,079
    Bubba: If you knock out the first paragraph and shuffle the words around in the other two, it would appear he's asking whether, by looking at the physical signals going through the processors, the memory, and the boards between them, you could decipher what the computer is actually computing.

    If that is the question, I would probably have to say that with the right technology, the answer is yes. I'm no engineer, but I'd imagine that if you could read the pulses properly, you could convert them to machine code and then to whatever higher-level language or pseudocode you wish. Of course, I'd say this is probably not a simple procedure.
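
    Roughly what I mean by "reading the pulses" is something like this toy sketch - sample a voltage on one wire and threshold it into bits. The samples and the 1.5 V threshold are invented, and real probing would be far messier:

    Code:
    #include <iostream>
    #include <vector>

    int main()
    {
        // Invented voltage samples from a single wire.
        std::vector<double> samples = {0.1, 0.2, 3.3, 3.2, 0.0, 3.1, 3.3, 0.2};
        const double threshold = 1.5;   // assumed logic threshold in volts

        for (double v : samples)
            std::cout << (v > threshold ? 1 : 0);
        std::cout << '\n';              // prints 00110110 - the raw bitstream

        // Turning a bitstream like this into machine code still requires knowing
        // the instruction encoding, which is the hard part being discussed here.
    }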

    Again, though, I'm no engineer nor am I a very good programmer. :/
    Sent from my iPad®

  4. #4
    Registered User axon's Avatar
    Join Date
    Feb 2003
    Posts
    2,572
    God, I hated that class.

    some entropy with that sink? entropysink.com

    there are two cardinal sins from which all others spring: Impatience and Laziness. - franz kafka

  5. #5
    The Earth is not flat. Clyde's Avatar
    Join Date
    Mar 2002
    Posts
    1,403
    Originally Posted by SlyMaelstrom:
    If you knock out the first paragraph and shuffle the words around in the other two, it would appear he's asking whether, by looking at the physical signals going through the processors, the memory, and the boards between them, you could decipher what the computer is actually computing.

    If that is the question, I would probably have to say that with the right technology, the answer is yes. I'm no engineer, but I'd imagine that if you could read the pulses properly, you could convert them to machine code and then to whatever higher-level language or pseudocode you wish. Of course, I'd say this is probably not a simple procedure.
    Yes, that was essentially my question. I would have thought it was possible in principle, but it's not obvious to me whether it's possible in practice, and if it is, what are the limits? In the original scenario there is just one hardwired device. If that's feasible, could the same approach be applied to software? If you didn't know the machine code of a PC and you could watch all the physical signals, is it feasible that someone could deduce the machine code, and then use that to work out what the PC is doing?

    Intuitively you expect it to be impossibly difficult. If you imagine writing the best possible algorithm that looks at the signals and the output and tries to work out what's going on, I'd have thought the scaling of that algorithm would be truly horrible - it might manage OK with a single chip fed by a single input stream and outputting some binary lamp-on/lamp-off signal, but as the system of interest got more complex, the time needed to compute would increase at some exponential rate - e.g. double the number of components and the time taken for the algorithm to work it all out goes up 64-fold.
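
    A back-of-the-envelope count of why I expect that blow-up (the numbers are purely illustrative): a stateless black box with n binary inputs has 2^n input patterns and could be implementing any of 2^(2^n) different behaviours.

    Code:
    #include <cmath>
    #include <iostream>

    int main()
    {
        for (int n = 1; n <= 5; ++n)
        {
            double patterns   = std::pow(2.0, n);         // possible input combinations
            double behaviours = std::pow(2.0, patterns);  // possible truth tables
            std::cout << n << " inputs: " << patterns << " patterns, "
                      << behaviours << " candidate behaviours\n";
        }
        // Five inputs already allows about 4.3 billion candidate truth tables,
        // and that is for a single stateless chip with five pins.
    }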

    The problem is, since I know next to nothing about the theory underneath computers, I don't know whether my guess is right or not (I hope it's not), and I want to find out!
    Entia non sunt multiplicanda praeter necessitatem

  6. #6
    Devil's Advocate SlyMaelstrom's Avatar
    Join Date
    May 2004
    Location
    Out of scope
    Posts
    4,079
    I actually overlooked the part where you said you didn't know the language the system is communicating in. If that's the case, the answer is likely no. Then again, now we're looking more along the lines of cryptography, which, again, I'm no specialist in.
    Sent from my iPad®

  7. #7
    Yes, my avatar is stolen anonytmouse's Avatar
    Join Date
    Dec 2002
    Posts
    2,544
    Originally Posted by Clyde:
    Suppose I hand you some kind of hardwired word-processor type thing, one of those odd electronic typewriters. Now pretend it wasn't programmed in any language you know (and the machine code is unknown too). All you have is the device, but you have available to you all the physical information - so you can watch the voltages changing across every chip in the circuit as you use the word processor.

    Purely from that physical data, is it feasible to work out how to simulate the machine on a computer? If it's not feasible, how simple do we need to make the machine before it becomes feasible, and what kind of scaling in terms of difficulty are we talking about?
    Unless you run every input combination (clearly not feasible for any program which accepts non-trivial input), you can't guarantee that you have seen every code path. Therefore, you can't guarantee a 100% complete simulation, although you may be able to get the common code paths, which may be good enough.
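
    A trivial illustration of that point (both devices and the magic key value are invented): two machines that agree on every input you happened to test can still differ on one you didn't, so finite testing can never certify a complete simulation.

    Code:
    #include <iostream>

    // Two hypothetical devices: B has a quirk on one input that A does not.
    int deviceA(int key) { return key + 1; }
    int deviceB(int key) { return (key == 5000000) ? 0 : key + 1; }

    int main()
    {
        // Suppose the tester only ever tried keys 0..999.
        bool identical_so_far = true;
        for (int key = 0; key < 1000; ++key)
            if (deviceA(key) != deviceB(key))
                identical_so_far = false;

        std::cout << (identical_so_far ? "indistinguishable on tested inputs\n"
                                       : "difference found\n");
        std::cout << "but on key 5000000: " << deviceA(5000000)
                  << " vs " << deviceB(5000000) << '\n';
    }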
    Originally Posted by Clyde:
    In the original scenario there is just one hardwired device. If that's feasible, could the same approach be applied to software? If you didn't know the machine code of a PC and you could watch all the physical signals, is it feasible that someone could deduce the machine code, and then use that to work out what the PC is doing?
    Yes. At the CPU level, input and output can be relatively simple. An instruction may consist of just a few words of input and output (operands, opcode, flags, etc.). Clearly, if you can monitor the input and output, it is probably feasible to reverse engineer the CPU instructions.
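
    As a sketch of what reverse engineering a single instruction might look like (the trace values and candidate list are invented), you log (operand, operand, result) triples for one unknown opcode and keep only the candidate operations consistent with every observation:

    Code:
    #include <iostream>
    #include <vector>

    struct Sample { unsigned a, b, result; };

    unsigned op_add(unsigned a, unsigned b) { return a + b; }
    unsigned op_and(unsigned a, unsigned b) { return a & b; }
    unsigned op_xor(unsigned a, unsigned b) { return a ^ b; }

    int main()
    {
        // Logged bus traffic for one unknown opcode: (operand A, operand B, result).
        std::vector<Sample> trace = { {3, 5, 8}, {10, 2, 12}, {7, 7, 14} };

        struct Candidate { const char* name; unsigned (*op)(unsigned, unsigned); };
        const Candidate candidates[] = { {"ADD", op_add}, {"AND", op_and}, {"XOR", op_xor} };

        for (const Candidate& c : candidates)
        {
            bool consistent = true;
            for (const Sample& s : trace)
                if (c.op(s.a, s.b) != s.result)
                    consistent = false;
            if (consistent)
                std::cout << "unknown opcode behaves like " << c.name << " on this trace\n";
        }
        // More samples shrink the candidate set, but you can only ever say
        // "consistent with what was observed", never "guaranteed identical".
    }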

  8. #8
    The Earth is not flat. Clyde's Avatar
    Join Date
    Mar 2002
    Posts
    1,403
    Originally Posted by anonytmouse:
    Unless you run every input combination (clearly not feasible for any program which accepts non-trivial input), you can't guarantee that you have seen every code path. Therefore, you can't guarantee a 100% complete simulation, although you may be able to get the common code paths, which may be good enough.
    Ah ok, that makes sense.

    Does it make a difference if you can isolate bits of the hardware:

    Forget that it's a word processor and look at each chip individually: test each chip in terms of inputs and outputs (which, from your comment on the CPU, would not seem to be unfeasible), work out what each section of the circuit does, and then put it all together.

    If one follows that method, can you reconstruct the logic of the underlying code in the machine?
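
    Something like this is what I have in mind (the probe function just stands in for a real part, and the chip is invented): drive every input pattern on one isolated, stateless chip, record the outputs, and the recorded table is your simulation of that chip.

    Code:
    #include <bitset>
    #include <iostream>
    #include <vector>

    // Stand-in for the physical chip we can stimulate and observe: a 2-bit adder
    // with 4 input pins and 3 output pins. In reality this would be a probe rig.
    unsigned probe_chip(unsigned inputs)
    {
        unsigned a = inputs & 0x3;
        unsigned b = (inputs >> 2) & 0x3;
        return (a + b) & 0x7;
    }

    int main()
    {
        const unsigned input_pins = 4;
        std::vector<unsigned> truth_table(1u << input_pins);

        // Exhaustively drive every input pattern and record the outputs.
        for (unsigned pattern = 0; pattern < truth_table.size(); ++pattern)
            truth_table[pattern] = probe_chip(pattern);

        // The table now reproduces the chip exactly - but only if it is stateless.
        const unsigned pattern = 6;             // binary 0110: a = 2, b = 1
        std::cout << "inputs 0110 -> outputs "
                  << std::bitset<3>(truth_table[pattern]) << '\n';   // prints 011
    }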
    Last edited by Clyde; 06-29-2006 at 02:09 PM.
    Entia non sunt multiplicanda praeter necessitatem

  9. #9
    I am me, who else?
    Join Date
    Oct 2002
    Posts
    250
    Unless you have specific and defined behaviours for each chip, I would think that is not a feasible way either. Sure, you can test inputs and outputs, but in many cases a chip could have multiple inputs and outputs, which leads to a huge number of tests. It's probably more feasible than testing every non-trivial input to the whole machine, but to me it seems like it would still be too hard to get at the base logic of the machine.

    This is only a thought, however; it's just a feeling that that particular method would not work too well.

  10. #10
    Registered User
    Join Date
    May 2006
    Location
    Troy
    Posts
    14
    It's impossible, even if you isolate bits of the hardware. You don't know whether a chip retains information from what was typed in 30 days ago, or 2,000,000 years ago, and there's no way you can guarantee that it doesn't. Perhaps one chip is wired so that on every Nth keypress, where N = 5 trillion, it shifts the letter value by 1. You could not prove that there is no value of N for which the chip does this.
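
    Here is that objection in miniature (N is tiny here so the demonstration actually finishes; the 5-trillion case obviously wouldn't): a "chip" with a hidden counter that behaves like a plain pass-through until the Nth keypress.

    Code:
    #include <iostream>

    // A "chip" with hidden internal state: it passes keys through unchanged until
    // its private counter hits N, then shifts the letter by one. Black-box tests
    // shorter than N presses cannot tell it apart from an honest pass-through.
    class SuspectChip {
    public:
        char key(char c)
        {
            ++presses;
            return (presses % 1000 == 0) ? c + 1 : c;   // N = 1000 here, not 5 trillion
        }
    private:
        unsigned long long presses = 0;                 // the hidden state
    };

    int main()
    {
        SuspectChip chip;
        for (int i = 1; i <= 999; ++i)
            chip.key('a');                              // presses 1 to 999 all look normal
        std::cout << "press 1000 returns: " << chip.key('a') << '\n';   // prints b
    }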

  11. #11
    The Earth is not flat. Clyde's Avatar
    Join Date
    Mar 2002
    Posts
    1,403
    It's impossible, even if you isolate bits of the hardware. You don't know whether a chip retains information from what was typed in 30 days ago, or 2,000,000 years ago, and there's no way you can guarantee that it doesn't. Perhaps one chip is wired so that on every Nth keypress, where N = 5 trillion, it shifts the letter value by 1. You could not prove that there is no value of N for which the chip does this.
    But if I can look inside the chip (say with an electron microscope), and I can see the physical structure of the gates (there are gates in chips, right?), then can I work out what any chip is doing in a reasonable amount of time? Or is it still impossible?
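
    I'm imagining something like this: if you could literally read the gates off the die, simulating them is just a matter of evaluating them in order. A toy half-adder, purely as an illustration:

    Code:
    #include <iostream>

    int main()
    {
        bool a = 1, b = 1;            // the two recovered input wires

        // Gates read off the die, evaluated in topological order.
        bool sum   = a ^ b;           // XOR gate
        bool carry = a & b;           // AND gate

        std::cout << "sum=" << sum << " carry=" << carry << '\n';   // sum=0 carry=1

        // Scaling this up (millions of gates, state hidden in RAM and flash) is
        // where the real difficulty would lie, not the principle itself.
    }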
    Entia non sunt multiplicanda praeter necessitatem

  12. #12
    (?<!re)tired Mario F.'s Avatar
    Join Date
    May 2006
    Location
    Ireland
    Posts
    8,446
    If the technology is right I think it's common sense to believe that it's always possible to determine the whole from the parts. No matter how nuclear the parts may be.
    Originally Posted by brewbuck:
    Reimplementing a large system in another language to get a 25% performance boost is nonsense. It would be cheaper to just get a computer which is 25% faster.

  13. #13
    The Earth is not flat. Clyde's Avatar
    Join Date
    Mar 2002
    Posts
    1,403
    Originally Posted by Mario F.:
    If the technology is right I think it's common sense to believe that it's always possible to determine the whole from the parts. No matter how nuclear the parts may be.
    So armed with a good enough microscope and a complete knowledge of computational theory and the relevant physics, but without knowing the syntax of any language, you could reconstruct the logic of, say, Windows XP if I gave you a PC with it installed (assuming your microscope is able to measure without destroying anything), within, say, 100 years. Feasible, or stupendously ridiculous?
    Entia non sunt multiplicanda praeter necessitatem

  14. #14
    Devil's Advocate SlyMaelstrom's Avatar
    Join Date
    May 2004
    Location
    Out of scope
    Posts
    4,079
    Why do I feel like for the past week and a half Clyde has been sitting in front of a homework paper with a check box for "Yes" and one for "No" waiting for a straight answer?
    Sent from my iPad®

  15. #15
    (?<!re)tired Mario F.'s Avatar
    Join Date
    May 2006
    Location
    Ireland
    Posts
    8,446
    Clyde, Turing machines are a theoretical construct largely built on imagined scenarios. They're probably the closest thing we have to "computational psychology" (if such a thing existed).

    So the answer is... anything is possible.
    And mainly this is so because... Turing machines don't exist.

    In an imagined scenario where Turing machines existed, of course the answer would be a sound 'yes'. What else?
    Originally Posted by brewbuck:
    Reimplementing a large system in another language to get a 25% performance boost is nonsense. It would be cheaper to just get a computer which is 25% faster.
