Interesting Question



Swordsman
06-28-2007, 11:21 PM
Hi all,

I've got a question which I didn't know how to answer, and I didn't know where else to post it.

I program pretty much every day and recently, my 5-year-old nephew has been sitting with me and asking questions (pretty simple ones really, but it's great that he's interested at such a young age!).

Yesterday, as I was writing a very simple addition program for him to think about, he asked a question which I couldn't answer, so I thought I'd ask the guys here if they can think of an answer that I could give him.

I said that the computer takes the two variables and adds them together, so he asked "So, how do you know that the computer is right then?"

Can anyone think of a way in which I could explain it to him without going into too much detail, or giving the answer "it just is", please?

Thanks for the help,

SM

MacGyver
06-29-2007, 12:15 AM
The reliability of the computer has been tested by many other people, and it has been made in such a way as to (almost) always be right. I would stress the fact that the computer is made as a machine that listens to instructions, and has been made to specification so that it always works when built properly, just like any other machine he uses. Once the people who make computers are sure they're right, then they sell them.

Naturally, I'd simplify it a little bit more, but the concept of the computer being built to spec as a machine is important since he's smart to ask the question about reliability. If the computer isn't consistent, there is no reason to use it as we do.

Salem
06-29-2007, 12:33 AM
> so he asked "So, how do you know that the computer is right then?"
Perhaps by showing that the computer isn't right all the time, and the programmer has a big part to play in getting the "right" answer.



#include <stdio.h>

int main ( void ) {
    unsigned char u1, u2, u3;
    signed char s1, s2, s3;

    u1 = u2 = 100;
    u3 = u1 + u2;    /* 200 still fits in an unsigned char */

    s1 = s2 = 100;
    s3 = s1 + s2;    /* 200 does not fit in a signed char */

    printf("%d\n", u3 );
    printf("%d\n", s3 );
    return 0;
}

$ gcc foo.c
$ ./a.exe
200
-56

Rashakil Fol
06-29-2007, 01:48 AM
Tell him there are thousands of miniature Indians inside the computer, and that when the computer needs to add numbers together, the Indians add them up on their fingers. They can count really, really fast, and you know they're never wrong, because they're Indians.

At least this is what my dad told me...

VirtualAce
06-29-2007, 02:21 AM
There is not a simple way to explain why the computer is right. The reason the computer can calculate values at all is actually quite complex and involves voltages and circuits.

http://www.flexbeta.net/main/articles.php?action=show&showarticle=15


So any voltage over a set threshold in a computer system results in a 1, and any voltage below it results in a 0. Transistors are the main components that represent the ones and zeros, since they are either on or off. Transistors come in many flavors, but that is not critical to the discussion here. We know an unsigned char as 8 bits, which make up what we call a byte, so 1 byte really resolves to 8 separate voltages being interpreted as high or low to create 1s and 0s. These 1s and 0s, or highs and lows, can have other operations performed on them as well. Once this has been resolved, the computer interprets the final value into a character we can see on screen. Even numbers are characters as far as this discussion goes, but they are merely there so the end user can understand what is going on; they are not vital to the operation of the underlying circuits. If the computer did not interpret the final values into characters we can see and understand, the calculations and operations would still work, but we would have no way of knowing they did without some type of measuring device to determine the output.
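
To make the "8 separate voltages" idea concrete, a toy C sketch along these lines prints the eight 1s and 0s stored in one byte (just an illustration; it is not how the hardware itself reports them):

#include <stdio.h>

/* Print the 8 bits (highs and lows) that make up one byte. */
static void print_bits(unsigned char byte)
{
    for (int bit = 7; bit >= 0; bit--)
        putchar(((byte >> bit) & 1) ? '1' : '0');
    putchar('\n');
}

int main(void)
{
    print_bits(100);    /* 01100100 -- the value used in Salem's example */
    print_bits(200);    /* 11001000 -- 100 + 100 still fits in one byte  */
    return 0;
}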

Now how does the computer know how we want to interpret the data? How does it know that we want the value 65 instead of a capital A? That is accomplished through data types. This is why when you open an EXE file inside Notepad you get nothing but gibberish. Notepad is expecting to interpret the values into characters, but what it is reading are actually hexadecimal opcode values that resolve into machine language instructions. Since Notepad is not a disassembler, it just uses each hexadecimal value as a lookup index into the Windows character table. A disassembler would know how to interpret the opcodes into the English-like mnemonics like mov, rep, add, etc. A disassembler would also know, based on the platform and CPU in the system, how to correctly interpret which values are literals and which ones are opcodes. The disassembler programmer gets this information right from the Intel or AMD tech refs, which explain in great detail how the instruction opcodes work, what to expect when and where, and how to correctly display the result on screen.
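
The 65-versus-capital-A point is easy to demonstrate: the byte in memory holds the same bit pattern either way, and only the format specifier you hand to printf() decides how it is shown. A rough sketch:

#include <stdio.h>

int main(void)
{
    unsigned char byte = 65;    /* one byte, one bit pattern: 01000001 */

    printf("as a number:    %d\n", byte);    /* prints 65 */
    printf("as a character: %c\n", byte);    /* prints A  */
    printf("in hexadecimal: %x\n", byte);    /* prints 41 */
    return 0;
}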

You have probably experienced this when your printf went nutso and started printing randomly from memory due to a wild pointer or a lack of a null terminator. The only way printf() knows how to interpret the data is by the tokens and flags you use in the printf() format string. If your data type does not match the format you specified, you get gibberish because it is not translated correctly for the given data type.
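
A deliberately broken sketch of the missing-terminator case: the second printf() has no reliable stopping point, so what it prints depends on whatever happens to sit next to the array in memory.

#include <stdio.h>

int main(void)
{
    char good[] = "abc";             /* 4 bytes: 'a' 'b' 'c' '\0'            */
    char bad[3] = { 'a', 'b', 'c' }; /* 3 bytes, deliberately NOT terminated */

    printf("%s\n", good);    /* prints "abc" and stops at the terminator */
    printf("%s\n", bad);     /* undefined: keeps reading adjacent memory */
                             /* until it happens to find a zero byte     */
    return 0;
}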

But in the end it comes down to voltages, highs and lows, on transistors in the CPU. The actual instruction implementation of add, subtract, multiply, divide, etc. in the CPU is quite complex and gets heavily into electrical engineering, which is not my forte.

Happy_Reaper
06-29-2007, 07:32 AM
I'd go with Salem's answer.

CornedBee
06-29-2007, 08:03 AM
I'd go with MacGyver's answer. The computer is a machine. It does the things it does in the same way every time. People have tried out the computer and have checked that it is right when adding numbers, so you know it will always be right when adding.

Salem's answer involves integer ranges, and I think that's too complex. Besides, it doesn't answer the question.

manofsteel972
06-29-2007, 09:01 AM
Computers do break down. Usually when they break, nothing works. Anything will eventually wear out or break. I always thought that the computer was built to catch any errors that might happen for some odd reason or another. When we write code we often handle such exceptions. It is just that with programs, we can usually handle them gracefully and the computer doesn't start to smoke. With hardware, your computer just beeps, unless it is a fatal exception, and then your computer will just sit there with a blank screen until you buy a new one or fix it. Usually protective circuits are built into chips.

If you google for "when computer chips fail", you will get a lot of interesting articles about what happens when they do.
http://www.thefreelibrary.com/Sick+chips%3B+as+computer+chips+get+smaller,+they+become+increasingly...-a04030992

If you go further, you might get stuff about the old TV show. Whatever happened to Poncho?

I would just tell him that the computer is right as long as he can verify that the answer the computer provided is correct. He will be a math whiz trying to prove the computer wrong.

[edit]
Actually, that might not be the best idea, since we already have built-in rounding errors due to the limitations of the data representation.

@nthony
06-30-2007, 03:23 AM
They can count really, really fast, and you know they're never wrong, because they're Indians.

Well... that depends, sometimes the Big Indians have fights with the smaller ones and then all hell breaks loose.

Salem
06-30-2007, 05:49 AM
01 00 00 00 is little indian format
00 00 00 01 is big indian format :D

Happy_Reaper
06-30-2007, 09:43 AM
Salem's answer involves integer ranges, and I think that's too complex. Besides, it doesn't answer the question.

I was never so much referring to the coding part, but rather to the fact that he mentions that you can't be sure that the computer adds correctly, and therefore, in essence, the programmer has to convince himself that the output is expected.

Even with an operation as trivial as addition, the computer is no replacement for the human brain. Only the programmer can know what is right and what is wrong.

indigo0086
06-30-2007, 09:53 AM
01 00 00 00 is little indian format
00 00 00 01 is big indian format :D

They seem so far away from one another.

CornedBee
06-30-2007, 10:58 AM
you can't be sure that the computer adds correctly, and therefore, in essence, the programmer has to convince himself that the output is expected.

But that's misleading. I think the original question was really whether the computer could make basic mistakes like adding 1 and 1 and getting 3. It was a 5-year-old asking the question.

The programmer can do only so much. He can check that no overflow will occur, but for the actual calculation he has to trust the CPU. What will he do? assert() that a+b == a+b?
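
As a rough sketch of what "check that no overflow will occur" can look like in C (the function name here is just illustrative), the test happens before the add, and the add itself is simply trusted to the CPU:

#include <limits.h>
#include <stdio.h>

/* Add two ints only if the result is representable. The check is the
   programmer's job; the addition itself is left to the CPU. */
static int checked_add(int a, int b, int *sum)
{
    if ((b > 0 && a > INT_MAX - b) || (b < 0 && a < INT_MIN - b))
        return 0;                /* would overflow, so refuse to add */
    *sum = a + b;
    return 1;
}

int main(void)
{
    int sum;
    if (checked_add(2000000000, 2000000000, &sum))
        printf("%d\n", sum);
    else
        printf("overflow: not adding\n");
    return 0;
}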

Happy_Reaper
06-30-2007, 11:37 AM
To be honest, I'd tell him to try it and see if the computer gets it wrong. And if that doesn't convince him, tell him to keep trying until he either finds the computer making a mistake or is convinced it won't make one.
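
In that spirit, a toy loop like the one below lets him "keep trying": it checks the + operator against the slowest method a child would trust, counting up by one. (Of course it is the same adder checking itself, which is rather the point of this thread.)

#include <stdio.h>

int main(void)
{
    long mistakes = 0;

    for (int a = 0; a <= 1000; a++) {
        for (int b = 0; b <= 1000; b++) {
            int counted = a;
            for (int i = 0; i < b; i++)
                counted++;              /* add 1, b times */
            if (counted != a + b)
                mistakes++;
        }
    }
    printf("mistakes found: %ld\n", mistakes);    /* prints 0 */
    return 0;
}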

Rashakil Fol
06-30-2007, 12:28 PM
I said that the computer takes the two variables and adds them together, so he asked "So, how do you know that the computer is right then?"

Can anyone think of a way in which I could explain it to him without going into too much detail, or giving the answer "it just is", please?

Tell him you don't know it's right. See Ken Thompson's Turing Award lecture, "Reflections on Trusting Trust".

MacGyver
06-30-2007, 02:38 PM
I know there is a stereotype that programmers are not social and hence are unlikely to marry, and even less likely to have families. While I'm not saying that broad brush should be applied to us all, I must say it strikes me that some of the people posting here apparently do not remember what it was like to be a child and possibly have not dealt with young children for quite some time. :)

As CornedBee and I have been trying to point out, we believe the boy's question is one of reliability and consistency. Can the computer make mistakes? I would say that, as a machine, it doesn't make mistakes when working, because it's been designed to always be correct in a mathematical sense; if it does make mistakes in math, it's broken. His next question will probably be, "How do you know if it's broken?" Then you can get into aspects of hardware failure and the like, or explain that if the computer were sufficiently broken to get calculations like simple addition wrong, it is likely that many other things would not work, and they would be noticeable.

Remember, he's only 5. He's barely had time to use a computer in his short life, and I imagine he's curious why his uncle trusts this computer, which appears to have the human-like ability to do the "smart" work of adding, something he himself is currently learning at school. He's wondering: if he and his uncle, or whoever else, can make mistakes, and yet they are smart enough to add, what makes the computer any more reliable than a human?

The answer is that it is a machine.

At first, I was very against Salem's answer because I thought it would attempt to press too much information on him at once. I'm still against it in that regard, but should the computer ever give you the wrong output due to a bug of yours (more likely than the computer actually failing imo), then you could easily explain the nature of GIGO and how the computer, as a very obedient machine, will listen to even incorrect or silly instructions.

To tell the kid that you don't know if the answer is right is silly. He'll take your answer at face value instead of grasping a higher meaning, and conclude the computer either makes random guesses, or cannot be relied upon because it can't think clearly for some reason. At the level we rely on computers, I think their math abilities are quite consistent and something we can -- actually, we now have to -- rely on.

VirtualAce
06-30-2007, 05:25 PM
Even with an operation as trivial as addition, the computer is no replacement for the human brain. Only the programmer can know what is right and what is wrong.


The 'math' portion of the computer is far superior to the human brain. The 'implementation' of the math in various situations is most certainly something for the human brain. But any computer, any chip, any circuit can do simple operations, and thus complex operations (by combining simple operations), on values. This is a proven fact.
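
As a loose software analogy for "complex operations by combining simple operations" (not how a real multiplier circuit is built), multiplication can be written as nothing but repeated addition:

#include <stdio.h>

/* Build a "complex" operation (multiply) out of a simpler one (add). */
static unsigned multiply(unsigned a, unsigned b)
{
    unsigned product = 0;
    while (b--)
        product += a;        /* repeated addition */
    return product;
}

int main(void)
{
    printf("%u\n", multiply(6, 7));    /* prints 42 */
    return 0;
}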

So the computer does NOT make mistakes when doing simple operations. In fact, the computer never makes a mistake, short of the power to the CPU not being consistent or other variations in voltage that would cause issues. But when all is well, the CPU is never wrong.

Now if you are a programmer and you try to add 40 to 255 while using an unsigned byte... well, it is you who are wrong, not the computer. The calculation always works the same way. As was said in the Titanic movie, 'tis a mathematical certainty.
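
The 40-plus-255 case is easy to show: the unsigned byte wraps around modulo 256 exactly as specified, so the surprising answer is the programmer's fault, not the CPU's. A minimal sketch:

#include <stdio.h>

int main(void)
{
    unsigned char total = 255;    /* the largest value one unsigned byte can hold */

    total = total + 40;           /* wraps around modulo 256 */
    printf("%d\n", total);        /* prints 39, i.e. (255 + 40) - 256 */
    return 0;
}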

Thantos
06-30-2007, 06:17 PM
Transistors are the main components that represent the ones and zeros, since they are either on or off.
Not completely true. If you supply just the right input voltage to a transistor, you can get it to sit at 2.5V. Transistors used in digital logic are designed to have a very steep transition slope so they don't spend a lot of time in the transition region. This is also why you have setup and hold times for gates.


So the computer does NOT make mistakes when doing simple operations. In fact, the computer never makes a mistake, short of the power to the CPU not being consistent or other variations in voltage that would cause issues. But when all is well, the CPU is never wrong.
Again not completely true. A poorly designed circuit could have hazards in it. And of course there are the ever-popular faults in the circuit itself. And as an example: remember the floating-point division bug in the original Pentiums?

Edit: Oh to the topic at hand: There is a reason why a lot of work is done in validating the entire system (software and hardware).

VirtualAce
06-30-2007, 09:15 PM
And all of those examples are examples of human error, not circuit error. In essence, if the circuit is designed correctly and used correctly, it should always have the same output.

Yes, you can cause a transistor to sit at a voltage, but this was a discussion of transistors as they relate to digital circuits... hence you would not use a transistor that could stay in that state for very long. And that is essentially what you have already pointed out, so we are saying the same thing.

Thantos
06-30-2007, 10:09 PM
And all of those examples are examples of human error, not circuit error
But the circuits are the computer. So if the circuits are wrong, then the computer is wrong. And faults are not always human-generated; they can occur during manufacturing. Faults can also occur during the usage life of a chip. Imagine what can happen if a 5-input NAND gate burns one of its inputs and causes it to fault low.
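
A small C model of that stuck-at fault (just a simulation for illustration): a 5-input NAND gate whose last input is stuck low answers 1 no matter what the surrounding circuit expects.

#include <stdio.h>

/* A healthy 5-input NAND gate. */
static int nand5(int a, int b, int c, int d, int e)
{
    return !(a && b && c && d && e);
}

/* The same gate with its last input burnt so it faults low:
   whatever signal arrives at e is read as 0, so the output sticks at 1. */
static int nand5_stuck(int a, int b, int c, int d, int e)
{
    int e_seen = 0;    /* the faulted input, regardless of e */
    (void)e;
    return !(a && b && c && d && e_seen);
}

int main(void)
{
    /* All inputs high: a working gate outputs 0, the faulty gate outputs 1. */
    printf("healthy: %d\n", nand5(1, 1, 1, 1, 1));          /* prints 0 */
    printf("faulty:  %d\n", nand5_stuck(1, 1, 1, 1, 1));    /* prints 1 */
    return 0;
}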

The theory and math that the computer is based on won't be wrong, but the computer itself can be wrong.

Swordsman
07-09-2007, 06:28 AM
Thanks for the help everybody. For those who were interested, I went with a combination of Salem's and VirtualAce's answers.

The easiest way of explaining it to him was to first tell him why computers are always correct, then to show him that they aren't. He actually understood the concept behind what he was being shown, which was really nice.

He seems to like the fact that his computer games are made out of 'hard maths' and he is still sitting with me and asking hard questions.

I want to try and further his development if programming looks like something he will be very interested in. Has anyone got any good suggestions, please?
Learning languages or reading programming books is a bit beyond his comprehension at the moment, so I'm wondering what else I can do.

CornedBee
07-09-2007, 06:49 AM
Let him become familiar with using the computer, of course.

Also, look for logic and numeric puzzles for him to solve. The problem solving skills involved are very useful in programming.

VirtualAce
07-09-2007, 11:46 PM
Learning languages or reading programming books is a bit beyond his comprehension at the moment, so I'm wondering what else I can do.


Why don't you let him decide that instead of deciding it for him? There is nothing wrong with presenting a challenge even if it's way over his head. If he doesn't respond or begins to lose interest, which I doubt, then lower the difficulty a bit. Most programming books start out so slowly that a 5-year-old could follow them. I started programming at age 5 with GW BASIC and those early years really helped me both in school and in later years when I moved on to other languages. Heck I was disassembling DOS 2.10 by age 6 using the good old DEBUG from DOS.

It is my opinion that our society does not place much of an emphasis on challenges, and we tend to underestimate the age at which children can comprehend complex concepts. Hence we end up with the brain-dead society we have, which is still teaching how to average 5 numbers in a senior high school class. Then, moving on to college, you program something that can spit out the average grade in a class - whoopee. That is 100% lame. Students should be way beyond averages and simple print-this-or-that programs by high school. By college they should be so far beyond stupid programs that average grades that it's just ridiculous to assign something as simple and stupid as that.

Challenge him and you may be surprised at what you get. This kid sounds like the kind of person that Microsoft or other computer companies would love to have. Always asking why this or that works and not being satisfied with the typical answers. Sounds like a very very smart kid. Just steer him the right way so he doesn't get interested in hacking or other activities that do nothing for computer science.

And just think: if people started complex math much earlier... they could save a helluva lot of wasted money on stupid classes in the long term.

brewbuck
07-10-2007, 10:50 AM
I started programming at age 5 with GW BASIC and those early years really helped me both in school and in later years when I moved on to other languages. Heck I was disassembling DOS 2.10 by age 6 using the good old DEBUG from DOS.

It sounds like a brag, but there ARE those of us out there who started that early. I started writing C64 code when I was 5, and actually ended up getting one of my programs published in a hobby magazine by the time I was 8.

I do NOT believe this is because I'm a genius. I write absolutely stupid code sometimes just like anybody else. I believe it happened because my parents (Mom in particular) knew it was within the realm of possibility and encouraged me to do it.

I have seen a 2 year old sit down at a computer and quickly navigate through multiple levels of web pages to get to a flash game she wanted to play. One step involved agreeing to an EULA and clicking through -- this step didn't even slow her down. This is a person who can't even read yet.

Children, before getting completely screwed up by society, are little geniuses. Treat him like he understands everything you're saying and count on him to ask questions if he doesn't get something. He might blow you away.

CornedBee
07-10-2007, 11:19 AM
I have seen a 2 year old sit down at a computer and quickly navigate through multiple levels of web pages to get to a flash game she wanted to play. One step involved agreeing to an EULA and clicking through -- this step didn't even slow her down.

Yarr, they've got her now!