Thread: Interesting Question

  1. #16
    Deathray Engineer MacGyver's Avatar
    Join Date
    Mar 2007
    Posts
    3,210
    I know there is a stereotype that programmers are not social and hence are unlikely to marry, and even less likely to have families. While I won't say that broad brush should be applied to us all, it strikes me that some of the people posting here apparently do not remember what it was like to be a child, and possibly have not dealt with young children for quite some time.

    As CornedBee and I have been trying to point out, we believe the boy's question is one of reliability and consistency. Can the computer make mistakes? I would say that, as a machine, it doesn't make mistakes when working, because it's been designed to always be correct in a mathematical sense; if it does make mistakes in math, it's broken. His next question will probably be, "How do you know if it's broken?" Then you can get into aspects of hardware failure and the like, or explain that if the computer were sufficiently broken to get simple calculations like addition wrong, many other things would likely not work either, and those would be noticeable.

    Remember, he's only 5. He's barely had time to use a computer in his short life, and I imagine he's curious why his uncle trusts this computer, which appears to have the human-like ability to do the "smart" work of adding, something he himself is currently learning in school. He's wondering: if he and his uncle, or anyone else, can make mistakes even though they are smart enough to add, what makes the computer any more reliable than a human?

    The answer is that it is a machine.

    At first, I was very against Salem's answer because I thought it would press too much information on him at once. I'm still against it in that regard, but should the computer ever give you the wrong output due to a bug of yours (more likely than the computer actually failing, imo), then you could easily explain the nature of GIGO (garbage in, garbage out) and how the computer, as a very obedient machine, will follow even incorrect or silly instructions.
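
    To make the GIGO point concrete, here is a minimal C sketch (a made-up example; the bug is deliberate): every addition the CPU performs is correct, yet the program prints the wrong total, because the instructions it was given are wrong.

    Code:
        #include <stdio.h>

        int main(void)
        {
            int nums[3] = { 1, 2, 3 };
            int sum = 0;
            int i;

            for (i = 0; i < 2; i++)   /* bug: should be i < 3 */
                sum += nums[i];

            printf("sum = %d\n", sum);   /* prints 3, not 6 -- our fault, not the CPU's */
            return 0;
        }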

    To tell the kid that you don't know if the answer is right is silly. He'll take your answer at face value instead of grasping any higher meaning, and conclude that the computer either makes random guesses or cannot be relied upon because it can't think clearly for some reason. Given how heavily we rely on computers, I think their math abilities are quite consistent and something we can -- actually, we now have to -- rely on.

  2. #17
    Registered User VirtualAce's Avatar
    Join Date
    Aug 2001
    Posts
    9,607
    Even with an operation as trivial as addition, the computer is no replacement for the human brain. Only the programmer can know what is right and what is wrong.
    The 'math' portion of the computer is far superior to the human brain. The 'application' of that math to various situations is most certainly a job for the human brain. But any computer, any chip, any circuit can perform simple operations, and thus complex operations (by combining simple ones), on values. This is a proven fact.
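
    To make "complex from simple" concrete, here is an illustrative sketch (my own, not from the original post): addition itself can be built from nothing but the bitwise XOR and AND a circuit provides -- the same half-adder logic a CPU's circuits combine.

    Code:
        #include <stdio.h>

        /* Addition built purely from XOR and AND, repeated until
           no carries remain. Illustrative sketch only. */
        unsigned add(unsigned a, unsigned b)
        {
            while (b != 0) {
                unsigned carry = a & b;   /* bits that generate a carry */
                a = a ^ b;                /* sum without the carries */
                b = carry << 1;          /* carries move one place left */
            }
            return a;
        }

        int main(void)
        {
            printf("%u\n", add(40, 255));   /* prints 295 */
            return 0;
        }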

    So the computer does NOT make mistakes when doing simple operations. In fact, the computer never makes a mistake, short of inconsistent power to the CPU or other voltage variations that would cause issues. But when all is well, the CPU is never wrong.

    Now if you are a programmer and you try to add 40 to 255 in an unsigned byte... well, it is you who are wrong, not the computer. The calculation always works exactly as specified. As was said in the Titanic movie, ''tis a mathematical certainty.'
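
    A minimal sketch of that wraparound, for illustration: an unsigned byte holds 0 through 255, so 255 + 40 wraps around modulo 256 and the hardware dutifully hands back 39.

    Code:
        #include <stdio.h>

        int main(void)
        {
            unsigned char b = 255;

            b = (unsigned char)(b + 40);   /* 295 wraps modulo 256 */
            printf("%u\n", b);             /* prints 39, not 295 */
            return 0;
        }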

  3. #18
    & the hat of GPL slaying Thantos's Avatar
    Join Date
    Sep 2001
    Posts
    5,681
    Transistors are the main component that represents the ones and zeros, since they are either on or off.
    Not completely true. If you supply just the right input voltage to a transistor, you can get it to sit at 2.5 V. Transistors used in digital logic are designed with a very steep transition slope so they don't spend much time in the transition region. This is also part of why flip-flops have setup and hold times.

    So the computer does NOT make mistakes when doing simple operations. In fact, the computer never makes a mistake, short of inconsistent power to the CPU or other voltage variations that would cause issues. But when all is well, the CPU is never wrong.
    Again not completely true. A poorly designed circuit could have hazards in it. And of course there are the ever-popular faults in the circuit itself. As an example: remember the FDIV floating-point bug in the original Pentiums?
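
    For the curious, the classic test case for that bug looks like this (a sketch; on any healthy FPU it prints the correct value):

    Code:
        #include <stdio.h>

        int main(void)
        {
            /* On a flawed Pentium this division came out around 1.33374;
               the correct answer is 1.33382. */
            printf("%.5f\n", 4195835.0 / 3145727.0);
            return 0;
        }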

    Edit: Oh, to the topic at hand: there is a reason why a lot of work goes into validating the entire system (software and hardware).

  4. #19
    Registered User VirtualAce's Avatar
    Join Date
    Aug 2001
    Posts
    9,607
    And all of those examples are examples of human error, not circuit error. In essence, if the circuit is designed correctly and used correctly, it should always produce the same output.

    Yes, you can cause a transistor to sit at an intermediate voltage, but this was a discussion of transistors as they relate to digital circuits; hence you would not use a transistor that stays in that state for very long. And this is essentially what you have already pointed out, so we are saying the same thing.

  5. #20
    & the hat of GPL slaying Thantos's Avatar
    Join Date
    Sep 2001
    Posts
    5,681
    And all of those examples are examples of human error, not circuit error
    But the circuits are the computer, so if the circuits are wrong then the computer is wrong. And faults are not always human generated; they can occur during manufacturing, and also during the usage life of a chip. Imagine what happens if a 5-input NAND gate burns one of its inputs and that input faults low.

    The theory and math that the computer is based on won't be wrong, but the computer itself can be wrong.
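
    A quick sketch of why that particular fault is so nasty (my own illustration): a NAND output is 0 only when every input is 1, so one input stuck at 0 pins the output high no matter what the other four inputs do.

    Code:
        #include <stdio.h>

        /* A 5-input NAND gate, with an optional input forced ("stuck") at 0
           to model the burned input. stuck < 0 means the gate is healthy. */
        int nand5(const int in[5], int stuck)
        {
            int all_ones = 1;
            int i;

            for (i = 0; i < 5; i++)
                all_ones &= (i == stuck) ? 0 : in[i];

            return !all_ones;   /* NAND: 0 only when every input is 1 */
        }

        int main(void)
        {
            int in[5] = { 1, 1, 1, 1, 1 };

            printf("healthy: %d\n", nand5(in, -1));   /* prints 0 */
            printf("faulty:  %d\n", nand5(in, 0));    /* prints 1 -- wrong */
            return 0;
        }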

  6. #21
    Apprentice Swordsman's Avatar
    Join Date
    Apr 2007
    Posts
    38
    Thanks for the help, everybody. For those who were interested, I went with a combination of Salem's and Bubba's answers.

    The easiest way of explaining it to him was to first tell him why computers are always correct, then to show him that they aren't. He actually understood the concept behind what he was being shown, which was really nice.

    He seems to like the fact that his computer games are made out of 'hard maths' and he is still sitting with me and asking hard questions.

    I want to try to further his development if programming looks like something he will be very interested in. Does anyone have any good suggestions?
    Learning languages or reading programming books is a bit beyond his comprehension at the moment, so I'm wondering what else I can do.

  7. #22
    Cat without Hat CornedBee's Avatar
    Join Date
    Apr 2003
    Posts
    8,895
    Let him become familiar with using the computer, of course.

    Also, look for logic and numeric puzzles for him to solve. The problem-solving skills involved are very useful in programming.
    All the buzzt!
    CornedBee

    "There is not now, nor has there ever been, nor will there ever be, any programming language in which it is the least bit difficult to write bad code."
    - Flon's Law

  8. #23
    Registered User VirtualAce's Avatar
    Join Date
    Aug 2001
    Posts
    9,607
    Learning languages or reading programming books is a bit beyond his comprehension at the moment, so I'm wondering what else I can do.
    Why don't you let him decide that instead of deciding it for him? There is nothing wrong with presenting a challenge even if it's way over his head. If he doesn't respond or begins to lose interest, which I doubt, then lower the difficulty a bit. Most programming books start out so slowly that a 5-year-old could follow them. I started programming at age 5 with GW-BASIC, and those early years really helped me both in school and in later years when I moved on to other languages. Heck, I was disassembling DOS 2.10 by age 6 using the good old DEBUG from DOS.

    It is my opinion that our society does not place much emphasis on challenges, and we tend to underestimate the age at which children can comprehend complex concepts. Hence we end up with the brain-dead society we have, one that is still teaching how to average 5 numbers in a senior high school class. Then, moving on to college, you program something that can spit out the average grade in a class - whoopee. That is 100% lame. Students should be way beyond averages and simple print-this-or-that programs by high school. By college they should be so far beyond stupid grade-averaging programs that it's just ridiculous to assign something that simple.
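
    (To underline how little is being asked, the entire "average the grades" assignment being mocked fits in a dozen lines of C; a sketch with made-up numbers:)

    Code:
        #include <stdio.h>

        int main(void)
        {
            double grades[5] = { 90, 85, 70, 95, 80 };
            double sum = 0;
            int i;

            for (i = 0; i < 5; i++)
                sum += grades[i];

            printf("average = %.1f\n", sum / 5);   /* prints 84.0 */
            return 0;
        }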

    Challenge him and you may be surprised at what you get. This kid sounds like the kind of person Microsoft or other computer companies would love to have: always asking why this or that works, and never satisfied with the typical answers. Sounds like a very, very smart kid. Just steer him the right way so he doesn't get interested in hacking or other activities that do nothing for computer science.

    And just think: if people started complex math much earlier, they could save a helluva lot of money wasted on stupid classes in the long term.
    Last edited by VirtualAce; 07-09-2007 at 11:53 PM.

  9. #24
    Officially An Architect brewbuck's Avatar
    Join Date
    Mar 2007
    Location
    Portland, OR
    Posts
    7,396
    Quote Originally Posted by Bubba
    I started programming at age 5 with GW-BASIC, and those early years really helped me both in school and in later years when I moved on to other languages. Heck, I was disassembling DOS 2.10 by age 6 using the good old DEBUG from DOS.
    It sounds like a brag, but there ARE those of us out there who started that early. I started writing C64 code when I was 5, and actually ended up getting one of my programs published in a hobby magazine by the time I was 8.

    I do NOT believe this is because I'm a genius. I write absolutely stupid code sometimes just like anybody else. I believe it happened because my parents (Mom in particular) knew it was within the realm of possibility and encouraged me to do it.

    I have seen a 2 year old sit down at a computer and quickly navigate through multiple levels of web pages to get to a flash game she wanted to play. One step involved agreeing to an EULA and clicking through -- this step didn't even slow her down. This is a person who can't even read yet.

    Children, before getting completely screwed up by society, are little geniuses. Treat him like he understands everything you're saying and count on him to ask questions if he doesn't get something. He might blow you away.

  10. #25
    Cat without Hat CornedBee's Avatar
    Join Date
    Apr 2003
    Posts
    8,895
    I have seen a 2 year old sit down at a computer and quickly navigate through multiple levels of web pages to get to a flash game she wanted to play. One step involved agreeing to an EULA and clicking through -- this step didn't even slow her down.
    Yarr, they've got her now!
    All the buzzt!
    CornedBee

    "There is not now, nor has there ever been, nor will there ever be, any programming language in which it is the least bit difficult to write bad code."
    - Flon's Law
