Thread: The inner-workings of a uP

  1. #16
    Cat without Hat CornedBee's Avatar
    Join Date
    Apr 2003
    Posts
    8,895
    As for nuclear spin, well that would yield more than 2 states.
    Yes, but so does voltage. Nuclear spin is special in that you can have superpositions, but you don't necessarily have to have them, in which case it's a perfectly valid way of storing a binary decision.

    But it's not straightforward.
    Yes, it is. You classify the possible states of the basic element of your medium (a magnetic particle, a dot on the optical medium, a time interval on the electrical line, a single atom, ...) into three classes: 0, 1, and (possibly) invalid. Then you build the technical means to measure and set these, with voltage as the common language of all such components. Now, these technical means may be extremely involved, very complex, and all that, but in the end it still just translates one representation into another. Measuring magnetism should be your area of expertise, not mine.
    All the buzzt!
    CornedBee

    "There is not now, nor has there ever been, nor will there ever be, any programming language in which it is the least bit difficult to write bad code."
    - Flon's Law

  2. #17
    Officially An Architect brewbuck's Avatar
    Join Date
    Mar 2007
    Location
    Portland, OR
    Posts
    7,396
    I don't think we can answer the question, "What's exactly going on in this device which is composed of over 1 billion parts" to the satisfaction you want. Go get a book on basic digital electronics.
    Code:
    //try
    //{
    	if (a) do { f( b); } while(1);
    	else   do { f(!b); } while(1);
    //}

  3. #18
    PhysicistTurnedProgrammer Cell's Avatar
    Join Date
    Jan 2009
    Location
    New Jersey
    Posts
    72
    Quote Originally Posted by CornedBee View Post
    Yes, it is. You classify the possible states of the basic element of your medium (a magnetic particle, a dot on the optical medium, a time interval on the electrical line, a single atom, ...) into three classes: 0, 1, and (possibly) invalid. Then you build the technical means to measure and set these, with voltage as the common language of all such components. Now, these technical means may be extremely involved, very complex, and all that, but in the end it still just translates one representation into another. Measuring magnetism should be your area of expertise, not mine.
    Sure, but the question is how do the symbol classes get transferred to physical voltages? The fact that we CAN or DO abstract signal levels and represent them as symbols is not the issue here. The example we've been using is the HDD.


    Quote Originally Posted by brewbuck View Post
    I don't think we can answer the question, "What's exactly going on in this device which is composed of over 1 billion parts" to the satisfaction you want. Go get a book on basic digital electronics.
    As I have mentioned, I have taken plenty of digital logic/system/electronics courses. I am not asking how the signals get manipulated by logic gates.

    I am not asking you to explain how to build gates from transistors, or to build an architecture block from gates, or to build a uP from the architecture blocks.

    I have done transistor layouts before - I know how voltage signals pass through transistors. I am not asking how to represent the signals or the voltages - I understand that voltage levels are abstracted and represented as binary symbols.

    I am not asking you to explain what is going on in a device of over a billion transistors.

    The question: A logic 1 today might equal 3.3 volts. These 3.3 volts will open the channel on a single transistor to allow voltage to run through the device. The machine code includes many 1's and 0's that control the operation of these huge, billion transistor devices. Good - understood. However, how does the logic 1 actually translate to 3.3 volts?
    .
    .

  4. #19
    Officially An Architect brewbuck's Avatar
    Join Date
    Mar 2007
    Location
    Portland, OR
    Posts
    7,396
    Quote Originally Posted by Cell View Post
    The question: A logic 1 today might equal 3.3 volts. These 3.3 volts will open the channel on a single transistor to allow voltage to run through the device. The machine code includes many 1's and 0's that control the operation of these huge, billion transistor devices. Good - understood. However, how does the logic 1 actually translate to 3.3 volts?
    In the case of RAM, the logical 1 is represented as a charge stored on a cell of memory. This charge produces a voltage. When this voltage is physically connected to a transistor's gate, it causes the transistor to conduct.

    I think it would be more profitable to consider logic gates (AND, NAND, etc) instead of individual transistors, since almost all of the circuitry is designed in terms of gates, not transistors.

    But I get the feeling you're going to continue to drill down until we're talking about what an electron is, what voltage is, etc. I'm happy to talk about it, I don't know how productive it will be to your understanding.
    Code:
    //try
    //{
    	if (a) do { f( b); } while(1);
    	else   do { f(!b); } while(1);
    //}

  5. #20
    Cat without Hat CornedBee's Avatar
    Join Date
    Apr 2003
    Posts
    8,895
    During this step, something occurs that translates our abstractions (machine code) to physical signals (voltages).
    No, that's a misconception. You're working with a computer. It's always physical signals. The abstractions are only in human minds - primarily yours, of course, but also of the guys who designed the computer, who designed and wrote the compiler, etc.

    It's always about physical signals. It starts with the physical signals in your brain eventually leading to physical signals down your nerves to stimulate the muscles that make you hit the keyboard. The keyboard translates the hits into electrical signals by simple closing of circuits. These signals go over the bus into the computer, where they are interpreted and mapped, the result being source code in RAM, on screen, and eventually on the hard disk. Still physical signals, though. The compiler then does some serious work with these physical signals, the result being the physical signals that make up the machine code.
    It's always this low-level. It cannot leave this plane. You may think in the abstract, but it's always just a higher-level view of the low-level stuff that's going on anyway.
    All the buzzt!
    CornedBee

    "There is not now, nor has there ever been, nor will there ever be, any programming language in which it is the least bit difficult to write bad code."
    - Flon's Law

  6. #21
    PhysicistTurnedProgrammer Cell's Avatar
    Join Date
    Jan 2009
    Location
    New Jersey
    Posts
    72
    Quote Originally Posted by brewbuck View Post
    In the case of RAM, the logical 1 is represented as a charge stored on a cell of memory. This charge produces a voltage. When this voltage is physically connected to a transistor's gate, it causes the transistor to conduct.

    I think it would be more profitable to consider logic gates (AND, NAND, etc) instead of individual transistors, since almost all of the circuitry is designed in terms of gates, not transistors.

    But I get the feeling you're going to continue to drill down until we're talking about what an electron is, what voltage is, etc. I'm happy to talk about it, I don't know how productive it will be to your understanding.
    Well I know what an electron is and what voltage is - as I've mentioned I have a background in physics. I have taken computer architecture courses, digital systems/logic courses, etc.

    We can discuss it in terms of gates, that's more than fine, but at that point we are already past my question. We are talking about the inputs to the gates generated by the source code we compile.

    I also understand the 6T memory cell. And I know how a capacitor stores and discharges its energy.

    There is a small step, when the machine code is loaded into RAM, where the binary abstractions take on their physical values. Which leads to my response to the next post:

    Quote Originally Posted by CornedBee View Post
    No, that's a misconception. You're working with a computer. It's always physical signals. The abstractions are only in human minds - primarily yours, of course, but also of the guys who designed the computer, who designed and wrote the compiler, etc.

    It's always about physical signals. It starts with the physical signals in your brain eventually leading to physical signals down your nerves to stimulate the muscles that make you hit the keyboard. The keyboard translates the hits into electrical signals by simple closing of circuits. These signals go over the bus into the computer, where they are interpreted and mapped, the result being source code in RAM, on screen, and eventually on the hard disk. Still physical signals, though. The compiler then does some serious work with these physical signals, the result being the physical signals that make up the machine code.
    It's always this low-level. It cannot leave this plane. You may think in the abstract, but it's always just a higher-level view of the low-level stuff that's going on anyway.
    It's not a misconception, though. At a high level, sure, it's all about physical signals. But when we compile source code, it becomes represented as an abstraction of physical signals.

    There is a disconnect from the point where my fingers enter code onto a screen to when the code executes on a microprocessor that is purely an abstraction.

    How do those binary symbols charge the capacitors in a RAM slice?

  7. #22
    Cat without Hat CornedBee's Avatar
    Join Date
    Apr 2003
    Posts
    8,895
    There is no step between the abstract and the physical. There cannot be. You cannot have the abstract without the physical. It's just a way of thinking about the physical without getting hung up in the details. But the physical is there. You cannot convert the abstract to the physical, or vice versa, with any device. This conversion is a thought process (the classification I talked about earlier), not a physical process.

    In programming terms, you cannot instantiate an abstract class.

    When we compile source code, what we get back is physical signals. Charges of the flip-flops that make up a uP's registers, the cache, of the elements of the RAM and eventually the HDD. You only think about them as zeros and ones. What they really are is charges.

    In other words (and without any insult intended), this sentence:
    But when we compile source code, it becomes represented as an abstraction of physical signals.
    is complete nonsense. It's a failure to see the whole picture, to see the physical and the abstract at the same time and recognize that they are one and the same.
    All the buzzt!
    CornedBee

    "There is not now, nor has there ever been, nor will there ever be, any programming language in which it is the least bit difficult to write bad code."
    - Flon's Law

  8. #23
    PhysicistTurnedProgrammer Cell's Avatar
    Join Date
    Jan 2009
    Location
    New Jersey
    Posts
    72
    Quote Originally Posted by CornedBee View Post
    There is no step between the abstract and the physical. There cannot be. You cannot have the abstract without the physical. It's just a way of thinking about the physical without getting hung up in the details. But the physical is there. You cannot convert the abstract to the physical, or vice versa, with any device. This conversion is a thought process (the classification I talked about earlier), not a physical process.
    I'm not saying there is a gap between the abstract and the physical - I'm saying there is a step where the abstract is MADE physical.

    For example, consider an analog-to-digital converter. The inputs to the ADC are physical; they get sampled and digitized - in other words, turned into digital abstractions that correlate to real-world signals.

    I am not saying there is a step between the abstract and the physical - I quite know they are mapped to each other - but I'm talking about the process of making the machine code physical.


    Quote Originally Posted by CornedBee View Post
    In other words (and without any insult intended), this sentence:

    is complete nonsense. It's a failure to see the whole picture, to see the physical and the abstract at the same time and recognize that they are one and the same.
    No insult taken at all. I'm glad to be discussing this. I hope this extends to everyone - I don't want to seem like I'm arguing here - just discussing something I find interesting.

    However, that sentence is not nonsense. The compiled source code is 'broken down' into machine code. Those logical 1's and 0's represent voltages. That's all I was saying about that. I know there is a mapping between our binary abstraction and real-world voltages.

  9. #24
    Kernel hacker
    Join Date
    Jul 2007
    Location
    Farncombe, Surrey, England
    Posts
    15,677
    Am I the only one that is now lost as to what this discussion is about?

    If you know how a transistor works, and how to make (for example) an AND gate from a set of transistors, and you understand that there is a correlation between the individual physical signal and the 0 and 1 abstraction that we use to make more sense to humans, what part is not understood?

    --
    Mats
    Compilers can produce warnings - make the compiler programmers happy: Use them!
    Please don't PM me for help - and no, I don't do help over instant messengers.

  10. #25
    Cat without Hat CornedBee's Avatar
    Join Date
    Apr 2003
    Posts
    8,895
    I'm with matsp. I'm completely lost as to what you don't understand. I don't feel like there is a hole in your understanding. I think you're imagining a problem where there is none.

    You could look up the instruction set reference for your CPU. It tells you what each assembly opcode does, and how it is encoded in binary - often with different binary representations depending on the kind of argument (registers, memory operands, immediate operands). Perhaps that's what you're wondering about?
    All the buzzt!
    CornedBee

    "There is not now, nor has there ever been, nor will there ever be, any programming language in which it is the least bit difficult to write bad code."
    - Flon's Law

  11. #26
    Officially An Architect brewbuck's Avatar
    Join Date
    Mar 2007
    Location
    Portland, OR
    Posts
    7,396
    Quote Originally Posted by Cell View Post
    There is a disconnect from the point where my fingers enter code onto a screen to when the code executes on a microprocessor that is purely an abstraction.
    Where is the disconnect? Between the moment your finger presses a key and the moment an instruction executes, there was an unbroken flow of information in the form of physical, electrical signals. A processor is not an abstraction.

    It's almost like you're asking "Where does the 'virtualness' come from?" The answer is there is no virtualness. This is all quite real.
    Code:
    //try
    //{
    	if (a) do { f( b); } while(1);
    	else   do { f(!b); } while(1);
    //}

  12. #27
    PhysicistTurnedProgrammer Cell's Avatar
    Join Date
    Jan 2009
    Location
    New Jersey
    Posts
    72
    I'm probably just making up problems for myself by trying to look too far into things. I've done that before.

    Thanks for the discussion, guys. It's nice that I didn't have to go make an ass of myself in front of a professor!
