I have been thinking about the nature of hardware and software, and I read that the lowest-level 'programmers' were actually hardware designers, in the sense that they had to physically connect and disconnect wires to get the desired output (e.g. a sequence of flashing lights or something). Nowadays we are kind of doing the same thing, except we can keep bits in memory and everything runs extremely fast. I was wondering how the hardware interprets the sequence of bits it encounters. This is probably something I should be reading up on, but I figured I'd try getting a layman's explanation first.
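To make my question more concrete, here's a rough sketch in C of the mental model I have so far: bits sitting in memory, and something repeatedly fetching a chunk of them, splitting it into "which operation" and "which data", and acting on it. The one-byte instruction format and the opcodes here are entirely made up by me for illustration, and I realize a real CPU does this in circuitry rather than in software:

```c
#include <stdio.h>
#include <stdint.h>

/* Toy "machine code": each instruction is one byte.
 * Top 4 bits = opcode, bottom 4 bits = operand.
 * These opcodes are invented purely for this sketch. */
enum {
    OP_LOAD = 0x1,  /* put the operand into the accumulator */
    OP_ADD  = 0x2,  /* add the operand to the accumulator   */
    OP_OUT  = 0x3,  /* print the accumulator                */
    OP_HALT = 0xF   /* stop the machine                     */
};

int main(void) {
    /* A "program" sitting in memory as raw bits:
     * LOAD 5; ADD 3; OUT; HALT  -> should print 8 */
    uint8_t memory[] = { 0x15, 0x23, 0x30, 0xF0 };

    uint8_t acc = 0;  /* accumulator register */
    size_t  pc  = 0;  /* program counter      */

    for (;;) {
        uint8_t instruction = memory[pc++];   /* fetch   */
        uint8_t opcode  = instruction >> 4;   /* decode  */
        uint8_t operand = instruction & 0x0F;

        switch (opcode) {                     /* execute */
            case OP_LOAD: acc = operand;        break;
            case OP_ADD:  acc += operand;       break;
            case OP_OUT:  printf("%u\n", acc);  break;
            case OP_HALT: return 0;
            default:      return 1;  /* unrecognized bit pattern */
        }
    }
}
```

Is this fetch/decode/execute picture roughly what actually happens in the hardware, or am I off base?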
Thanks.