Hallo,
I have been wondering about this for a while: how do you program a computer? I see how it's done once you have something to type on, but how do you go from having a lot of logic gates to being able to install a small program?
Originally Posted by h3ro:
I have been wondering about this for a while: how do you program a computer? I see how it's done once you have something to type on, but how do you go from having a lot of logic gates to being able to install a small program?
Back in the old days, they had to use plug boards; then they programmed a plug board to read punch cards, then they used punch cards to program ROMs, then they programmed ROMs to read programs from tape. Then they used the tapes to read data directly from keyboards and save it to other tapes. Most people don't appreciate that computers have been around for over 100 years.
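To make that bootstrap chain concrete, here is a toy sketch in Python. Everything in it is invented for illustration (the two-instruction machine, the word encoding, and the `bootstrap`/`tape` contents): a few hand-entered words act as a loader that copies a longer program in from a simulated tape, just as front-panel switches or a plug board once seeded the very first loader by hand.

```python
# Toy sketch of the bootstrap chain: the operator "toggles in" a few
# raw words by hand (the plug-board / front-panel stage), and that
# tiny loader then pulls a longer program in from a "tape".
# The two-instruction machine and word format are purely illustrative.

MEMORY_SIZE = 32
memory = [0] * MEMORY_SIZE

# Stage 1: hand-entered words, one at a time, like front-panel deposits.
# Each word is (opcode, operand):
#   opcode 1 = "read the next tape word into address <operand>"
#   opcode 0 = "halt"
bootstrap = [(1, 8), (1, 9), (1, 10), (0, 0)]
for addr, word in enumerate(bootstrap):
    memory[addr] = word

# Stage 2: the tape holds the words of the "real" program.
tape = iter([42, 7, 99])

# Execute the hand-entered loader: it copies the tape into memory.
pc = 0
while True:
    opcode, operand = memory[pc]
    pc += 1
    if opcode == 1:
        memory[operand] = next(tape)
    elif opcode == 0:
        break

print(memory[8:11])  # [42, 7, 99] -- the tape program now lives in memory
```

Once the tape's contents are in memory, the machine can jump to them and run them, and each stage of the chain only has to be trusted to load the next, slightly larger one.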
Originally Posted by abachler:
Most people don't appreciate that computers have been around for over 100 years.

It all depends on your definition of "computer", and whether you are talking about the concept or an actual, working implementation.
Look up a C++ Reference and learn How To Ask Questions The Smart Way
Originally Posted by Bjarne Stroustrup (2000-10-14)
Start here and keep following the links. It should help.
Originally Posted by brewbuck:
Reimplementing a large system in another language to get a 25% performance boost is nonsense. It would be cheaper to just get a computer which is 25% faster.
Well, the first Turing-complete, general-purpose, reprogrammable digital computer is over 65 years old. http://en.wikipedia.org/wiki/ENIAC
So it doesn't take much to expand the definition of computer to older designs, such as http://en.wikipedia.org/wiki/Jacquard_loom
Last edited by abachler; 06-17-2008 at 02:06 PM.
Originally Posted by abachler:
So it doesn't take much to expand the definition of computer to older designs, such as http://en.wikipedia.org/wiki/Jacquard_loom

True, but whether these "older designs" are computers in the modern sense is debatable. For example, the abacus is an extremely old aid to human calculation, and an important step towards modern computers, which aid human calculation in far more powerful ways. So the abacus can be considered a precursor to the computer, but is it a computer in the modern sense? Likewise, is the Jacquard loom a computer, considering that it does not do actual computation, though its mechanisms have implications for early computing concepts? I would rather draw the line at ENIAC and its contemporaries as the first computers in the modern sense.
1937 is the date of the first digital computer. I cannot remember the name, though. It was called a computer back then, as it is today.
Anything earlier than that, I personally don't call a "computer", despite what others (more instructed, even) may say. I tend to attribute that to an undesirable overgeneralization of the term. I wish there were more intent on being precise. They were mechanical devices at most.
http://en.wikipedia.org/wiki/Z3_%28computer%29
older still
That's 1941, not 1937, and certainly not before 1908.
What about this programmable device?
http://en.wikipedia.org/wiki/Jacquard_loom
If you dance barefoot on the broken glass of undefined behaviour, you've got to expect the occasional cut.
If at first you don't succeed, try writing your phone number on the exam paper.
abachler mentioned it too.
It's a loom, Salem. Not a computer. The Jacquard loom. A precursor, a source of inspiration. But it's "just" a loom. And a heck of a cool invention for its time, I'm sure. A mechanical device that made it easier to produce different patterns without much human intervention.
It's not my place to contradict some of the possible opinions on this matter, as I don't credit myself as an expert or historian. But I do it anyway.
Calling these things computers is no different from calling a horse-drawn wagon a car. The evolution from the latter to the former was also a natural process, and the two always shared similar objectives: to move people from one place to another faster and more comfortably. And yet we don't speak of a wagon as a car. We could extend this to planes and call them cars too. Or bicycles.
I believe it would be possible for many historians to keep an analytical eye and still avoid oversimplifications. Personally, I always felt calling these devices computers took more from them than it gave. They were wonderful inventions that any of us here would struggle to come up with on our own, with little, if any, background information to start with. And yet tagging them as computers makes them look like crude ancient artifacts that were just earlier attempts of a weaker mind.
They were not. They were full fledged inventions meant to solve a specific problem at their time.
I thought the thread was about programmable "things".
s/card/paper tape/core memory/DRAM/....
It's all just
- get the next symbol from the storage medium
- do something
- rinse and repeat.
The only thing which has really changed is the number of symbols in the store, and the speed with which they can be processed.
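That "get the next symbol, do something, rinse and repeat" cycle is the fetch-execute loop at the heart of every stored-program machine. A minimal sketch, with a toy instruction set (LOAD/ADD/PRINT/HALT) invented purely for illustration:

```python
# A minimal sketch of the fetch-execute cycle: a store of symbols and
# a loop that reads the next one and acts on it. The instruction set
# is invented for illustration.

def run(store):
    """Fetch-execute loop over a list of symbols (the 'storage medium')."""
    acc = 0        # single accumulator register
    pc = 0         # program counter: index of the next symbol to fetch
    output = []
    while True:
        op = store[pc]              # fetch the next symbol
        if op == "LOAD":
            acc = store[pc + 1]     # operand follows the opcode
            pc += 2
        elif op == "ADD":
            acc += store[pc + 1]
            pc += 2
        elif op == "PRINT":
            output.append(acc)
            pc += 1
        elif op == "HALT":
            return output
        else:
            raise ValueError(f"unknown symbol {op!r}")

# The program itself is just data sitting in the store.
program = ["LOAD", 2, "ADD", 3, "PRINT", "HALT"]
print(run(program))  # [5]
```

Whether the store is a punched card, core memory, or DRAM changes nothing about the loop itself, only its size and speed.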
Or do you consider programmable devices as only things which involve electricity?
What about Babbage's difference engine?
Will your awesome quad-core machine look as impressive in 200 years?
Or will everyone roll around laughing at how slow it is, how big it is, and how much power it consumes, and 1000's of them together still don't have the mental capacity of an ant.
Quite good arguments. But then, don't ask me. Ask the writers of the article, for instance. Nowhere do they call it a computer. You'll also find the exhibits don't call them computers.
Edit: In fact, scratch that. You just addressed my issue. It's different points of view. We disagree in that I think the terminology should be less simplistic. If a computer is indeed just a programmable device, then I'm afraid my keyboard is a computer, I make coffee in a computer, and cars aren't just cars but computers. And we fly in computers and, more interestingly, a ladybug is a computer.
In 200 years, things may well have different names. It has always been like that. A wagon used to be called a car too. Not anymore.
Last edited by Mario F.; 06-18-2008 at 09:53 AM.
Originally Posted by Salem:
I thought the thread was about programmable "things".

I think that we need some clarification from h3ro. This thread is about "how do you program a computer?" from scratch, so we need to know what h3ro has in mind with "computer", and then whether it is from a historical perspective, with respect to current technology only, or both.
Originally Posted by Salem:
Or do you consider programmable devices as only things which involve electricity?

I consider the use of electricity to be one of the aspects of a modern computer. It certainly need not be one of the aspects of a programmable device.