If you dance barefoot on the broken glass of undefined behaviour, you've got to expect the occasional cut.
If at first you don't succeed, try writing your phone number on the exam paper.
I can't wait to see what the next 25 years hold.
Good class architecture is not like a Swiss Army Knife; it should be more like a well balanced throwing knife.
- Mike McShaffry
Happy birthday!
And yes, 25 years more of evolution should prove to be jaw-dropping. Although I don't feel Moore's Law will hold much longer. I know... I'm a heretic.
But looking at the current trends we have every reason to be excited: Virtual Reality, ID cards, wearable computers, appliances, entertainment hubs, household hubs,...
Originally Posted by brewbuck:
Reimplementing a large system in another language to get a 25% performance boost is nonsense. It would be cheaper to just get a computer which is 25% faster.
>> Although I don't feel Moore's Law will hold much longer.
Wouldn't it be interesting if it was broken! Smashed!
I think it has broken several times over the past years. We have experienced surges much higher than what Moore predicted. However, Moore's Law is dependent on the laws of physics, and it is perfectly acceptable to conceive of a theoretical limit. He himself said as much.
There is, of course, a technological pace we can't ignore. As miniaturization increases (that is, as things become smaller) and the need for new materials arises, we will most certainly experience a slowdown on the development curve. It seems inevitable that we will not be able to keep up with his predictions for much longer.
>> 25 years more of evolution should prove to be jaw dropping.
I doubt it. Very little has happened in the last ten. A computer with Windows 95, Office 97 and IE4 could (and still does) do everything that the average computer does now (browse the internet, watch videos, play 3D games, etc.), although slightly slower. Far, far more development occurred in the ten years before that. The fact is that computer/software development has slowed because most people have been reasonably happy with what a computer can do for several years.
Without the drive of gaming, CPU speeds would probably have stabilised by now (and be a lot cheaper). In fact, since the late nineties, most of the new features used on the average computer have been delivered via web applications, many of which would run fine on my 486 DX2-66MHz with 32MB of RAM running Windows 95.
>> But looking at the current trends we have all reasons to be excited. Virtual Reality, ID Cards, wearable computers, appliances, entertainment hubs, household hubs,...
Dude, don't forget hoverboards.
http://www.hovertech.com/home/index.html
Last edited by Cheeze-It; 08-14-2006 at 08:47 AM.
Staying away from General.
Originally Posted by anonytmouse:
The fact is that computer/software development has slowed because most people have been reasonably happy with what a computer can do for several years.
I can agree with this. However, it is also possible that certain developments have not been achieved simply because we've hit a few obstacles with the current architecture. The fact that everything happens inside the same box, and that ultimately it is one board providing the transfer of information, is probably not what we will want in the near future for household computers.
Computer farms have been the solution for just about anything worth doing these days, from websites to 3D scene rendering. What a software house can do in its labs and what it can offer the final consumer are worlds apart, due to the constraints a single machine has. The development cost tied to optimizing code to be runnable on mainstream personal computers is almost certainly a huge slice of the cost pool.
Even if the consumer is generally happy with the end results, I think the real force behind computer development has been software companies. And these are certainly not happy.
> It seems inevitable that we will not be able to keep up with his predictions for much longer.
Sure it will.
Instead of increasing clock speed, we'll just see 2, 4, 8, 16 processors on the same chip. This has already started with the dual cores; I expect more (and Moore) in the future.
Graphics in particular is very heavy on the maths, and relatively easy to make concurrent. Just imagine each pixel were a processor rather than a few bytes in memory.
Also, there are quite a few rounds of Moore's Law to catch up to these chips.
http://news.bbc.co.uk/1/hi/technology/5099584.stm
>> Graphics in particular is very heavy on the maths, and relatively easy to make concurrent. Just imagine each pixel was a processor rather than a few bytes in memory
Well, we are close. Pixel shaders are just that: each pixel is sort of a processor. The interpolated pixel coordinates are sent to the pixel shader from the vertex shader.
>> I think it has broken several times over the past years....
Oh, I know that! What I meant was that it would be interesting if somehow they made a Moore's Law II ("hypothesis" is a good word) which was exponential.
>> Instead of increasing clock speed, we'll just see 2, 4, 8, 16 processors on the same chip.
Imagine 42
>> Just imagine each pixel was a processor rather than a few bytes in memory
Cheaters!
Last edited by twomers; 08-14-2006 at 03:26 PM.
Not only will computers change; how we use them will change too. Soon we may become dependent on them for survival.
Why is it that every time we get something good, society has to exploit it and ruin it? I love computers; they have been my life since I was 5. But there are things computers shouldn't be used for, and it will happen: soon our lives will be run by the computer... Then something will go wrong. And you think the Great Depression was bad?
Either way, I do look forward to the future of programming, and more specifically game programming. I always want something bigger and better than what the world has seen thus far. I can't wait to explore further.