Something seems wrong... Technology-wise
by Jonathan Rose · in General Discussion · 04/21/2004 (3:03 pm) · 23 replies
Moore's law states that every 1.5 years the average processor speed will double, correct? Well... has anyone noticed that this hasn't happened over the last 1.5 years? A computer I bought two and a half years ago had a Pentium 4 2.0 GHz processor with hardware specs to match (256 MB RAM, horrid integrated graphics chip, etc.) and cost 600 dollars... More recently (1 week ago) I purchased a new computer with a 2.8 GHz processor, 512 MB of RAM, an integrated graphics chip (vomits), etc., for 750 dollars. What is up with this? Any light shined on this area would be greatly appreciated.
Perhaps if Intel released a Pentium 5, things would change, but I don't see that coming anytime soon (I've seen no released information on the subject at all).
AMD Athlon 64-bit processors seem to be less than optimal speed as well. Good for gaming, but still not as fast as one would expect.
The usual trend of technological improvement being broken worries me quite a bit...
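The arithmetic behind this complaint can be sanity-checked with a quick sketch. It takes the popular clock-speed reading of the law at face value (an 18-month doubling of MHz, which later replies in this thread dispute) and uses only the numbers given above:

```python
# Rough check: if clock speed doubled every 18 months, what should a
# 2.0 GHz chip from 2.5 years ago have grown into by now?

def projected_clock_ghz(start_ghz: float, years: float,
                        doubling_years: float = 1.5) -> float:
    """Project a clock speed forward assuming a fixed doubling period."""
    return start_ghz * 2 ** (years / doubling_years)

expected = projected_clock_ghz(2.0, 2.5)  # ~6.35 GHz on the trend line
observed = 2.8                            # what was actually bought

print(f"predicted: {expected:.2f} GHz, observed: {observed} GHz")
print(f"shortfall: {expected / observed:.1f}x")  # ~2.3x below the trend
```

So if raw clock were what Moore's law governed, the new machine is off the curve by more than a factor of two — which is exactly the gap the replies below try to explain.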
#2
04/21/2004 (3:17 pm)
I'll be honest, it kind of ticks me off when Moore's Law is referred to as a Law. It is not. It's nothing more than an educated guess. A theory. Not a law.
And according to Moore's 'Theory', it's not the MHz that doubles every 1.5 years, it's the computational capability. For example, when Intel introduces new technology like extended instruction sets, the computational capability increases without having to increase the MHz. Other things can increase computational ability as well: better video processors, faster RAM, faster or larger caches, faster-spinning hard drives, etc. Bottom line, the way I understand it, the overall system speed and throughput is 'supposed' to increase at Moore's rate, not just the MHz.
That's just my current understanding, though. If I'm wrong, someone please let me know.
EDIT --- Jay beat me to it. Sorry.
#3
04/21/2004 (3:28 pm)
In ten years, it will take a nuclear reactor to power our current method of transistor configuration,
and it will put out more heat than the sun.
It is time to redesign some stuff before this becomes necessary.
#4
04/21/2004 (3:53 pm)
In any case, processor speed improvement is slowing down... a lot.
#5
04/21/2004 (5:34 pm)
Well, if it makes you feel any better, you could have gotten a 3.4 GHz currently.
#6
04/21/2004 (5:48 pm)
Whereas 2 years ago I could have gotten a 2.8, no? Not making me feel any better...
#7
04/21/2004 (6:10 pm)
GPUs are about 10x more powerful than 2 years ago. CPUs are about 1.8x - 2x as performant as two years ago, but are not progressing as fast because of business decisions more than anything else. Mainstream CPUs are fast enough for 99.999% of the general population.
Just because you bought an older chip doesn't mean that there are not chips 2x as performant as what you had.
You can't use MHz as a performance indicator; the Athlon FX as well as the G5 and the POWER5 chips run at a lower clock but do 2x - 3x as much work as a creaky old 2.8 Pentium 4! Hell, they even outperform the 3.4 Pentium 4 at a gigahertz less clock!
#8
04/21/2004 (6:12 pm)
I'm just ticked that my 256MB video card hasn't significantly improved the performance of Scorched Earth over my VGA 486. What's it going to take?
#9
04/21/2004 (6:24 pm)
You may be looking at the wrong chip... Compare the processing power of the latest video chips from nVidia, add that to the processing power of a 64-bit processor, and I think Mr. Moore's Hypothesis (you're right, it's not a Law) falls into place.
Most people forget that the CPU used to do many of the rendering tasks now assigned to the GPU. It only makes sense to take the old CPU/GPU combination and compare that to the current CPU/GPU. I would also check out the speed of motherboard transfer rates, which has jumped substantially, and the theoretical amount of processing that can be done with a 64-bit vs. 32-bit processor.
I think Mr. Moore is still safe in his Hypothesis for a bit longer. However, I do think that things are beginning to become more distributed, which will make it increasingly hard to measure performance increases.
We are approaching some theoretical limits with things such as IBM's lightgate switching. So I do not see the trend continuing without an expansion of the definition of performance.
#10
04/21/2004 (7:03 pm)
As Badguy pointed out, the performance barrier will not come from being unable to pack chips full of transistors, but rather from finding a way to feed these power-hungry chips. There was a decent Wired article about this a while back, lemme dig it up...
http://www.wired.com/wired/archive/12.04/start.html?pg=2
#11
04/21/2004 (7:59 pm)
Moore's Law had NOTHING to do with performance. It has everything to do with transistor count in processors, and frankly, nVidia is carrying the law on their shoulders at this point. Their new GPUs have a larger transistor count and die size than an Intel Pentium 4 EE (which has the 2 MB L3 cache on the die, which gives it one hell of a large die size).
"Weighing in at a hefty 222 Million transistors, NVIDIA's newest GPU has more than three times the number of transistors as Intel's Northwood P4, and about 33% more transistors than the Pentium 4 EE."
Just some other numbers here for transistor counts:
IBM PPC 970 = 55M
Intel P4 "Northwood" = 58M -> P4 "Prescott" = 125M
Without the L2 cache the numbers are:
IBM PPC 970 = 25M
Intel P4 "Northwood" = 28M -> P4 "Prescott" = 65M
-edit-
Oh, as far as what David said, I think the problem will be that a single processor will not show the true transistor numbers. What happens when we no longer sell single-processor systems (and Apple will probably be the first to do that for desktops) and you need to add both processors' counts? The north bridge/south bridge also does more chores for the CPU now (as the interfaces progress), and GPUs do the same in an even more intensive way.
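The counts quoted above can be checked against the claims in the same post with a few lines of arithmetic (only the figures given in the post are used; the "33% more than the P4 EE" claim can't be checked here because the EE's count isn't quoted):

```python
import math

# Transistor counts as quoted in the post, in millions (full die, with cache).
counts = {
    "NVIDIA newest GPU": 222,
    "Intel P4 Northwood": 58,
    "Intel P4 Prescott": 125,
    "IBM PPC 970": 55,
}

# ">3x the Northwood P4" claim:
ratio = counts["NVIDIA newest GPU"] / counts["Intel P4 Northwood"]
print(f"GPU vs Northwood: {ratio:.2f}x")  # ~3.83x, so the claim holds

# Northwood -> Prescott is roughly one full doubling (58M -> 125M):
doublings = math.log2(counts["Intel P4 Prescott"] / counts["Intel P4 Northwood"])
print(f"Northwood -> Prescott: {doublings:.2f} doublings")
```

So, measured by transistor count rather than clock, the generation-to-generation doubling was still happening in 2004 — just not in the number the original poster was looking at.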
#12
04/22/2004 (1:21 pm)
"Just because you bought an older chip doesn't mean that there are not chips 2x as peformant as what you had."The fact remains is that cost hasn't gone down according to better technology. Prices of older technology do not seem to be going down at the rate they should be, and the only explanation for such an occurence that seems reasonable is that the technology isnt improving as rapidly as it once was. I have to say though that you exagerated those performance boosts, as the benchmark tests Ive seen for many of those processors you were referring to did not show such a large increase. I do agree that it is mostly business decisions that are most likely causing this recent technological recession.
On the final note about Moore's law, I had always heard that it did involve performance as well, so uh... I guess I was wrong. But that doesn't change the fact that computer technology isn't upgrading as fast as it was 5 years ago by any stretch of the imagination.
#13
04/22/2004 (1:44 pm)
Believe what you want, Jonathan... but your opinion does not reflect reality. There was an article late last year that stated that not only was Moore's "Law" still valid, it was probably too conservative, just because of the POWER5 and AMD64 FX chips! The 64-bit processors are as performant as I said when running with matching OS, software, and hardware. They are MUCH more efficient than the brute-force crap that is shoveled out of Intel.
But you are not buying cutting-edge hardware anyway, so what do you care? Surf the two-year-old curve and enjoy your $100 processor that was $500 two years ago.
#14
04/22/2004 (1:52 pm)
Depends upon where you stretch it. Look at the latest GRAPHICS hardware, and it sure looks like it's equaling or exceeding Moore's law.
I think what we're seeing is a slowdown on the CONSUMER side of things. You may be capable of making a machine that is 8x or more powerful in terms of equivalent solid-state components. But will people buy it?
Remember - five years ago we were at the high point of the 'dot com boom.' Companies were staffing up IT departments and buying new systems up the wazoo. You had a bunch of geeks with more money than brains who could afford to upgrade their home systems every six months so they could be more competitive in their Unreal or Quake II games. Three years later, those same guys were unemployed, and all those expensive new machines companies had been snatching up like crazy were getting dumped on eBay for pennies on the dollar.
As far as CPUs are concerned - throughout the 90s, Intel and AMD and others knew that, to a large degree, games and multimedia applications drove hardware sales. How many 486s did DOOM sell? How many Pentiums did Quake sell? *PLENTY* In the late 1990s, this started to change due to the acceptance of 3D cards. But even in early 1999, a certain report stated that the adoption of 3D hardware was *STILL* far below what was expected, and games developed exclusively for 3D hardware were hurting in sales. This scared publishers to death, and they immediately began demanding that current projects ALL have software 3D support.
Well, that's changed. Your mainstream gamer HAS 3D hardware now... and that's bearing the load of his technological burden. His CPU isn't the bottleneck anymore. As a result, the demand for top-end, 'bleeding edge' CPUs has diminished. I *expect* the demand on 3D hardware is going to go down soon, too, if it hasn't already... simply because, aside from anything revolutionary occurring that changes visual quality (like pixel shaders, MAYBE), the quality is getting so good that a major increase in performance or quality goes unnoticed in the eye of the user.
So it's not technology that's changing. Simply the economy.
#15
04/22/2004 (6:03 pm)
Even with PCI Express and chips like the new nVidia 6800? I believe we are in for a huge leap in 3D technology (at least compared to the slow increases recently).
#16
04/22/2004 (9:31 pm)
I had often read that Moore's law is based on a 2-year timeline, not 1.5? Perhaps the original thesis was based on a 1.7-1.8 year timeline?
Too bad monitors don't fall into this category :( They seem to be the biggest embarrassment as far as progress in the computer field goes.
Oh, and BTW, don't forget that an application's frame rate has an upper limit that is a lot lower than you would expect! The Hz of the monitor affects the frame rate of, say, those 3D apps you use. This is not a joke. So while the CPU and GPU may be advancing in leaps and bounds, we are still stuck with a ridiculously low max theoretical frame rate due to this little bottleneck piece of technology that refuses to catch up.
This doesn't make sense to some, but try playing a high-framerate game on a 60 Hz monitor, then 72 Hz, then 85 Hz - you will notice the difference!
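The monitor-as-ceiling point can be illustrated with a small sketch of how vertical sync quantizes frame rate. This is a simplified model (it assumes double buffering where every frame waits for the next refresh; triple buffering and tearing-allowed modes behave differently):

```python
import math

def vsync_fps(render_ms: float, refresh_hz: float) -> float:
    """Effective frame rate under vsync: a frame's render time rounds up
    to a whole number of refresh intervals before it can be displayed."""
    interval_ms = 1000.0 / refresh_hz
    return refresh_hz / math.ceil(render_ms / interval_ms)

# A frame that takes 17.1 ms (just over one 60 Hz interval of ~16.7 ms):
print(vsync_fps(17.1, 60))  # 30.0 -- misses a refresh, drops to half rate
print(vsync_fps(17.1, 85))  # 42.5 -- same GPU, faster monitor, higher rate
```

This is why the same machine can feel noticeably smoother on an 85 Hz monitor than a 60 Hz one, as the post says: barely missing a refresh interval costs an entire interval.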
#17
04/23/2004 (7:01 am)
Moores "Law" is not about CPU mhz it is more about the general packing of data. Increased peformance is a side effect of this.Quote:
Moore's Law /morz law/ prov. The observation that the logic density of silicon integrated circuits has closely followed the curve (bits per square inch) = 2^(t - 1962) where t is time in years; that is, the amount of information storable on a given amount of silicon has roughly doubled every year since the technology was invented. This relation, first uttered in 1964 by semiconductor engineer Gordon Moore (who co-founded Intel four years later) held until the late 1970s, at which point the doubling period slowed to 18 months. The doubling period remained at that value through time of writing (late 1999). Moore's Law is apparently self-fulfilling. The implication is that somebody, somewhere is going to be able to build a better chip than you if you rest on your laurels, so you'd better start pushing hard on the problem. Most experts, including Moore himself, expect Moore's Law to hold for at least another two decades.
But increased performance does not automatically mean you need more transistors, and more transistors do not automatically mean you get more performance.
The Man Himself
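The curve in the quoted definition is easy to evaluate directly. The sketch below uses a late-1970s break point read off the quote ("at which point the doubling period slowed to 18 months"); the exact year of the slowdown is an assumption:

```python
def density_doublings(year: float) -> float:
    """Cumulative doublings of on-chip density per the quoted curve:
    one doubling per year from 1962 until ~1977, then one per 18 months."""
    if year <= 1977:
        return year - 1962
    return (1977 - 1962) + (year - 1977) / 1.5

# Density relative to 1962 is 2 ** doublings:
print(2 ** density_doublings(1977))  # end of the yearly-doubling era
print(2 ** density_doublings(2004))  # with the 18-month period afterwards
```

Note that even with the slower 18-month period, the formula still predicts billions of times the 1962 density by 2004 — which is why the quote treats the law as being about packing density, not clock speed.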
#18
04/28/2004 (2:51 pm)
I just finished a class on this stuff and had to do a research paper. Overall, the biggest bottlenecks right now are the buses and memory speeds. Modern motherboards have a "bus hierarchy" with various bus speeds depending on their function. Until we can get faster and wider buses and memory, we will not really feel the impact of increased raw MHz. My P4 1 GHz is not that much slower than my gf's 2.6 in overall use. Why? The CPU is not exhausted. Sure, hers would kick butt on SIMD ops or FPU stuff, but in overall varied use it's not proportionately faster. Here's a decent site:
http://www.pcguide.com/ref/mbsys/buses/funcHierarchy-c.html
And yes, Moore's law is solely about transistors, which has little to do with performance or price. The engineers still have to utilize the transistors well... I think they've already been reaching the wavelength limits of the lithography ("burning") process for chips.
Anyway, overall Moore's law doesn't mean much to us end users.
-s
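The "CPU is not exhausted" observation above is essentially Amdahl's law applied to clock speed: only the CPU-bound fraction of everyday use benefits from a faster clock, while the rest waits on buses and memory. A minimal sketch, with the 40% CPU-bound fraction being a purely hypothetical number for illustration:

```python
def overall_speedup(cpu_fraction: float, cpu_speedup: float) -> float:
    """Amdahl's-law estimate: only the CPU-bound fraction of the workload
    benefits from a faster clock; the rest is bus/memory-bound."""
    return 1.0 / ((1.0 - cpu_fraction) + cpu_fraction / cpu_speedup)

# Hypothetical: if only 40% of everyday use is CPU-bound, then a 2.6x
# clock bump (1 GHz -> 2.6 GHz) buys only ~1.33x overall.
print(round(overall_speedup(0.4, 2.6), 2))
```

That matches the anecdote in the post: a 2.6x difference in clock that barely shows up in general use, even though SIMD/FPU-heavy workloads (nearly 100% CPU-bound) would see close to the full 2.6x.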
#19
04/28/2004 (5:23 pm)
Quote: "anyways, overall moore's law doesnt mean much to us end users"
...anymore - heheh. 15 years ago, rendering a single frame on a state-of-the-art desktop platform, at not even half the resolution available today, took 45+ minutes.
It -really- is amazing if you've been in the industry long enough to even be able to fathom the generational leaps that have been made.
Real-time ray tracing? You have GOT to be kidding (in 1987).
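The gap described above is worth putting in numbers. Comparing 45 minutes per frame then against roughly 30 frames per second now (and ignoring the resolution difference the post mentions, so this understates the real gap):

```python
import math

# 45+ minutes per frame in ~1989, versus ~30 fps in real time in 2004.
then_seconds_per_frame = 45 * 60  # 2700 s
now_seconds_per_frame = 1 / 30    # ~0.033 s

speedup = then_seconds_per_frame / now_seconds_per_frame
doublings = math.log2(speedup)
print(f"{speedup:.0f}x faster, ~{doublings:.1f} doublings in 15 years")
print(f"=> effective doubling period ~{15 * 12 / doublings:.0f} months")
```

An 81,000x improvement works out to a doubling roughly every 11 months — faster than even the original yearly version of Moore's law, because rendering throughput gained from GPUs, algorithms, and memory, not transistor count alone.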
#20
04/28/2004 (5:36 pm)
Heya guys. Thought I'd add this:
Quote:
640kb ought to be enough for anybody.
---------------
Quote:
WWW? Nice toy, but what a waste of time.
Both said by Bill Gates @ Microsoft. :)
Torque 3D Owner Jay Barnson
But as far as raw processing power goes --- I dunno. One thing you have to bear in mind is that we're still coming out of a recession, and the last 2-3 years have been something of a slump in computer hardware sales. This means that manufacturers have been less aggressive about introducing new products overall.
Ultimately, Moore's Law is really just an analysis of an old trend rather than a law - a trend which most likely WILL reach physical limits unless a radical new technology (like quantum computing) is adopted as a standard.
I wouldn't fret about it too much.