On the Horizon: Physics Processing Unit (PPU)
by Stephen Zepp · in General Discussion · 03/08/2005 (3:58 pm) · 27 replies
Something to think about for those games with 2+ year dev cycles: Ageia PhysX.
#22
09/09/2005 (1:42 pm)
Wow... it seems that the answer these days to every problem, be it graphical or gameplay, is... throw in another processor... only... one that's very specialized! I have a feeling that not only will networking be a nightmare with different physics cards (wtf... physics cards? What a way to piss off the market... now they have to buy physics math cards in addition to video cards!), but that programming will be a pain in the ass as well. Compatibility is one thing when you're trying to make the graphics adapt to non-pixel-shader cards in a game intended to support pixel shaders. Add this in, and you have to program a different physics engine for non-physics-card users... and imagine having to adapt your designs to multiple physics engines... it would be a nightmare :M
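The multiple-engine nightmare described above is usually tamed with an abstraction layer: gameplay code talks to one interface, and each physics backend (software, PPU-accelerated, console-specific) plugs in behind it. Here's a minimal C++ sketch of that idea — all names are hypothetical, and the software backend is just a toy gravity integrator, not any real SDK's API:

```cpp
#include <memory>
#include <vector>

struct Vec3 { float x, y, z; };

// The one interface gameplay code is allowed to see.
class PhysicsBackend {
public:
    virtual ~PhysicsBackend() {}
    virtual int addBody(const Vec3& pos, float mass) = 0;
    virtual void step(float dt) = 0;
    virtual Vec3 position(int body) const = 0;
};

// Baseline software integrator: every player can run this path.
class SoftwareBackend : public PhysicsBackend {
    struct Body { Vec3 pos; Vec3 vel; float mass; };
    std::vector<Body> bodies_;
public:
    int addBody(const Vec3& pos, float mass) override {
        Body b = { pos, {0.0f, 0.0f, 0.0f}, mass };
        bodies_.push_back(b);
        return static_cast<int>(bodies_.size()) - 1;
    }
    void step(float dt) override {
        const float g = -9.81f;          // gravity only, for brevity
        for (auto& b : bodies_) {
            b.vel.y += g * dt;
            b.pos.y += b.vel.y * dt;
        }
    }
    Vec3 position(int body) const override { return bodies_[body].pos; }
};

// Factory: gameplay code never knows which backend it got.
std::unique_ptr<PhysicsBackend> makeBackend(bool ppuPresent) {
    // A PPU-backed implementation would be returned here when one is
    // detected; this sketch only has the software path.
    (void)ppuPresent;
    return std::make_unique<SoftwareBackend>();
}
```

With this shape, "supporting the card" becomes writing one more subclass rather than forking the game's design.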
#23
09/09/2005 (2:12 pm)
Any speculation on whether these things will sell at this point is entirely... errr, speculation. I'm sure there are more than a few here who remember the discussions that raged about how useless 3D graphics acceleration would be because of the various limitations it introduced in the way scenes could be managed, the way some contemporary (at the time) "hacks" were incompatible with more mathematically strict 3D, etc., etc. Not to mention how adamant people were that no one would pay a few hundred bucks for something like that! When 3DFX came along with a better renderer/API, people adopted it quite quickly. Then other companies got competitive, and where are we now? ATI or NVidia? I remember how freaked out I was the first time I had two cards running in SLI (man, was that expensive). Enter John Carmack and his push for better tech, communicating what developers needed to see on those cards to deliver a better end-user experience. Then Epic, with their bone-crunching resource requirements (more state changes, higher-rez textures, etc.).
How necessary is a 3D accelerator in creating a good, fun game? How necessary is 256MB of RAM? How necessary is a >500 MHz processor? Now raise your hands if your machine meets all these criteria and you're trying to develop a game that exploits at least a good percentage of it. Those are fairly conservative specs these days - my MOM has a faster machine than that (Solitaire just screams!).
All I'm saying is that love of technology is a great thing. Otherwise we'd all still be fighting with segmented memory models and extenders on monochrome graphics. Unless I were employed by a company making PPUs (in which case I'd be a little more aggressive in my excitement), I'd consider myself fairly excited to see something else in our favorite little boxes attempting its Darwinian birth cycle.
Whether it succeeds or fails is up to developers (like us) enabling the technology, and up to customers who do or don't care about it enough to buy it. Maybe processor speeds will eclipse market adoption because no one develops for it in time; maybe they'll work out a strategic partnership with a graphics card or motherboard company soon after its introduction; maybe someone will get the idea for the next generation-defining hit, the kind we haven't seen since Wolf3D or Doom I. Dream or doubt?
I know I just spewed a lot of subjective philosophy, but the point I was meandering towards was that I would expect to see a lot more positive energy from developers on this. Personally at this point, I'll buy one when they come out, and I'll code some demos, prototypes, maybe even a game idea or several. I'm getting too old to bet the farm on them succeeding, but if I were a little younger with time/energy to burn, I'd look at the discussions right here and have some wheels turning. There are lots of problems to be solved and you have two basic paths open to you: be creative and develop some solutions that circumvent the scientific/mathematical/traditional routes, or put those math degrees/obsessions to good use.
Anyway, this should have been an editorial in a dev magazine or something....
Now where is my fully integrated desktop holodeck?
#24
09/10/2005 (1:54 am)
If consoles adopt it, though, there's more than a good chance that computers will have to.
#25
09/10/2005 (4:48 am)
The 360 and PS3 don't have PPUs. The 360 uses one or two of its three PPC cores to handle the physics simulation, while the PS3 uses the Cell units to do the same. They DO use the physics libraries, though, which CAN be used without a PPU, just like OpenGL CAN be used without an accelerated video card.
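The "dedicate a core to physics" approach the consoles take can be mimicked on any multi-core machine by running the simulation step on its own thread while the main thread does other frame work. A rough sketch, with a toy integrator standing in for a real physics library (all names here are illustrative, not any real SDK's API):

```cpp
#include <functional>
#include <thread>
#include <vector>

struct Particle { float y, vy; };

// Trivial gravity integrator standing in for a full physics step.
void stepPhysics(std::vector<Particle>& ps, float dt) {
    for (auto& p : ps) {
        p.vy += -9.81f * dt;
        p.y  += p.vy * dt;
    }
}

// One frame: physics runs concurrently on a worker thread, the way a
// console might park it on a spare core, then we join before game
// logic reads the results.
void frame(std::vector<Particle>& ps, float dt) {
    std::thread physics(stepPhysics, std::ref(ps), dt);
    // ... main thread would issue draw calls for the previous frame here ...
    physics.join();   // simulation results are now safe to read
}
```

The join is the important part: nothing touches the simulated state until the physics thread has finished, so there's no data race on the particle array.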
#26
09/10/2005 (11:42 pm)
I think the software lib Ageia PhysX is more interesting than the PPU chips. What if Torque eventually supported it? That would be interesting.
As for PPU vs. non-PPU card owners in multiplayer gaming - eh, not a big deal. If you wanted to be really fair, you could have separate servers for software-only players and others for PPU owners.
But does anyone do that today for broadband vs. analog dialup gamers? No, I think they just frag the dialup users and laugh at them, right? ;-)
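The separate-servers idea boils down to tagging each server with a capability requirement and filtering the browser list by what the client can actually run. A minimal sketch (the `ServerInfo` struct and `joinable` helper are hypothetical, not part of any real server browser):

```cpp
#include <string>
#include <vector>

struct ServerInfo {
    std::string name;
    bool requiresPPU;   // true = hardware-physics-only server
};

// Return only the servers this client is allowed to join:
// a PPU owner can join anything, a software-only client sees
// just the servers running the software simulation.
std::vector<ServerInfo> joinable(const std::vector<ServerInfo>& all,
                                 bool clientHasPPU) {
    std::vector<ServerInfo> out;
    for (const auto& s : all) {
        if (!s.requiresPPU || clientHasPPU)
            out.push_back(s);
    }
    return out;
}
```

The same one-flag filter would work for any capability split — it's no different in principle from servers that gate on content packs or connection quality.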
#27
09/10/2005 (11:44 pm)
Also note that Softimage is saying it's 100 times faster than ODE, the old physics engine. OK, even accounting for marketing hype, that's still impressive.
Torque Owner Philip Mansfield
And the little snippet above simply mentions the SDK, which could mean a software solution for the PS3, so it wouldn't be that expensive (compared to a combined hardware-and-software solution).