Major GeForce 6600 GT performance issues
by Tom Spilman · in Torque Game Engine Advanced · 05/24/2005 (1:53 pm) · 45 replies
So we have a map that performs beautifully on an ATI 9800 Pro but is dog slow on my GeForce 6600 GT. I'm getting 2-3 fps as reported by metrics(fps) and NVPerfHUD, while on the 9800 Pro we get 80-100 fps.
So I started by trying to force the pixel shader level down. I tried forcing it to 2.0, 1.4, 1.1... all of them seem to give me the same performance. I did this by changing the prefs to:
$pref::Video::forcedPixVersion = 1.1;
$pref::Video::forcePixVersion = 1;
Which seems to work, as the console reports it's forcing shaders down... but I can't be sure it's not lying to me. =)
Next I fired up NVPerfHUD. It shows a tremendous amount of time being spent in the driver. When I use the T key to force 2x2 textures, performance comes up to a respectable frame rate. So according to NVIDIA's docs, that means I'm limited by texture bandwidth. Now I do have a lot of large textures in the scene... but the 9800 seems not to mind.
So the 6600 GT, which is supposed to be a better-performing card than the 9800 Pro, has a texture bandwidth issue while the 9800 does not. That seems illogical to me.
I suspect one of the shaders is the culprit here. Anyone have any tips to help diagnose this problem?
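For a rough sense of why the 2x2-texture test points at texture bandwidth, here is a back-of-envelope sketch. This is plain Python, not engine code, and the texture count and sizes are made-up examples, but it shows how quickly large uncompressed textures add up per frame, and what a block-compressed format like DXT1 buys back:

```python
def texture_bytes(width, height, bytes_per_texel=4, mipmapped=True):
    """Approximate VRAM footprint of one texture.

    A full mip chain adds about one third on top of the base level
    (1 + 1/4 + 1/16 + ... ~= 4/3).
    """
    base = width * height * bytes_per_texel
    return int(base * 4 / 3) if mipmapped else base

# Hypothetical scene: twenty 1024x1024 uncompressed RGBA8 textures.
scene = [texture_bytes(1024, 1024) for _ in range(20)]
total_mb = sum(scene) / (1024 ** 2)

# If most texels get sampled every frame at a 60 fps target, the
# sustained fetch rate approaches this many GB/s of texture reads:
bandwidth_gb_s = total_mb * 60 / 1024

# DXT1 stores each 4x4 texel block in 8 bytes (0.5 bytes per texel),
# an 8:1 reduction versus RGBA8 for the same texture set.
dxt1_total_mb = total_mb / 8
```

Forcing 2x2 textures effectively drives `total_mb` toward zero, which is why the frame rate recovering under that test is taken as the texture-bandwidth signature.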
About the author
Tom is a programmer and co-owner of Sickhead Games, LLC.
#42
08/27/2007 (10:04 am)
Welcome back to an old thread! Not to wake a sleeping dog here, but we're experiencing a problem today with the current release of TGEA. Our current environment runs fine on most cards in the office, including as far down as the GeForce Go 7800 in a laptop, but our frame rates are absurdly slow running the same environment specifically on the Quadro FX 550, which should be a better card. Is there just no support for the Quadro's specific GPU in TGEA? We can't seem to figure out any other reason for this. Thanks.
#43
08/27/2007 (10:28 am)
Quadro FX cards are meant to be slower than gaming cards. Rendering cards sacrifice speed and speed-optimized drivers for higher precision and precision-optimized drivers.
During the GF5-GF6 era that was no problem, as the chips used were more or less identical. But from the GF7 onward, NVIDIA split them apart again to get more speed (to compete with ATI), which sadly means dropping the high-precision hardware.
On the other hand, the CAD/rendering side builds heavily on those units, so there the components mainly needed for real-time, high-speed rendering were dropped instead.
The same thing happened with the drivers.
#44
08/29/2007 (5:43 am)
Just to let you all know, we're running our game with no problem on ATI cards; even my X700 has no trouble to speak of. But when we presented our game to a game publisher for the first time, it was unplayable on one of their high-end PCs. I noted down the spec. It was a:
Dell Precision PWS390
Intel Core 2 CPU 6600 @ 2.4 GHz (2.93 GHz)
2 GB of RAM
NVIDIA Quadro FX 3450/4000 SDI
We had the same problem at another game company, also using Dell PCs. I'm guessing it's the same problem; they'd all have been using Quadros, I think, although that was when our game was on 1.4, before we ported to TGEA. You'd have one PC running fine, then another Dell next to it chugging away, when usually it works fine even on laptops... it was the first time we'd ever seen any massive problem with our frame rate.
If this is all true, that kind of makes me feel a bit better, as your usual punter will presumably be going for a different card than these... I wasn't sure where to point the finger, though: graphics card, dual-core, sound card, etc.
H.
#45
08/30/2007 (3:08 am)
Yes, using a CAD card will give a lower frame rate than a "same series" gamer-oriented card, especially on shader-intense / pure-shader stuff like TGEA. The other thing you point out sadly still seems to be true as well: TGEA runs better on ATI than on NVIDIA.
There are a few threads on that, one with an NVPerfHUD analysis of the issue.
No idea how this actually happened, as ATI's shader support for more powerful operations always lagged far behind NVIDIA's... (vertex texture fetch missing until the X1300+, and the like)
Sim Ops Studios (#0002)