SLI - Strange Issue
by Jeff Loveless · in Hardware Issues · 11/01/2006 (6:03 pm) · 11 replies
I noticed this issue only after combing through BIOS settings, drivers, and even inspecting all my PC components, on the verge of booting my PC out the door.
An associate of mine, who's developing in Torque as well, and I were both looking at the new TGEA demo. He told me he was getting around 200 FPS. I was naturally curious how my rig would stand up to that, so I checked, and to my astonishment I was seeing 125-130. So I asked my friend, what are you running? He said a newer P4 and a GeForce 6800 GT. I run an AMD X2 and dual 7800 GTs in SLI!! Processor differences and preferences aside, I figured the two 7800s would outclass it.
At first I figured, oh that's right, the dual-core issue, and when I checked, sure enough the guys at GarageGames had thought to set the program's affinity to only one core. That's when I checked over my setup and made sure it wasn't user error.
Ready to pitch the entire thing out the window, it finally dawned on me that the only thing I hadn't tried was running it on only one GPU; in the NV control panel's SLI options you can choose single-GPU rendering... (worth mentioning at this point, multithreaded optimization got turned off as well). Well, interesting results: 205 FPS. Not what I was expecting from the difference between a 6800 and a 7800, but I gained at least 70 FPS just by ditching one of my cards and forgoing SLI??? Anyone very enlightened care to explain???
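For anyone wanting to reproduce the single-core test without digging through Task Manager: pinning a process to one core can be scripted. A minimal sketch (my own, not from the demo's code; `os.sched_setaffinity` is the Linux call, and on Windows the equivalent is `SetProcessAffinityMask` or Task Manager's "Set Affinity"):

```python
import os

# Restrict the current process (pid 0 = self) to a single CPU core,
# similar to what the TGEA demo reportedly does on dual-core machines.
# Linux-only API; on Windows use SetProcessAffinityMask instead.
def pin_to_one_core(core=0):
    os.sched_setaffinity(0, {core})
    return os.sched_getaffinity(0)  # the mask actually in effect

print(pin_to_one_core())  # e.g. {0}
```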
#2
11/01/2006 (7:33 pm)
Funny thing is, in the TSE MS4 water demo I get around 300 FPS single-GPU, and not so bad of a hit in SLI, 289 FPS or so. And having both CPU cores enabled in the water demo actually seems to be a benefit, because I lose 20 FPS or so rather than gain FPS when I disable one. (In TGE, you'll get quite the performance boost by not letting it run on both cores.)
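For comparisons like this, averaging over a fixed window is steadier than eyeballing the on-screen counter. A minimal sketch, where `render_frame` is a stand-in for whatever draws one frame in the engine:

```python
import time

# A minimal wall-clock FPS counter (a sketch; render_frame stands in
# for the engine's per-frame call). Averages over a fixed window.
def measure_fps(render_frame, seconds=2.0):
    frames = 0
    start = time.perf_counter()
    while time.perf_counter() - start < seconds:
        render_frame()
        frames += 1
    return frames / (time.perf_counter() - start)

# Stand-in workload: a 5 ms "frame" caps the result near 200 FPS.
print(round(measure_fps(lambda: time.sleep(0.005), seconds=0.5)))
```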
#3
11/01/2006 (8:15 pm)
Don't get me wrong, I love my Torque; it's just that dual cores and SLI/Crossfire setups are becoming more and more popular. I guess I'm fishing for someone with the gumption to search for a way to support these features.
Edit: aixelsiD
#4
11/04/2006 (7:13 am)
Anyone else running SLI?
#5
11/04/2006 (11:14 pm)
OK, I was going to just forget about this, but more experimenting leaves me with more questions. I switched from OpenGL over to D3D, and in SLI mode it was rendering the top half of the screen but not the bottom half; I'm assuming it was in SLI split-frame rendering mode. Anyway, in my frustration I ended up at the developer.nvidia.com site and took a look at their latest SDK. It has some tips for good practices in programming for SLI; I'll update this thread if I learn anything from it.
As is, there's a significant hit to performance when SLI is enabled. At the least it would be good if the build could make a profile for itself to disable SLI.
#6
11/09/2006 (9:49 am)
Never encountered a hit at all on 7 series hardware.
Not an SLI issue; a TGEA issue, I believe, in that they didn't write code to talk to nvidia's drivers about SLI.
What you have to do is either read through the nvidia SDK or create a profile for TSE.EXE or whatever. Set the SLI mode to Alternate Frame Rendering 1 (MS3) or Alternate Frame Rendering 2 (MS4). Split-frame also works, but causes very mild artifacts on surfaces which use the backbuffer (water, cubemaps).
That's the same for my twin 7600GTs and 7900GTs. I'll tell you how it goes for twin 8800GTXs in a few days.
#7
11/14/2006 (9:09 pm)
Jonathan, I'm very interested to hear how the 8800 series works with Torque.
Also, have you compared your results with the different SLI modes to your performance when it is set to single-GPU rendering? I've played around with different drivers and consistently get better performance with SLI disabled for Torque, in every version/build I've seen from TGEA and TGE 1.42-1.5.
If you don't see your framerate increase when you set your Torque profile (in the nvidia control panel) to single-GPU rendering, let me know what driver version you're using so I can test with that version.
#8
11/22/2006 (9:50 am)
No increases for single-GPU rendering. Running single cut my framerate nearly in half (figures, SLI is usually 80 to 90% faster for me).
I've noticed, though, that the version of the DirectX SDK you use does have some effect. It's the reason why Alternate 1 used to be good but then suddenly sucked, leaving Alternate 2 as the new good one where it previously sucked. The DirectX SDK also (as in the case of the 8800) affects whether you get terrible artifacts in TGEA's DRL and HDR implementations.
I used to use the dhZeropoint drivers, but nvidia's drivers have really caught up with the bugs in SLI, so I've been using the latest versions.
What motherboard are you using? Also, have you tried switching the roles of the cards (meaning which slot they're in)? With my 7600GTs both cards run perfectly fine when they're alone, but if Card B is the primary, SLI is rather buggy, whereas Card A being the primary results in very happy SLI. Both cards are identical and purchased at the same time, and their serial numbers vary by only one digit. Emphasis that they both work equally well alone, but only work together in a certain order for some reason.
Have you tried increasing the rate of the PCI-E bus? From 100 to, say, 110 or 115?
My specs:
Asus P5ND2-SLI
P4 Cedar Mill 3GHz OC'd to 4.1GHz
2GB RAM
2x 7600GT
or 2x 7900GT (sold them, so I no longer have them)
or 1x 8800GTX (at the last minute I wiped the 2nd; as I said, $1300+ isn't worth it)
The 8800 has problems in relation to the version of the DirectX SDK you use (I posted a bug note in the bug forums which has 1280x1024 screenshots), but other than that it's a steady 250-300 FPS, as long as you stay away from that old demo with dancing Kork, which usually hovers around 100 FPS but will drop to 60 FPS. DRL and HDR each only hit frame rates by about 5%, estimated.
#9
11/27/2006 (2:50 pm)
I have the Asus A8N32-SLI Deluxe board with an AMD X2 3800, latest BIOS. I know there's dual-core issues as well, but I've been testing this SLI problem with the CPU affinity set to one core. I haven't tried changing the PCI-E bus speed, or any modified drivers, only recent WHQL release sets. I did try swapping them around, with no luck. SLI worked great in Oblivion, Doom 3, Quake 4, CoD2, HL2. I just took the other card out since I'm more focused on development than gaming at the moment.
I'm planning to ditch this rig and build a new system, hopefully by this coming June, in favor of something like an Intel 965 chipset or NV 680i, single core, single 8800GTX, 2GB DDR2 1066, depending on what the reviews say. No more early adopting hardware for me, thanks.
edit: I'll try the dhZeropoint drivers and throw the other card back in and see if I see any improvements. This board lets you pick which card is the A card or B card from the BIOS; I tried switching it that way as well.
#10
11/28/2006 (8:30 pm)
Did you try creating a game profile in the nVidia control panel (or the older thing) and manually setting the SLI rendering mode? I have not tested SLI antialiasing, but if it reacts similarly to how TGEA reacts to the 8800GTX's hardware AA, then it's definitely capable of making SLI run worse than single-GPU rendering.
In the client/prefs.cs file, did you disable kill frames ahead?
Also, in the nVidia control panel (using the classic view browser) (or the older thing), do you have "direct3d settings->render frames ahead" set to anything greater than 0? It's necessary for either of the alternate frame rendering methods to work correctly.
If kill frames ahead is on and/or render frames ahead in the GPU settings is 0, it'll hurt SLI if it tries to go to alternate mode or is being forced into alternate mode, because it'll still incur the drain involved in coordinating dual graphics but you won't get the benefits.
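The render-ahead interaction can be pictured with a toy scheduling model (my own illustration under idealized assumptions, not actual driver behavior): if the CPU blocks until each frame finishes, the second GPU never overlaps with the first, so you pay SLI's coordination cost for one GPU's throughput.

```python
# Toy model of alternate-frame rendering: frames_ahead == 0 means the
# CPU waits for each frame before issuing the next, so the GPUs never
# overlap; frames_ahead >= 1 lets frames alternate in parallel.
def afr_fps(num_gpus, frames, gpu_time, frames_ahead):
    gpu_free = [0.0] * num_gpus  # time at which each GPU goes idle
    cpu = 0.0                    # time at which the CPU can issue a frame
    done = 0.0
    for f in range(frames):
        g = f % num_gpus                      # alternate-frame assignment
        start = max(cpu, gpu_free[g])
        gpu_free[g] = start + gpu_time
        done = max(done, gpu_free[g])
        # blocking CPU vs. queueing ahead
        cpu = gpu_free[g] if frames_ahead == 0 else start
    return frames / done

print(round(afr_fps(2, 100, 0.01, frames_ahead=0)))  # single-GPU rate: 100
print(round(afr_fps(2, 100, 0.01, frames_ahead=1)))  # overlapped: 200
```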
#11
11/30/2006 (8:42 pm)
Kill frames ahead? No, I think I would have left that at its default setting; I'll give it a go next time I throw the other card in. I know render ahead in the nvidia settings was/is set to 3.
Edit: SLI can take a hike; I sold my other card. It really only helped in one or two games. Maybe motherboard manufacturers and graphics card manufacturers will improve upon this technology in the future, but I have to say this problem was 100% hardware.
Torque Owner R.Bowers