Game Development Community

Does T3D make use of SLI setups?

by Netwyrm · in Hardware Issues · 09/17/2014 (11:28 am) · 5 replies

I need to take some nicer screenshots, and so am contemplating an upgrade to the nVidia 780 card(s).

I've been "SLI-capable" for the last two years, but really haven't noticed any performance difference whether I toggle it on or off. These days the extra card just sits in the case.

To be fair, SLI doesn't seem to have much effect on heavily-networked multiplayer, or at least none of the games I've played have been able to take advantage of it (MUDs and MMOs). It's not just Torque, it's the genre, I sometimes think... enchanted worlds of fantasy that ALL run at 12 fps :)

I wanted to ask--is SLI considered worth it for working with Torque?

My game is built off the T3D 1.1 code platform--if Torque usually sees a big performance boost from SLI (or Crossfire), then I must have been doing something wrong. I'm okay with it being my fault, but I'd sure like to know what the consensus is before I blow $500 on a second card and have it also just sit there.

About the author

My adventures in T3D are chronicled at http://www.worldofantra.com. Please be aware the subject is sword-and-sorcery, and the occasional bloody or bare body part may be in scope.


#1
09/17/2014 (1:28 pm)
The way I've always understood it, SLI is handled at the driver level - the game shouldn't even care if you have one, two, or sixty-five video cards. The driver is supposed to handle how the workload is distributed to the hardware. I might be wrong, though.
#2
09/17/2014 (4:02 pm)
@ Richard Ranft,

Well, I'm looking at it. I'm not going to pretend I understand a lot of it, but among the interesting things I've found so far is an nVidia document that talks about SLI profiles and specific modes for inter-card operation. That leads me to think there would have to be something from the game describing to the card how that application should render, because "I need this mode" or "I need that mode" seems like the kind of thing an application would need to tell the hardware.

Can't say, and I sure can't generate the code involved, but it sure is interesting.

nVidia has a list of SLI-enabled games on their site. I have D&D Online: Stormreach installed, so I am going to run around for a bit with SLI on and SLI off and see if there is any difference I can notice.

The nVidia doc I referenced above (developer.download.nvidia.com/whitepapers/2011/SLI_Best_Practices_2011_Feb.pdf) mentions that if you name your app suchandsuch.exe it will invoke a particular SLI method, which, if it works when I try it later, will certainly be revealing. It may be BS, though--the white paper is from 2011--but I'll try it anyway. If there is any perceptible difference, it will be another breadcrumb toward doing it deliberately.
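If I do get around to trying it, the mechanics are trivial: the driver apparently just keys its SLI profile table on the executable's filename, so you copy your game binary under a name the driver recognizes. A throwaway sketch of that (the profile filename here is a placeholder I made up--use whatever name the whitepaper actually specifies):

```python
# Sketch of the whitepaper's executable-renaming trick: the driver
# matches SLI behavior by exe name, so a copy of the game binary under
# a driver-recognized name can pick up a different SLI mode.
# "SomeProfiledName.exe" below is hypothetical, not from the doc.
import shutil

def copy_under_profile_name(game_exe, profile_name):
    """Copy the game binary so the driver matches it to an SLI profile."""
    shutil.copyfile(game_exe, profile_name)
    return profile_name

# Usage (hypothetical paths):
# copy_under_profile_name("MyGame.exe", "SomeProfiledName.exe")
```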
#3
09/17/2014 (4:26 pm)
Ah - if it's on a Windows machine then Nvidia's GeForce Experience software can set "optimal modes" for various games, so maybe they handle some of that too. I know that Nvidia and ATI used to have control panel applets that let you specify "game" and "desktop" settings, including how antialiasing, multisampling, and other fairly low-level things are handled.

I think it's interesting that they want you to name your application something specific.

With all of that, I'm pretty sure that Torque completely ignores that stuff.... lol.

Honestly, it would be better if Nvidia handled that transparently through the driver - why should I have to tell the thing I want it to use SLI? All that does is split the work between the cards - why does the application care? It just feels too much like a manufacturer saying "look how fast our cards are! If you want to use that you have to jump through these vague and often hidden hoops!" I'm not sure why we, as developers, stand for that from manufacturers. I think I'll put a splash screen in all of my games saying that "this game doesn't support SLI because your video card manufacturer is lazy."

Also - I'm not sure what the problem might be, but I've run WoW, SWTOR, ESO, Wildstar, and TSW all well over 20 fps (usually in the 35~65 fps range) without using SLI - just straight single card setup, and max or near max settings. How old is that motherboard, man?
#4
09/17/2014 (4:44 pm)
Oh, oh, I see what they've done. It's no longer "Scan-Line Interleave," it's "Scalable Link Interface." The "default AFR mode" (Alternate Frame Rendering) seems like it wouldn't do much for you - actually, they say
Quote:
One of the most common modes is AFR. By default, when AFR mode is forcefully enabled for a given application using the NVIDIA control panel, the driver has to allow all inter-GPU synchronization and communication required to handle inter-frame dependencies and guarantee the correctness of the results. This typically will lead to no SLI performance scaling.
Also, you don't gain any video memory -
Quote:
In all SLI-rendering modes all the graphics API resources (such as buffers or textures) that would normally be expected to be placed in GPU memory are automatically replicated in the memory of all the GPUs in the SLI configuration. This means that on an SLI system with two 512MB video cards, there is still only 512MB of onboard video memory available to the application.
Boo. This also apparently requires the two cards to "sync" between frames, because the Render to Target section mentions stalling while waiting for updates from the other GPU. Terrible.

I see what they were getting at with the application name - it's a "cheat mode" to try to test out your maximum optimization potential at the risk of introducing "rendering artifacts."
#5
09/17/2014 (7:31 pm)
And the cheat mode works, too... although it results in the terrain rendering every other frame from a different angle, which is a helluva "artifact".

I ran D&D Online with and without SLI enabled. At a very quick glance, with SLI it ran about 15 fps slower (at 50-65 fps), but the graphics were vastly better shaded and richer, with an improved sense of distance; without SLI it ran about 10 fps faster (at 70-75 fps) but with flattened textures and much less lighting detail.

That's with no other adjustments between the two modes, at 1920x1200. But there does appear to be a perceptible difference for that game when SLI is enabled.
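If I want to compare the two runs less casually than "a very quick glance," something like this would do: run a fixed-duration frame loop in each mode and compare average fps. This is just a sketch of the idea in Python, not hooked into the game's actual render loop:

```python
# Rough fps comparison harness: call a render function repeatedly for a
# fixed window and report the average frames per second. In a real test
# you'd point render_frame at the game's per-frame work and run one pass
# with SLI on and one with it off.
import time

def measure_fps(render_frame, seconds=10.0):
    """Call render_frame() in a loop for `seconds`; return average fps."""
    frames = 0
    start = time.perf_counter()
    while time.perf_counter() - start < seconds:
        render_frame()
        frames += 1
    elapsed = time.perf_counter() - start
    return frames / elapsed
```

An average over a fixed window smooths out the frame-to-frame jitter that makes eyeballing an fps counter unreliable.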