Shocking frame rate drop in OpenGL
by Andy Hawkins · in Torque Game Engine · 11/07/2005 (12:57 am) · 5 replies
When I run my demo in Direct3D in Torque it screams along, but when I run it in OpenGL in Torque the frame rate drops considerably.
Is this a texture size issue? Some of my textures are 256x256 or 512x512.
I'm stumped. It also seems to depend on which OpenGL-compatible card I'm using. A GeForce FX 5950 has a massive frame rate drop (30 frames per second) while on a Radeon 9600 it's not too bad (10 frames per second) in OpenGL.
#2
11/07/2005 (2:12 am)
Typically this indicates some driver weirdness. The DX wrapper should be slower in all "normal" cases (i.e., same FSAA settings, working drivers in both cases, no major issues going on), because it does more work and makes more hits to the driver in many cases. Are you sure you don't have something like FSAA forced high in one case and not in another? Many drivers let you specify this on a per-API basis.
#3
11/07/2005 (2:33 am)
After a few tests changing the driver settings for OpenGL and D3D on an ATI Radeon 9600, OpenGL rendered faster even at high-quality settings as well as at high performance. The D3D frame rate improved only marginally when I dropped the driver down to high performance instead of high quality. So OpenGL was faster all round on the Radeon - on the GeForce the situation is reversed. So I think what I need to do is sample the frames per second every now and then and drop some of the detail of objects down in realtime to cope - this way it won't matter what card the user owns.
Is there a function in Torque to detect the frames per second? Is it possible to force a lower LOD on a DTS object in realtime?
ADDED
I checked the FSAA (Full Scene Anti-Aliasing) settings and they are all on "API controlled" for both OpenGL and D3D.
#4
11/09/2005 (11:17 am)
Quote:Is there a function in Torque to detect the frames per second?
In script, use this (just open the console and type it if you want):
metrics(fps);
#5
11/09/2005 (1:49 pm)
Got it thanks :)