Performance issues.
by Bradley Newton Haug · in Torque Game Engine · 02/18/2003 (7:18 pm) · 36 replies
OpenGL : Realmwars release build map 'more terrain' : 23-25 fps in the middle of town
DirectX 9 release dlls : 4-10 fps
OpenGL : new map 45-50 fps
DirectX 9 : new map 20-25 fps
OpenGL: Unreal 2003 85-100fps give or take no matter where.
and oddly
Tribes2: OpenGL I get about 50fps constant.
platform : windows
card : radeon 9700; other box has a ti6400 g4, and I get about 10 fps less per test on it. Machine is an XP 2600.
-brad
#22
02/19/2003 (12:10 pm)
Pixel shaders definitely exist 'as a standard' in OpenGL: ARB_fragment_program.
#23
02/19/2003 (12:35 pm)
Show me one OpenGL program using those extensions that works with both ATI and nVidia. I would say it's ATI's fault, but I don't really care; the reality of the situation is that there is no cross-company shading solution for OpenGL as of today. If I am wrong, provide me with a link to source. Every single one that I've seen uses ATI_ and NV extensions. According to both nVidia and ATI, the new ARB extensions are now in their drivers (that wasn't the case last time I checked).

I missed the fragment program hoopla, apparently; I didn't realize they had gotten around to pixel shading. Seeing that vertex still hadn't made it to implementation, I suppose I didn't pay any attention.
-brad
edit: changed cross platform to cross company
#25
02/19/2003 (3:00 pm)
the vertex programs on that page use Cg, and Cg OpenGL shaders do not work under ATI.
thanks though
-brad
#26
02/19/2003 (3:06 pm)
uh, you know there are ARB extensions for fragment and vertex programs, right?
--KallDrexx
#27
02/19/2003 (3:12 pm)
Has something drastic happened since December, when you had to use NV_vertex_program? Press releases aren't implementation. I want to be wrong, but I am under the impression that you can't use the ARB vertex extension on a Radeon.
-brad
#28
02/19/2003 (3:40 pm)
1) You can use both ARB extensions on both Radeons and Geforces
2) Cg compiles to ARB_vertex_program and ARB_fragment_program targets - which work on both ATI and nVidia chipsets.
3) Check the example on that page titled: Simple Vertex & Pixel Shader (ARB_vertex_program, ARB_fragment_program)
#29
02/19/2003 (3:41 pm)
I need to clarify something.
On the system I listed above, I get 100 fps in OpenGL; using D3D I get about 30 fps.
:)
Don't want you to think I was getting 100 fps in D3D.
#30
02/19/2003 (3:53 pm)
Mark, my current Windows OpenGL info does not list those ARB extensions; neither do the Linux drivers from around the same date. There might be beta drivers I don't know about. As it stands, Cg 1.0 installed on my Windows ATI machine only works for DirectX 8 and 9, not OpenGL. Up till the 1.0 release it compiled to NV extensions only; it claims to compile to ARB version 1.1.
-brad
#31
02/19/2003 (4:02 pm)
ARB_vertex_program is supported in hardware on Geforce 3 and up, and Radeon 8500 and up. You'll need the very latest drivers for ATI. I *think* the latest end-user ATI driver has this exposed. I haven't checked, though. If not, snag one from their developer site.

ARB_fragment_program is only supported in hardware on Geforce FX and up and Radeon 9500 and up.

If you have access to an nVidia card, you can play with both on lesser hardware with the NV30 emulation mode.
#32
02/19/2003 (4:19 pm)
yeah, I have a Radeon 9700, and on the nVidia end it's a g4. I use the 'next-gen' emu on the nVidia, and of course everything works under Cg, being that it's nVidia... heh. Meanwhile, on the ATI box I'm running next-gen shaders without emu =D

Cg has no profile target on the ATI for pixel shaders, hence my original comment 'there are no pixel shaders in opengl'. I should have said 'solution'. I've used the vertex profile, but it's very, very buggy.

If I can figure out what's up with the ARB extensions (probably drivers), I know that in theory it's possible to manually use GPU code, but that isn't an 'art path' solution. I still haven't found an example of this; every shader example I see uses Cg, DX9 FX, or DX8. I've never seen a native OpenGL one that didn't use NV extensions instead of ARB (even the one you pointed out uses Cg as a compiler/runtime).

Sorry for being snippy; I've gotten a number of responses to this. I even had one guy on Gamasutra tell me to 'use glslang, duh'. heh.

regards,
brad
#33
02/19/2003 (9:08 pm)
I installed newer drivers I found, and verified that the ARB vertex and pixel shaders work; they did not work with my November-dated drivers, and still don't work with my Linux drivers, however. I apologize for my statement that this hadn't been implemented by ATI. I maintain, however, that there is no HLSL available for a cross-platform solution, as Cg only has a profile targeting the ATI for vertex shaders... which is frustrating.

Oh, and also: the example on codesampler.com contains no references to, or runtime of, Cg. I stated earlier that it did.

-brad
#34
02/20/2003 (5:28 am)
Bradley, look again. Cg 1.0 absolutely has profiles for ARB vertex and fragment programs. Maybe you're using an older release?
#35
02/20/2003 (7:57 am)
Just to keep this thread away from the original subject of performance, I'd like to point out that NVIDIA's ARB extension support isn't always that good when there is a comparable NV extension. For example, on the NV30 I'm seeing problems (usually all-white materials) when compiling Cg vertex shaders to the ARB profile that disappear when I use the NV profile. Just to add annoyance, it works fine on NV25 hardware.

There are times when I wish I was writing console applications.
#36
06/02/2003 (11:33 am)
As far as that gl2d3d layer... it's actually not that big a deal. Profiling of opengl2d3d shows that it performs slightly faster than opengl32.dll, which is only semi-surprising considering that opengl32.dll is a more complete GL 1.x implementation. Besides which, in Windows, DirectX is at the lowest level, so your OS has its own opengl2d3d layer in there.

Our team actually ported Torque to DX9 -- still using the gl2d3d, but it has DX9 code in there now. However, the gl2d3d layer is only there for parts of the code where we haven't gotten around to writing real D3D render routines (which are called in renderObject if the display driver is D3D), or wherever it wasn't showing real necessity. Right now, very few objects have real D3D render routines written in (basically TSMesh (done, w/ shaders), Sky (done), and Terrain (in progress)).

Surprise, surprise, the D3D rendering is still slow as all hell. In fact, in a level with no terrain, no objects, and only a skybox, the D3D version still gets only 60% of the framerate (on an NV28; around 75% on an R300).
Torque Owner Bradley Newton Haug
also, pixel shaders don't currently exist as a standard; the new extension is for vertex shaders only. ATI and NV both have their own versions of several things like pixel shaders, etc. NV at least incorporates its extensions into the standard extension system; ATI gives you headers and a lib with its name all over them.
other than that it's dandy =D
-brad