Game Development Community

Intel 82945g chipset integrated video?

by Arcanor · in Torque Game Engine Advanced · 11/30/2007 (7:04 am) · 9 replies

I'm trying to deploy my TGEA-based game to a computer with video based on the Intel 82945g express chipset (integrated on the motherboard). This chipset supports pixel shader 2.0 and vertex shader 3.0, as indicated in this snippet from the chipset datasheet:

High Performance 3D  	
	Up to 4 pixels per clock rendering
	Microsoft* DirectX* 9 Hardware Acceleration Features:
	Pixel Shader 2.0
	Volumetric Textures
	Shadow Maps
	Slope Scale Depth Bias
	Two-Sided Stencil
	Microsoft* DirectX* 9 Vertex Shader 3.0 and Transform and Lighting supported in software through highly optimized Processor Specific Geometry Pipeline (PSGP)
	Texture Decompression for DirectX* and OpenGL*
	OpenGL* 1.4 support plus ARB_vertex_buffer and EXT_shadow_funcs extensions and TexEnv shader caching

However, whenever the user tries to start the TGEA demo, it fails with the error "Failed to initialize direct3d". We have updated both the chipset graphics drivers and the DirectX runtime. What am I missing?

Is it even possible to run TGEA on an integrated Intel 82945g video setup?

#1
11/30/2007 (7:06 am)
Intel onboard graphics will not work.
While these chipsets have the certification, I fear their only purpose is "Vista Premium" capability, not general shader-driven gaming.
#2
11/30/2007 (7:09 am)
Yes, it is possible. You just need to modify the D3D initialization so it doesn't ask for hardware vertex processing. TGEA really should check for this and fall back... there are LOTS of these video boards around.
#3
11/30/2007 (7:22 am)
@Marc Schaerer: I feared as much. Thanks for the confirmation.

@Prairie Games: Thanks for this valuable tip. Can you offer any clue as to how I would do this? I've searched and didn't find any existing resources that dealt with implementing such a fallback, and unfortunately I'm not a graphics programming guru.
#4
11/30/2007 (7:40 am)
Is this as simple as going to gfxD3DDevice.cpp, init(), and changing:
deviceFlags = D3DCREATE_HARDWARE_VERTEXPROCESSING;

to:
deviceFlags = D3DCREATE_MIXED_VERTEXPROCESSING;

Will this slow down performance on hardware that DOES support hardware vertex shaders?
#5
11/30/2007 (10:20 am)
Yes, because mixed mode does not run purely in hardware; it can use both paths.

I would get the device caps and initialize it correctly.
Here is an informative thread on that:
http://www.gamedev.net/community/forums/topic.asp?topic_id=348335&whichpage=1
#6
12/01/2007 (3:56 am)
Thanks for the tip Marc.

What I'm seeing is that the performance is unusably slow when in MIXED mode, i.e. 100 to 300 times slower. Even on the splash screen with no 3D objects at all being rendered I'm seeing frame rates of one frame every two or three SECONDS.

Is there any way to improve the software vertex processing speed when using D3DCREATE_MIXED_VERTEXPROCESSING?
#7
12/02/2007 (6:37 am)
No, not without writing support for the pure fixed-function pipeline (and T&L) into the GFX driver.
#8
12/02/2007 (8:05 am)
Check the Intel docs for this... I believe you want to specify software vertex processing to the driver in this case. The TGEA demo ran OK on my Mac mini (Boot Camped to Vista), which uses integrated Intel video. The boards suck, but they aren't THAT bad... and you really don't want to close this market out of buying your game.
#9
12/02/2007 (10:12 am)
The demo does not use much at all, so it's not hard to keep its frame rate up.
The demo runs at 140-280 FPS on my 8800 GTS, 2x3 GHz, 2 GB RAM system, while a simple outdoor scene with legacy terrain and the foliage replicator, a few DIFs, plus water does not get above 50 FPS.

So I would not assume that software vertex processing will result in anything higher than 5 FPS at best.
Truevision3D is definitely much more desirable as a rendering solution if the onboard market is a desired part of your target market... (it runs a scene similar to the MS4.2 demo on an Intel GMA900 as well as TGEA did back then on the above-mentioned system!)