Intel Core 2 Duo Pixel Shader 3.0 Support?
by Surge · in Torque Game Engine Advanced · 01/12/2007 (4:50 am) · 23 replies
Good day folks,
My 1.7 GHz P4 that I'd been doing all my work on forever bit the dust.
I have replaced it now.
-------------
Intel DG965WH motherboard
1.86 GHz Core 2 Duo processor
2 GB of RAM
It has onboard graphics (a first for me, actually):
Intel G965 Express chipset
-------------
When I look this board up on the Intel site, it seems its onboard graphics should support Pixel Shader 2.0 - 3.0.
I have installed the latest DirectX 9.0c: first the June 2006 version, and now the December 2006 update.
I went to the Control Panel, double-clicked the DirectX icon, and ran the test. It ran through DirectX 7, 8, and 9 and all passed, so I didn't think it was DirectX.
Beyond the CD that came with my motherboard, I searched the Intel site and found one update to the video driver, so I installed it.
Whenever I boot my TGEA (TSE) builds, it now says:
Failed to initialize Direct3D! Make sure you have DirectX 9 installed, and are running a card that supports Pixel Shader 1.1.
Can I get some advice?
Would anyone be able to say if it's the onboard graphics?
Or would you say something is wrong with my DirectX install?
Is it necessary to purchase a new video card, or should my new Intel motherboard be able to handle the TGEA (TSE) builds?
Just to add to this, even the TGE builds seem a little choppy.
I switched from OpenGL to D3D and the screen is still choppy.
Graphics card? Drivers?
I miss my TSE!
-Surge
#2
01/12/2007 (10:57 am)
First, TGE D3D is crappy, so its choppiness isn't surprising :) Based on the G965, you should in fact have PS 3.0, though it's listed as the X3000 or something, so maybe it's driver related.
Also, the December 2006 update has some issues within the SDK, and I would expect the runtime to as well.
#3
01/12/2007 (12:27 pm)
Ok, so I got a chance to look around a bit more, and the problem I mentioned earlier shouldn't exist for the G965, according to this: www.intel.com/support/graphics/sb/CS-011910.htm. The MacBook I was playing with has a GMA 950 (this Wikipedia link shows how the GMA and G naming schemes relate), which doesn't support hardware vertex shading.
#4
01/12/2007 (1:12 pm)
Thanks for the info, guys.
@Jeremiah,
I had assumed having PS 3.0 support meant I would be able to run my TSE builds.
I think the onboard won't support Shader 1.x, if that makes any sense at all?
@Jon,
Thanks for the link and the wiki info.
I had been all over the Intel site trying to piece this together.
So even though it says it has PS 3.0 support, it's not actually available yet?
And the support is for Shader 2.0 - 3.0 with none for 1.x, am I getting that correct?
So strange, eh.
It seems I'll be out buying a video card this weekend.
Thanks,
-Surge
#5
01/12/2007 (1:19 pm)
Never got my hands on one of those chipsets, but I think you cannot create a pure D3D device on them (which is what TSE release builds try to do), since they'll fail to report VS support and hardware T&L. Not sure what changes are involved in getting TSE running on those.
#6
01/12/2007 (3:04 pm)
Hmm, it actually looks like there are two models for the G965 video, as noted by the Wiki link Jon provided: the GMA 3000 and the GMA X3000. The X3000 is fully SM 3.0 compliant, whereas the 3000 is not, with no hardware T&L or VS support.
byte.gdamn.com/DXCapsViewer.exe
Hopefully that will work for you. It is the Caps Viewer that comes with the DirectX SDK, and you can actually see the various specifications that are being reported.
If you go under DirectX Graphics Adapters -> Your Card -> D3D Device Types -> HAL -> Caps, you will find most of the general information about the shaders.
#7
01/12/2007 (4:35 pm)
O.k., I understand a little more. Like usual, it's drivers, and from what I can tell they haven't been written yet.
The current drivers that seem to be available are:
14.24, which has zero (0) support for T&L, PS, or VS!?!
14.25 will still have no T&L support (what?!?), but I got my hands on a beta build - TGE works better.
14.26 will have support for PS, VS, and T&L. (Not yet made.)
14.27 will have full hardware support. (Which could be end of February.)
I found a few techy discussions on it which led me on to other links.
I've never been an advocate of onboard video, but I was interested in seeing what was out there.
It's interesting for me, because typically I would have bought an ATI card.
I managed to find the 14.25, which fixed my TGE problem, so TGE is running smoothly now.
So I might just wait for February, and if things don't change, I'll grab a new card.
I have 3ds Max work I could do in the meantime.
This has helped clear things up at this end.
Thanks,
-Surge
#8
01/12/2007 (4:41 pm)
Does it have a PCIe x16 slot for video? This is that new chipset made for Micro ATX motherboards. Aw well, too bad it was not the HW model "X".
#9
01/13/2007 (12:40 pm)
Yes, it came with a PCI Express x16 connector. I also like the High Definition Audio for 7.1 surround sound.
Now if only all the drivers were done for it, I'd be set.
#10
02/26/2007 (5:46 pm)
// Required D3DX9 DLL version
// Depending on the SDK update you are using, there will be a different version
// of the DLL that is required. December 2005 is d3dx9_28.dll
#define D3DXDLL_VER 28
Look in the Windows system32 folder and see if you have a file called:
d3dx9_28.dll
If not, you need to install a different version of the DirectX 9 runtime. You should be able to install multiple versions, and things will just sort themselves out.
This is why most commercial games ship with an installer that installs DirectX whether you ask them to or not. They need to be sure you have the exact version they were written to work with.
Good luck!
#11
02/27/2007 (3:31 am)
Check out this post; if it's TGEA related it might solve your problem with PS 1.x. If it's more generic then it won't: www.garagegames.com/mg/forums/result.thread.php?qt=58098
Dave.
#12
03/10/2007 (5:34 pm)
I have onboard graphics and posted a question a little while ago. I tried to rewrite some of the graphics initialization to work with it, but no luck so far: http://www.garagegames.com/mg/forums/result.thread.php?qt=56054
#13
03/10/2007 (5:41 pm)
Oh, by the way, the vertex shaders work fine and check out every time. It's the pixel shaders that bail on me in the engine. I ended up writing parts of my own engine that use Pixel Shader 2.0 and Vertex Shader 2.0, and it runs fine, so I think it's a stupid Vaio problem.
#14
03/11/2007 (8:25 am)
Alright, I finally got it to work on my laptop. Granted, the shaders are a little rough around the edges, but at least I can test and code on my laptop now. Here's how to do it if anyone cares:

void GFXD3DDevice::init( const GFXVideoMode &mode )
{
#ifdef TORQUE_SHIPPING
   // Check DX version here, bomb if we don't have the minimum.
   // Check platformWin32/dxVersionChecker.cpp for more information/configuration.
   AssertISV( checkDXVersion(), "Minimum DirectX version required to run this application not found. Quitting." );
#endif

   // Set up the enum translation tables
   GFXD3DEnumTranslate::init();

   mVideoMode = mode;

   // Create D3D presentation params
   D3DPRESENT_PARAMETERS d3dpp = setupPresentParams( mode );

#ifdef TORQUE_NVPERFHUD
   HRESULT hres = mD3D->CreateDevice( mD3D->GetAdapterCount() - 1, D3DDEVTYPE_REF, winState.appWindow,
                                      D3DCREATE_MIXED_VERTEXPROCESSING | D3DCREATE_MULTITHREADED,
                                      &d3dpp, &mD3DDevice );
#else
   // Vertex processing was changed from MIXED to HARDWARE because of the switch to a pure D3D device.
   // If this causes problems, please let me know: patw@garagegames.com

   // Set up device flags from the reported caps.
   U32 deviceFlags = 0;
   D3DCAPS9 caps;
   mD3D->GetDeviceCaps( D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps );

   if( caps.DevCaps & D3DDEVCAPS_HWTRANSFORMANDLIGHT )
      deviceFlags |= D3DCREATE_HARDWARE_VERTEXPROCESSING;
   else
      deviceFlags |= D3DCREATE_SOFTWARE_VERTEXPROCESSING;

   // If a pure device and HW T&L are supported, request a pure device.
   if( ( caps.DevCaps & D3DDEVCAPS_PUREDEVICE ) &&
       ( deviceFlags & D3DCREATE_HARDWARE_VERTEXPROCESSING ) )
      deviceFlags |= D3DCREATE_PUREDEVICE;

   // Add the multithread flag if we're a multithread build.
#ifdef TORQUE_MULTITHREAD
   deviceFlags |= D3DCREATE_MULTITHREADED;
#endif

   HRESULT hres = mD3D->CreateDevice( D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                      winState.appWindow, deviceFlags,
                                      &d3dpp, &mD3DDevice );
#endif

   //regenStates();

   // Gracefully die if they can't give us a device.
   if( !mD3DDevice )
   {
      Platform::AlertOK( "DirectX Error!", "Failed to initialize Direct3D! Make sure you have DirectX 9 installed, and "
                         "are running a graphics card that supports Pixel Shader 1.1." );
      Platform::forceShutdown( 1 );
   }

   // Check up on things
   Con::printf( "   Cur. D3DDevice ref count=%d", mD3DDevice->AddRef() - 1 );
   mD3DDevice->Release();

   mTextureManager = new GFXD3DTextureManager( mD3DDevice );

   // Now re-acquire all the resources we trashed earlier
   reacquireDefaultPoolResources();

   // Set up default states
   initStates();

   //-------- Output init info ---------
   mD3DDevice->GetDeviceCaps( &caps );

   U8 *pxPtr = (U8*) &caps.PixelShaderVersion;
   mPixVersion = pxPtr[1] + pxPtr[0] * 0.1;
   Con::printf( "   Pix version detected: %f", mPixVersion );

   bool forcePixVersion = Con::getBoolVariable( "$pref::Video::forcePixVersion", false );
   if( forcePixVersion )
   {
      float forcedPixVersion = Con::getFloatVariable( "$pref::Video::forcedPixVersion", mPixVersion );
      if( forcedPixVersion < mPixVersion )
      {
         mPixVersion = forcedPixVersion;
         Con::errorf( "   Forced pix version: %f", mPixVersion );
      }
   }

   U8 *vertPtr = (U8*) &caps.VertexShaderVersion;
   F32 vertVersion = vertPtr[1] + vertPtr[0] * 0.1;
   Con::printf( "   Vert version detected: %f", vertVersion );

   mCardProfiler = new GFXD3DCardProfiler();
   mCardProfiler->init();

   gScreenShot = new ScreenShotD3D;
}

I basically just get the caps to see if hardware T&L is supported; if it's not, I create a software vertex processing device. Then, if hardware T&L is supported, I create a pure device so it will work with better computers. Hope this helps.
Adam
#15
03/12/2007 (12:38 pm)
Just some general advice for those of you buying new machines: don't buy them with integrated graphics! They suck; they are not meant for real-time 3D development. Do yourself a favor and shell out the extra money for laptops and PCs that support discrete cards.
#16
03/13/2007 (7:33 am)
Agreed, I'm really disappointed with my laptop, but I didn't have time to shop around because I needed a computer that day. Poor planning on my part, but it gets the job done for now. In another two years I'll upgrade to a better one.
#17
03/13/2007 (8:59 am)
If you develop with TGEA, four months is more realistic than two years, because you won't get much done unless you selectively test each shader on its own, test its capabilities as a non-realtime rendering engine, or just do not use any additional shader effects at all. But in that case the question would be why use TGEA if you do not get much more out of it than with TGE. I'm working on a 7600 Go Core Duo notebook, so I know the problems that arise from this situation.
My failure was that I did not research enough and believed in the name faking... because the 7600 Go is really only a 7300 GT, not a modified 7600. Which means 5 VS units and 8 PS units (which are capable of 16 shader ops per clock tick).
Sounds good in theory. In reality it's a little different. The MS4 demo with the self-shadowing ran at best at 50 FPS, and that was a small sandbox situation... Add in water and Atlas2 and you are at 25 FPS at best.
Not anywhere near useful for testing purposes, so I'll get a desktop system when Intel launches the Q6400 and does the mentioned price reduction on the Core 2 series as well.
#18
03/13/2007 (9:23 am)
The only solution for my problem was what Brian suggested: I bought an awesome ATI card last weekend, and TGEA runs super fast on it.
Now I'm researching shaders like it's going out of style.
#19
03/13/2007 (5:20 pm)
@Marc - that 7600 Go may not be the best, but I'm pretty sure it's running waaay faster than an integrated solution. At least you can create simple missions and do some work/testing of gameplay there. I.e., you probably don't always need water and self-shadowing on to test your game, and 50 fps is fairly usable. If it were your only dev station, though, that would be a pain.
#20
03/14/2007 (8:38 pm)
Using the TGEA water demo as a testbed for my integrated graphics, I get between 25-50 fps depending on the amount of geometry in view. The average looks to be around 33 fps, which is way better than I thought I was getting. Again, though, the shaders are pretty rough looking because they are being processed in software instead of on the GPU. I'm tempted to sell my laptop and upgrade to something with much better graphics. The thing is, I mainly use my desktop, which has an Nvidia 6800 GT that does a really good job rendering, so I will probably just deal with the poor graphics performance until later on.
The main thing is that anyone who reads this can learn from my mistake. My desktop had some major problems and I needed a computer that day to get some major things done. So I went to about six different stores and none of them had any decent laptops for games. I ended up buying the best one I could find that day. Moral of the story: don't put yourself in that situation. If I had had time, I probably would have ordered one from Alienware. Turns out my desktop's CPU fan was broken and my GPU was overheating all at the same time, which is why the computer would boot up and then crash immediately. I ended up being able to fix my desktop myself a few days later.
Jon Wilsdon