Game Development Community

DX10 fun... What's to come??

by Phil Carlisle · in Torque Game Engine Advanced · 11/10/2006 (3:49 am) · 41 replies

Is it just me, or is anyone else excited about the DX10 potential, now that real life cards are available?

Reading the specs on the 8800 and the SM4 stuff, it just seems like a whole new world is opening up.

Not that I'm suggesting indies necessarily go down that route, but it DOES open up potential for us IF we care to go down the "high end tech" kind of route.

For instance, well, instancing :) That opens up a world of really high-quality RTS-style stuff.

Or the increase in shader constants, which lends itself to a load of things.

Then there are the possibilities of things like truly tessellated deformable water with the geometry shader.

So many things. Of course it won't actually make making games any easier. But it does open up some new ground (I think), much more than any previous push in technology since the introduction of shaders, I guess.

At least for those of us interested in graphics research it's a nice time, what with quad-core CPUs and high-end SLI-enabled 8800 graphics cards. I could easily spend the money that would buy me a small car on a PC!

Hey.. next week I'm at a conference with Ken Perlin! so I'm kind of revelling in graphics-ness right now! :)

Phil.
#21
11/18/2006 (3:58 am)
@Jonathan

Vista was using around 700MB idle, on a fresh install with nothing running. Windows XP uses 423MB for me right now with quite a few programs running, including Visual Studio.
#22
11/19/2006 (2:12 pm)
Beta 2? Or RC1? I never used Beta 2, but RC1 is only eating in the upper 300s. I'm going off what Task Manager says, and there's a good chunk paged out at this moment, but still nothing near 700MB.
#23
11/19/2006 (11:56 pm)
I used RC1. My colleague reports something similar; his was around 500-550.
#24
11/21/2006 (6:47 pm)
(Phil)
>>Because all my vid cards are ATI now (they give me freebies every now and then).

Heh, that's funny :-) and here's why: my wife works for ATI (well, now she works for AMD, too :-) and I won't have any of their video cards in my computers. I don't have that kind of time available... *rollseyes*
#25
11/22/2006 (1:44 am)
Hehhee... To be fair, I think ATI were pretty poor a few years ago. But since the 9800 I've become a convert. I was having problems with it at first, then I upped the voltage on the bus and the card has worked perfectly ever since.

Which leads me to conclude that instability issues with ATI cards aren't inherent in the design, but are a product of configuration.

I know a good number of games companies that are ALL ATI-based. This also suggests that ATI can be competitive. Not that I'm trying to sing ATI's praises; I just think that recent ATI history has turned the corner for me in my perception of them.
#26
11/22/2006 (4:10 pm)
I'm ATI as well; never had any kind of trouble with my 9800XT, and it still works like a charm.
#27
11/22/2006 (5:45 pm)
Ewww, ATI has cooties.

nVidia <--- a real video card, one without the AMD cooties

To be honest, I like the color green more than red, nVidia always has cooler tech demos, and I absolutely despise Catalyst (yet nVidia goes with the control panel, which is almost like Catalyst, but thankfully not so San Francisco).
#28
12/06/2006 (4:47 am)
It seems to me as if everyone is looking at DX10 as just another prettier face... now, while this may be understandable for the end users of the new DX10 games, I would imagine that developers might want to look a little deeper into the subject before forming their opinions...

the new eye candy is just one aspect the developer/coder should be considering...

not being anything of an expert, or having much hands-on experience, I'm gonna refer you to a video where someone who has had a little more hands-on time goes into a bit more depth on what DX10 has to offer... this will give a little insight into some of the more compelling (don't ya just love that word when used in a marketing context) reasons to take DX10 as being more than just something new to look at...

first, a compelling eyecandy shot...

www.nzone.com/docs/IO/36967/screenshot3.jpg
from the upcoming version of FPSC... yes, that cute tool, largely dismissed as a click-together game maker, by the people who make DarkBASIC... they've shown some real insight as far as this goes, and regardless of the outcome of their efforts, it's a noteworthy step forward...

now for the compelling video :)

seriously... there's some interesting information in here as to what dx10 is really all about... this *IS* worth watching through... files.thegamecreators.com/newsletter/TGC_8800.zip

--Mike
#29
12/06/2006 (5:57 am)
Lee is a great guy and does fantastic work. It goes to show where DX10 can give a good win, though. Lots of stuff that's actually pretty new; I don't think Lee shows too much of the new potential there, although I did like the character stuff.

I'm kind of hoping for more "out there" uses for the new tech, though; basically what you see in that video is merely extensions of existing techniques in order to utilize the card more.

Procedural content especially is now more than ever an interesting technique.
#30
12/06/2006 (4:29 pm)
I find Hellgate: London's "rain fins" to be a rather creative use of the new tech. Without the hassle of adding invisible geometry during the creation of objects, they're able to use geometry shaders to add polys to the edges of objects and then dynamically texture them so that it appears as though rain is splattering on a wall or on the top of a car. Sure, you could hack in something like rain fins, but then you have to consider it during art asset creation or take a hit on CPU performance to do the work. DX10 takes the hit off the CPU, and that's what's so great.

However, I'm steering away from DX10 because OpenGL supports geometry shaders on any operating system. So DX10 can kiss it. Of course, porting TGEA to OGL is probably going to be harder than DX10, and is definitely beyond my capabilities.



Someone really should take on the addition of OGL to TGEA as a pack project. GG could focus more on the core of TGEA and not have to worry about porting, and someone would make some money.

I'd gladly dish out $100-150 for OGL or DX10 additions to TGEA.
#31
12/11/2006 (12:00 am)
It would be awesome if GG surprised us all and announced that TGEA is DX9c AND DX10 ready.
#32
12/11/2006 (2:42 am)
As long as they don't make it DX10-only. Otherwise I'm going to cry.
#33
12/12/2006 (4:30 am)
Word is they're skipping DX9.x and DX10 and going right for DX11.
#34
12/12/2006 (6:12 am)
Heh. And where have you read this? ;)

That FPS Creator demo was neat.
#35
12/12/2006 (6:38 am)
I think it's a joke :P
#36
12/12/2006 (7:11 am)
;-) ;-)
#37
12/30/2006 (6:57 pm)
The real reason to be excited about DX10 is mainly the capability to do procedural geometry divorced from the CPU. Personally, I've been researching and waiting for procedural games to become more feasible in terms of graphics for a long time, and I think DX10's ideas in general are a step in the right direction.

I haven't heard anything about extensions for OGL or support for geometry shaders in it. Anyone care to elaborate on that?
#38
12/30/2006 (7:33 pm)
Yep, geometry shaders are there for OGL. Currently it's a vendor extension (nVidia, of course, as they're the only ones with capable hardware). I'm sure it'll become more standardized a couple of months after ATI catches up.

If you want to mess with it, you go to nVidia's website and download the extensions. Originally you had to force your card to emulate support for it, but the current 94.77 drivers no longer require a capable card to emulate (out-of-the-box support, in other words).

There's a little project out there somewhere of Bézier curves on the shader, but the zip download is empty. I've been trying to get up to speed on OGL, but I'm definitely not ready to start messing with shaders.
#39
12/30/2006 (8:14 pm)
To elaborate a bit more on the OpenGL stuff, the day the 8800 was released, nVidia released well over a dozen extension specifications. Half are nVidia proprietary, the other half are multivendor. When ATI releases a card with similar capabilities they will be able to implement those OpenGL extensions in their drivers, if they so choose (they probably won't).

However, these extensions have a life expectancy of one year. In summer (read: SIGGRAPH) the Khronos Group is releasing the next major version of OpenGL, and in October 2007 they will release another major version of OpenGL which incorporates all of the latest D3D10 features. So, official OpenGL support for geometry shaders, texture arrays, etc. isn't coming for another year.

And then we developers get to deal with 3 different active versions of OpenGL (at least), all targeting different hardware and programming styles.

But ya, that's how the D3D10 features are going into OpenGL.
#40
12/31/2006 (2:21 am)
I think I'll stick with DX10 thanks..