Is the PhysX demo GPU- or CPU-driven? Is it changeable?
by deepscratch · in Torque 3D Professional · 12/05/2010 (4:12 am) · 12 replies
Title says it all: I want to know if PhysX runs on the GPU.
Anyone know?
Ta
#2
12/05/2010 (2:04 pm)
In TGEA I had a switch to run PhysX on hardware, and it worked. I haven't messed with PhysX since TGEA, so I don't want to redo the switch if it's already done. Thanks Henry, you've been an awesome help everywhere today!!!
#3
12/05/2010 (2:50 pm)
Ooo, OK. HW is disabled by default. I've added my HW/SW switch, and it now shows running on HW. Going to run some tests and check the performance differences.
#5
12/05/2010 (5:40 pm)
I think the big performance improvements show up when you actually get thousands of physics objects in the scene, i.e. when it would normally slow the CPU to a crawl, it'll keep going. But either way, good to hear it works; going to try it out myself later today. =)
#6
12/05/2010 (5:58 pm)
This is probably why I noticed that my Nvidia card isn't taking over in Beta 3 like it did in Beta 2...
#7
12/05/2010 (6:05 pm)
Mine doesn't start whining either, Steve (GTX 295), so even though it says it's HW, I think it's not. Henry, if I post the switch code, would you maybe feel like seeing if it really works?
#8
12/06/2010 (12:38 am)
To be honest, the only method I know of for seeing if PhysX is in GPU mode is to open the Nvidia Control Panel, click "Set PhysX configuration", go to the 3D Settings menu at the top (which only appears when you're in that section of the panel), and check "Show PhysX Visual Indicator." When you load the PhysX demo it should render some text in the upper-left corner indicating which mode it's in.

That said, I'd be happy to do some crazy stress tests to see if it's actually doing anything helpful. I have a feeling there might be some overhead in the Torque object system (i.e. having 5000 Torque objects in a scene may itself be crushing the CPU, regardless of the physics work) that may put a bottleneck on how much the GPU can really improve performance.
#9
12/06/2010 (1:08 am)
Here are the two pxWorld files with the small changes; they drop straight in:
pxWorld.cpp
pxWorld.h
Interesting to see what results you can get. I'm doing 1000+ object experiments now, and there's a huge bottleneck. This should be done on the GPU, and it's definitely not, even with the switch in.
#10
12/28/2010 (5:44 pm)
Ha! Well, Nvidia say that only fluids and cloth are actually capable of being run on the GPU;
PhysX rigid bodies, no.
So I guess we're stuck with CPU-only PhysX until, hopefully, SDK v3.0.
So there you go.
#11
12/28/2010 (8:21 pm)
Yeah... GPU support in PhysX is not that great from what I found. I saw no reason to complicate things any further by having physics slow down the rendering performance of the card. Moving forward, with 16-core CPUs on the horizon, I think general-purpose GPU physics will go the way of the "sound card".
#12
12/29/2010 (12:12 pm)
Here is a related point of interest: if, like me, you purchased a new video card recently, and your old card is a PPU rather than a GPU (a PhysX processing unit, one of the older dedicated PhysX cards, you know), and you have spare video card slots on your board, you can put the old card back in next to the new GPU, and the Nvidia panel automatically assigns the old card as a dedicated PhysX processor.
Just don't connect anything to its outputs.

I have a GTX 295 and my old 8800 GTS in the same box now, and it seems to work, with other PhysX games at least. I'm getting a better frame rate and better graphics: the GTX deals only with graphics and the GTS deals only with physics.
Give it a try if you've got an old PPU card lying around (just be sure your PSU can handle the extra load).

Hey Steve, the two cards together make a god-awful whining sound now when pushed a bit!!
Torque Owner Henry Todd
Atomic Walrus
sceneDesc.simType = NX_SIMULATION_SW; // [9/28/2009 Pat] Why is this software? Should be software server, hardware client?
So that leads me to believe it's running in software. The comment sounds right to me: the server should probably run software in general while the client runs hardware, unless it's single-player, in which case both can be HW. Maybe this was set to always-software because they hadn't gotten around to detecting whether hardware support existed?

Couldn't hurt to try NX_SIMULATION_HW; worst case, it crashes or something. Best case, PhysX already knows how to detect whether it can use hardware, and you can just always leave this set to HW.