Nvidia Optimus - odd & grrr & sigh & hmmph
by TheGasMan · in General Discussion · 07/25/2011 (3:34 am) · 28 replies
Noted, as per the image below:
1. T3D will no longer find the GT555 no matter what I try
2. The GT555 is clearly working
3. Nvidia panel says: GT555 is card being utilized

..the graphics in-game show as a mix of low and high settings; some trees render at low res, some grass at mid res
..so, wonky. Specs below.
win7 64
Q2630 4core/8threads
6GB ram
IntelHD/GT555 Optimus
- Everything is up to date, other games work well, T3D worked correctly for a few runs, then defaulted to Intel again.
About the author: gameartstore.com
#2
07/25/2011 (2:44 pm)
Oh... just realized my laptop here is also one of these "switchable" GPU models. What I found is that if it's set to "auto select" mode in the NVidia control panel, the driver under the hood will decide which device to pick.
If I forced it to the high-end device, I got a choice of the Intel and NVidia devices... but I couldn't select the Intel one (I got no 'Apply' button). I suspect T3D is seeing them as a single device and doesn't know they are different for some reason... and that I would call a bug to fix in T3D 1.2.
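The "single device" suspicion above is easy to test outside the engine. Under Optimus in its default mode, a plain D3D9 enumeration typically reports only the Intel IGP; the driver hides the GeForce behind it, so there is no second adapter for T3D's device dialog to offer. This is a minimal sketch for checking that, not T3D's actual enumeration code, and it is Windows-only (stubbed out elsewhere):

```cpp
#include <cstdio>

#ifdef _WIN32
#include <d3d9.h>
#pragma comment(lib, "d3d9.lib")

// Print every adapter D3D9 reports and return how many there were.
// On an Optimus laptop left in "auto select" mode this usually prints
// a single adapter (the Intel IGP): the GeForce is hidden behind it,
// which matches what the T3D device dialog shows.
unsigned listAdapters()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d)
        return 0;

    unsigned count = d3d->GetAdapterCount();
    for (unsigned i = 0; i < count; ++i)
    {
        D3DADAPTER_IDENTIFIER9 id;
        if (SUCCEEDED(d3d->GetAdapterIdentifier(i, 0, &id)))
            std::printf("Adapter %u: %s\n", i, id.Description);
    }
    d3d->Release();
    return count;
}
#else
// D3D9 is Windows-only; on other platforms this sketch is a stub.
unsigned listAdapters() { return 0; }
#endif
```

If this standalone check also reports one adapter while the NVidia panel is in "auto select", the limitation is in the driver's enumeration, not in T3D's device code.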
#3
07/25/2011 (8:43 pm)
In regards to T3D: if I force the NVidia, I still get Intel... as you can see in the image (GFX NULL). However, as you can see, I've dealt with this before, but I can't be bothered to do this weekly:
www.garagegames.com/community/forums/viewthread/126390
..if I find/see anything worth knowing, I'll stop back and post here.
If you guys can sort this out for 1.2, that would be stellar.
Thx Tom
P.S.
[IYDK Dell and Asus are selling Optimus based laptops at alarming rates..]
#4
07/26/2011 (2:21 pm)
@E.B. I went ahead and logged this under ticket THREED-2323.
#6
07/26/2011 (2:43 pm)
Not at this time but we will ask you if we do. Thanks.
#7
11/22/2011 (1:31 am)
[Not fixed in 1.2]
Recommended settings forced to low via the automatic settings control.
This is not ideal, guys.. Optimus has been flooded into the market by many manufacturers.
- This needs a big bump and to be a top-notch bug to be fixed for 1.3 or sooner.
#8
11/22/2011 (8:58 am)
@Gas - Did you try this on an unmodified version of 1.2? There are both C++ and TorqueScript changes in 1.2 which resolve this problem.
#9
11/22/2011 (9:26 pm)
I've tried all 1.2 templates/examples. (System specs are in the initial post.) What system type were you using, Tom?
I'll take a stab at writing a fix if I know of a system (one that works with your changes) with which to compare mine.
#11
02/15/2012 (4:00 am)
This is happening to me too. Latest NVidia drivers (GeForce GT 540M), and the Intel HD Graphics (D3D9) device keeps getting selected. Has this been resolved yet? I am running a fresh install of 1.2.
#12
02/16/2012 (5:25 am)
I never had the time to look into this, and I've moved to a different package until T3D gets "sorted" across the board, if ever. Sorry that I am no help in this post, Andy.
#13
03/29/2012 (5:10 pm)
Bump!
Is this going to be fixed?
<dummySpitMode>
I don't want a desktop computer and I won't buy one, mkay? I want to be able to use the nVidia card I paid extra for in my laptop, please. I don't want to use that crap Intel HD D3D9 driver, thanks.
</dummySpitMode>
#14
08/09/2012 (2:42 pm)
Hey, just got a brand new laptop shipped to me two days ago. Installed T3D 1.2 (fresh package) and started up the demo. However, I am experiencing the same problem as most of the people in this thread. Here are the specs for my laptop:
> Windows 7 Home Premium 64-bit
> 3rd Generation Intel i7-3610QM 2.3Ghz Quad Core
> Intel HD Graphics 4000
> Nvidia GeForce GT 650M Graphics with 2Gb GDDR5 memory
> 8Gb DDR3 1600 memory
> Dual 750Gb 7200Rpm Hard Drives
> 32Gb mSSD Hard Drive Cache
I paid a whole lot of money for this new laptop and then came and bought a bunch of content packs and things from GarageGames, all excited to begin working with T3D. However, whenever I open up T3D it will not give me the option to run it with my Nvidia graphics card; it only has the option to run with my Intel graphics. The Intel HD 4000 still looks nice on high settings, but there is a noticeable amount of lag, and if I were to start building decently large levels, I can see that being a big problem eventually. I would really like to use my Nvidia 650M graphics card but cannot for the life of me get it to work. I have even opened my Nvidia control panel and set T3D's .exe to work with Nvidia instead of the Intel graphics, but that didn't even work. Windows is up to date, DirectX is up to date, and all of my drivers are up to date. Does anyone know if this problem has been fixed yet?
#15
08/19/2012 (2:53 pm)
Hi everyone,
I am writing this here, unfortunately, because I bought a license to 1.1, updated to 1.2, and, more than a year later, GarageGames still has not come up with anything to make Torque 3D functional on laptops with two GPUs.
I mean, c'mon, you can't leave an issue like this behind and still market your engine on the assumption that developers necessarily work on desktop computers and laptops do not run games - this is 2012, and such mobile GPUs are more than enough to run most current games and engines like Unity and Shiva.
Even though I have moved to GMS and Shiva - in part due to their support, mind you - I would really like to develop an FPS with Torque 3D. I paid good money for my 1.1 and 1.2 licences and for Torsion, and my system is on par with the declared hardware requirements, but I cannot, due to this GPU recognition bug that is not mentioned anywhere on the product page. Developers, is there any chance of this being fixed? If so, when?
Needless to say, the way things are right now, it seems really unlikely - impossible, as a matter of fact - that I'll update to 1.3 if this is not fixed without extra charges. Please do not let your user base feel silly for investing in T3D, and do not leave us with the impression that we are talking to a brick wall.
#16
08/19/2012 (5:59 pm)
@Pedro: my thoughts exactly. And apparently several others share these same thoughts too...
I know one thing for sure: I will NOT be purchasing any more products from GarageGames until they add support for laptops with dual graphics, period.
#17
08/20/2012 (10:25 am)
We will have to work with NVidia specifically to create a Torque 3D profile. However, there is no guarantee that any final game made in Torque 3D will be covered by that profile, since it may diverge from the core profile during development. Adding in major resources like AFX, the 3D Action Adventure Kit, or the shader resources could significantly change the profile. This is speculation from reading a large number of posts on the NVidia forums over the weekend discussing this issue. It seems that bug patches can change the profile if they make any sort of graphical changes, such as supporting new resolutions, etc. Frank's also posted a number of development links in the other thread. If we were a binary-only engine, it would be a lot easier, since that is similar to shipping a single binary product.
I'll check with our NVidia rep, but it may not be a simple engine solution so much as something that developers need to submit to NVidia once their game is complete.
#18
08/20/2012 (10:34 am)
It's very likely that this would come out just like the Symantec issue, as they most likely base the profile on MD5 hashes. If that's the case, then the profile submitted would work only for the stock engine out of the box; rebuild it once (even if no code changed) and the profile is immediately invalid.
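For what it's worth, NVIDIA's "Optimus Rendering Policies" application note documents an export-based alternative that sidesteps the hash/profile problem entirely: the application declares, inside its own .exe, that it wants the discrete GPU. A sketch of what that could look like in a T3D build - hedged, since it requires R302-era or newer drivers, the export must live in the executable itself (not a DLL), and I haven't verified it against the stock 1.2 source:

```cpp
// Exporting this symbol from the game .exe (documented in NVIDIA's
// "Optimus Rendering Policies" note) asks the Optimus driver to run
// the process on the discrete GeForce GPU. Because the driver keys
// off the export rather than an MD5 of the binary, rebuilding the
// engine does not invalidate it the way a driver profile would.
#if defined(_MSC_VER)
extern "C" {
    __declspec(dllexport) unsigned long NvOptimusEnablement = 0x00000001;
}
#else
// __declspec(dllexport) is MSVC-specific; on other toolchains the
// variable still compiles but is not exported, so this is a no-op.
extern "C" unsigned long NvOptimusEnablement = 0x00000001;
#endif
```

Setting the value to 0x00000000 instead leaves GPU selection to the driver's normal policy, so users could still override it per-exe in the NVidia control panel.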
#19
08/20/2012 (10:40 am)
@DMB: plz mention this obvious notion to them. thx.. There should be a better system on Nvidia's end of this.. profiles NEED to be user-creatable. Users need to be able to fix these issues within minutes.
@Scott; I would hope that the Nvidia people are not that stupid. :D
#20
08/20/2012 (11:39 am)
Hello, everyone.
Thank you all very much for your replies - it's good to know this is, indeed, an issue more users are stuck with.
David and Scott, thank you for your fast reaction to this discussion. It's certainly reassuring to get feedback from the developers, so I am also more optimistic about the prospect of getting something done in T3D.
Is it realistic of me to expect news on this, perhaps some fortunate development of GarageGames contacting Nvidia, or should the affected users, myself included, contact Nvidia directly and request a functional T3D profile?
Associate Tom Spilman
Sickhead Games
I can't say I've tried Torque on one of those... I'm sure it is using the GeForce GPU, but it's just reporting it wrong in the options dialog.
I don't even know if games can pick which one to use... it may just spin up the GeForce when it detects a heavy graphics load.