Game Development Community

Problems with Nvidia Optimus and Torque3D 1.2?

by Weston Elliott · in Torque 3D Professional · 08/10/2012 (6:14 am) · 22 replies

I posted something similar in a rather old thread started 07/25/2011:
http://www.garagegames.com/community/forums/viewthread/127071

I decided to start a new thread about it here in this section of the forums because I think the Torque3D Private area will attract answers more focused on Torque3D.

Anyway, I just got a brand new laptop less than a week ago which I spent a small fortune on. Here are the specs:
> Windows 7 Home Premium 64-bit
> 3rd-generation Intel i7-3610QM 2.3 GHz quad core
> Intel HD Graphics 4000
> Nvidia GeForce GT 650M graphics with 2 GB GDDR5 memory
> 8 GB DDR3-1600 memory
> Dual 750 GB 7200 RPM hard drives
> 32 GB mSSD hard drive cache

While waiting for my new system, I excitedly purchased Torque3D 1.2 as well as several content packs and tools like Torsion. With a system as good as this, I figured I would be able to run Torque and create my worlds worry-free. However, whenever I open up Torque3D, it will not give me the option to run with my Nvidia graphics card; it only offers my Intel graphics. The Intel HD 4000 still looks nice on high settings, but there is a noticeable amount of lag, and if I were to start building decently large levels, I can see that becoming a big problem eventually. I would really like to use my Nvidia GT 650M graphics card but cannot for the life of me get it to work. I have even opened my Nvidia control panel and set Torque3D's .exe to run with the Nvidia instead of the Intel graphics, but that didn't work either. Windows is up to date, DirectX is up to date, and all of my drivers are up to date.

TheGasMan (the person who started the original thread in the General area of the forums which is linked to at the top of this post) had a very good point. Laptops with the Nvidia Optimus technology are flooding the market and most if not all the systems I looked at which used Nvidia had the switchable graphics feature. Does anyone know if this problem has been fixed yet? If not, are there any plans on fixing it? I can see this becoming a big issue in the future with so many systems being sold with the switchable graphics option...
#1
08/10/2012 (9:39 am)
Unfortunately, there is not really anything that can be done within Torque to fix this. Torque is doing exactly what it is supposed to do: it enumerates all available graphics adapters and presents them in the graphics options to choose from, and it uses that same enumeration to check whether the device chosen in the preferences file is valid, using it if so. If the operating system or graphics drivers aren't exposing the device through the standard enumeration callbacks, then something is wrong with the graphics drivers or with whatever handles the Nvidia Optimus switch-over.

Numerous problems with Nvidia Optimus have been reported in other games and engines, and the only known workarounds are to grab the latest graphics drivers from Nvidia's website and to go into the control panel and set up a profile so the game executable is run on the Nvidia device instead. If that doesn't work, then you're pretty much out of luck. Note that I've read of some cases where you have to run the game several times before it sees the Nvidia device, due to a bug in Nvidia's Optimus software.
#2
08/10/2012 (10:39 am)
In that case, you'd better try to optimize your T3D visual settings for the Intel HD 4000.

I also have an Intel HD 4000, and T3D runs smoothly. It only needs the proper settings.
#3
08/10/2012 (3:39 pm)
Well that stinks. I really haven't noticed any problems with other games and the Nvidia card within my system. It seems that Torque3D is the only program that cannot detect my Nvidia card. All the drivers are up to date as mentioned above and I set a profile for Torque3D from within my Nvidia control panel already. Nothing seems to work. However, every other game I have played on this laptop so far seems to detect the Nvidia graphics card just fine and without me even having to do anything. These games include:
> Battlefield 3
> COD Black Ops
> Red Dead Redemption
> Borderlands
> Red Alert 3
> Sim City 4
> Minecraft (this one actually has no option to choose a graphics card, but setting a profile for it in my Nvidia control panel works just fine).

Since my laptop is so new, these are the only games I've tried on it. However, each of these games works just fine with my Nvidia card. In fact, setting profiles for Gimp and Blender seems to force them to use my Nvidia card as well. It seems the only program I am having trouble with is Torque3D...
#4
08/10/2012 (5:52 pm)
Does the console shed any light as to what is detected?

Mine shows a DirectX Device detected and a Null device detected.

Maybe there are some messages in there that might help.
#5
08/10/2012 (8:20 pm)
Sorry for going off-topic.

Hey Weston,
what do you mean by "Red Dead Redemption"? Is that a PC version? As far as I know, there was only a console version of that game. I am eagerly waiting for that Rockstar game.
#6
08/11/2012 (10:28 am)
@ahsan,
My apologies, I meant Dead Island, not Red Dead Redemption. I have Red Dead Redemption for PS3 and Dead Island for PC and I have no idea how I got those two mixed up.

@Frank,
You know, I haven't even opened my console to see what it says. I will have to do that later tonight...
#7
08/12/2012 (1:30 am)
MAJOR FLAW. They've known for years.


#8
08/12/2012 (10:54 am)
Quote:MAJOR FLAW. They've known for years.
Yeah I can see this. I have found a few discussions on this topic here and there while searching for a solution on the Torque forums. I even found your old topic, TheGasMan, which was the most extensive and I can see where a rep from Garage Games said he would put in a ticket for it, but then months later when Torque3D 1.2 was released, there was never any fix for this issue included within it. It almost seems like they blew you (and anyone else having these issues) off completely.

I think I have pretty much narrowed it down to the Garage Games staff simply not giving a crap. Or at least that's the way they make it seem.
#9
08/12/2012 (7:08 pm)
I don't have a laptop to confirm, but this is what I found:
forums.steampowered.com/forums/showthread.php?t=1733353

It said you must hit "Apply" after making selections. I don't know if this will help or not.

However, after reading about this issue, I believe it has more to do with the operating system and drivers than with T3D. It looks to me like a power-saving scheme: use the slower, less power-hungry GPU until told to use the high-end one.

Perhaps other video games have found a way to force DirectX to behave, or Nvidia provides profiles for more games? If T3D is following the DirectX API for detecting video cards, what else can it do? I started looking at this issue in the platform layer, but I am not sure where to look. Does someone else have a clue where to check?
#10
08/13/2012 (12:46 am)
In this thread, Christopher Tauscher, who was a staff member at the time, states that Optimus-based cards are not supported by T3D. Given that this isn't mentioned anywhere in the product specs, you probably have grounds for a refund.
#11
08/13/2012 (4:58 am)
I'd love to have my money back for T3D.
#12
08/13/2012 (5:23 am)
I did some digging on the Optimus card stuff, and this looks relevant (page 16):

www.nvidia.com/attach/3039887.html?type=support&primitive=0

It seems like in order for optimus to natively support a game or platform, it has to be submitted to NVIDIA for testing, after which they push an update to the end-users.

As such, to ensure your particular game runs nicely on it natively, you'd need to submit it to nvidia for review.
Another option I've seen is to set up a custom profile for the application in your Nvidia control panel, which gets around the problem.

There may be other fixes, but that's what I'd found so far with a cursory glance about the web.
#13
08/13/2012 (5:53 am)
"..custom profile for the application in your nvidia control panel"
Has that really worked for anyone? I've forced it seven ways till Sunday and nada.
#14
08/13/2012 (8:23 am)
Quote:It said you must hit "Apply" after making selections. I don't know if this will help or not.
Hahaha, I have no idea what kind of person wouldn't hit apply after changing a setting... That being said, yes, I did hit apply. I already set a profile for Torque's .exe in my Nvidia control panel (which I mentioned in my original post), and it did not work. It works for other programs but not for Torque3D.

Quote:Christopher Tauscher, who was a staff member at the time, states that Optimus-based cards are not supported by T3D. Given that this isn't mentioned anywhere in the product specs, you probably have grounds for a refund.
Well that's just fantastic... You would figure that would be listed in the system requirements, or SOMEWHERE before you purchase Torque3D. I always figured that Torque would work perfectly fine with Nvidia Optimus seeing as they support other Nvidia technologies such as PhysX. In fact, that was the whole reason I chose an Nvidia graphics card for my laptop in the first place.

Quote:I'd love to have my money back for T3D.
I too would love to get my money back for Torque3D. Garage Games seems to be going downhill lately and there are tons of laptops with Nvidia Optimus on the market today. Any game I make with Torque3D will not support any person with Nvidia Optimus, which sucks big time. I may as well go back to using TGE and TGEA. I think I may even go try out UDK or Unity as I hear those are some nice game engines. Anyone have any experience with those two vs Torque3D?
#15
08/13/2012 (9:15 pm)
Quote:Hahaha, I have no idea what kind of person wouldn't hit apply after changing a setting...
It would depend on how well the interface is designed. I have seen lots of interfaces this would apply to (haha, an accidental pun). I am guessing someone was X-ing out of the dialog, not realizing there was an Apply button somewhere.

Also, if you read the document linked above in this thread, you might notice it says the two GPUs will look like ONE driver. People have also found that the driver still reports the Intel core even after it has switched to the Nvidia core. So even if it says Intel, it may actually be running on the Nvidia core.

So the test would be: set the driver profile for the app to the Intel core and measure performance via FPS or other means, then change the profile to the Nvidia core and measure again. The reported name of the core may never change, even when the performance does.

It might be a good idea to understand what is in your machine if you are indeed a game developer. That PDF is a gem and describes exactly what is going on in the driver. Besides, I thought it awfully fishy that the Intel core would even be able to run the engine at all...
#16
08/19/2012 (7:22 pm)
For Optimus users: please add this line in wmiVideoInfo.cpp at line 408, in the function WMIVideoInfo::_queryPropertyDxDiag:
Con::errorf("card chipset info: %s", value.c_str());

Post what it prints out. This is the video driver information.
#17
08/19/2012 (8:01 pm)
I found this too:
forums.laptopvideo2go.com/topic/26992-optimus-test-tools-finally-in-users-hands/

Make sure to scan the source. I don't have an Optimus so I cannot test this.

I am researching the proper way to enumerate the Nvidia device in DX9. So far it does not enumerate normally. I need to understand this so that when I ship a product I have a solution, rather than just blaming the middleware provider.
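For reference, the plain-vanilla way to enumerate adapters under D3D9 is `IDirect3D9::GetAdapterCount` plus `GetAdapterIdentifier`; under Optimus this typically reports only the Intel adapter. A minimal sketch (Windows-only; the non-Windows branch is just a stub so it compiles elsewhere):

```cpp
#include <string>
#include <vector>

#ifdef _WIN32
#include <d3d9.h>
#pragma comment(lib, "d3d9.lib")
#endif

// Returns the Description string of each adapter D3D9 enumerates.
// Under Optimus the discrete NVIDIA GPU is usually absent from this
// list unless the driver has already switched the process over.
std::vector<std::string> enumerateD3D9Adapters()
{
    std::vector<std::string> names;
#ifdef _WIN32
    if (IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION))
    {
        const UINT count = d3d->GetAdapterCount();
        for (UINT i = 0; i < count; ++i)
        {
            D3DADAPTER_IDENTIFIER9 id = {};
            if (SUCCEEDED(d3d->GetAdapterIdentifier(i, 0, &id)))
                names.push_back(id.Description);
        }
        d3d->Release();
    }
#endif
    // On non-Windows builds (or if D3D9 is unavailable) the list is empty.
    return names;
}
```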
#18
08/19/2012 (8:06 pm)
stackoverflow.com/questions/10535950/forcing-nvidia-gpu-programmatically-in-opti...

Wow, it may be that you have to contact Nvidia to get a profile included in the drivers for your game! Didn't someone already post that? How exactly would that be done for a middleware product like T3D? This is looking more and more like a driver issue, as there MAY NOT BE A CODE SOLUTION.

I will keep searching...
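Edit: one possible code-side lever did turn up in that Stack Overflow thread. Nvidia's Optimus rendering policy documentation describes an exported global named NvOptimusEnablement that the newer drivers (release 302 and later, if I recall) look for in the application's .exe; if it is present and nonzero, the process is routed to the discrete GPU without needing a control-panel profile. A hedged sketch; I can't confirm whether it helps a Torque build:

```cpp
// Exporting this symbol from the game .exe asks NVIDIA Optimus drivers
// (release 302+) to run the process on the discrete GPU. The
// __declspec(dllexport) only means anything on Windows; the other
// branch just keeps the file compiling elsewhere.
#ifdef _WIN32
#define OPTIMUS_EXPORT extern "C" __declspec(dllexport)
#else
#define OPTIMUS_EXPORT extern "C"
#endif

// 0x00000001 = prefer the high-performance NVIDIA GPU.
OPTIMUS_EXPORT unsigned long NvOptimusEnablement = 0x00000001;
```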
#19
08/19/2012 (9:32 pm)
Well, like I posted, according to Nvidia's paper about Optimus, that's pretty much exactly what you have to do. It may have changed since then, but my cursory look around indicated that you still have to talk to Nvidia for certification to get a game running natively on Optimus hardware.
I really have no idea how it would work with middleware. It's likely a specialized situation where you have to contact Nvidia to get it set up, etc. If that's the case, I would presume GG would look into it, but it's still a gigantic inconvenience for everyone involved because of how Nvidia has set this up.
Let me know if you find anything else Frank.
#20
08/19/2012 (9:43 pm)
I will keep posting here if I dig more up. Thanks for responding back.

I read somewhere as well that there may be a way to create a profile for the Nvidia chips, but with the Nvidia forums being down I have no way to confirm that. I did find a PDF via the Stack Overflow site that had actual code for enabling CUDA with DirectX and OpenGL. This "might" force it to use the Nvidia core. I also found something that said it may be running on the Nvidia core even when there is no way to tell with DirectX that it is.