Game Development Community

Torque likes it hot!

by James Dunmow · in Torque Game Engine · 12/08/2004 (5:59 am) · 22 replies

Just thought I'd throw this out to see if anyone else was having a similar experience with the Torque engine...

Namely...whenever I run Torque, whether stock, modified, the RTS kit or whatnot, my CPU and video card temperatures go through the roof. It seems to make my system run much hotter than even a new and supposedly graphically intense game.

eg.

Running an AMD XP 3200+ 32bit with a Sapphire Radeon 9800XT

My machine sitting idle: CPU Temp - 45c, GPU Temp - 56c

Fire up Torque. Use for 10min or so: CPU Temp - 56c, GPU Temp - 67c

Fire up Vampire: Bloodlines (Half-Life 2 engine), play for 30 mins: CPU Temp - 51c, GPU Temp - 61c

Anyone else notice anything similar? Anyone know why Torque is such a beast? Nothing else I run makes my hardware reach these temperatures.
#1
12/08/2004 (6:24 am)
Nope, cold as ice here..
#2
12/08/2004 (7:12 am)
I had a similar problem on my Prescott 3.0 P4. The CPU reached 'round 70c and the comp restarted.
I bought a (much) better cooling solution which helped a lot.

FYI, I basically get the same results in Tribes2 so I guess it's related to some early code inside TGE...
#3
12/08/2004 (9:02 am)
I'd be interested in finding out what's triggering this. :) Are you running a release build?
#4
12/08/2004 (9:23 am)
I was running a release build of the RTS kit in windowed mode (1280x1024/32bit) when I saw the example I gave above. But I've seen it happen to a lesser extent with the plain old TGE build. I'm wondering if it has anything to do with running in windowed mode while running other apps at the same time, e.g. a code editor, Firefox, an email program, etc.

So it may very well be self-inflicted. I'll fire it up by itself in fullscreen/windowed and solo/multitasking and see if it makes any difference.
#5
12/08/2004 (10:58 am)
Cool.
#6
12/08/2004 (1:15 pm)
And the results are in! Not the most exciting afternoon I've ever spent, but here we go...

Test system

Asus A7V600
AMD XP 3200+
Radeon 9800XT
2x512MB Samsung DDR400
Catalyst 4.12 Beta drivers
DirectX 9.0c SDK (October update) running in Release mode

All tests done with OpenGL 32bit, no AA/AF or Vsync, High Quality settings, Stock build unless noted otherwise.

Fired up Torque. Let sit 10min, neither shaken nor stirred ;)

All temps are in centigrade; the first number is the CPU temp, the second is the GPU temp.

-----------------------------------------------------------------

RTS Kit - 1600x1200 - Release/Fullscreen - 54/66

RTS Kit - 1280x1024 - Release/Window - 53/63

Torque 1.3 - 1280x1024 - Debug/Window - Stock FPS starter level - 51/61

Lighting 1.3 - 1280x1024 - Debug/Window - Nighttime village - 51/60

RTS Kit with Lighting Pack installed - 1280x1024 - Release/Window - 50/61

RTS Kit with Lighting Pak 1.2 integrated - 1280x1024 - Release/Window - Big volume light in scene - 54/65

TSE - 1280x1024 - Debug/Window - Funky per-vertex/pixel spheres - 51/63

RTS Kit - 1280x1024 - Release/Window - Camera straight down/max zoom in on terrain & units - 54/67 (!) - RTS units removed - 53/66 - Left running in background - 48/60

Light Pak 1.2 - Goo example - 1280x1024 - Debug/Window - 53/63

RTS Kit - 1280x1024 - Debug/Window - Camera straight down/max zoom in on terrain & units - 50/60

RTS Kit - 1600x1200 - Debug/Fullscreen - Camera straight down/max zoom in on terrain & units - 50/61

--------------------------------


So, unless I'm missing something (and this was by no means a scientific temp benchmark, heh) the problem actually seems to be with the RTS Starter kit, and not Torque as I'd assumed. Shall we move this discussion to the RTS forums?
#7
12/08/2004 (4:31 pm)
I have a Sony laptop, and whenever I run Torque, even at the menu, it kicks in the fan and overheats, messing with the 3D display (2D stays fine). Mind you, pretty much any game does this. And to think, I bought the laptop to play games on... I knew I shouldn't have bought it with an SiS chipset, even if it was the best laptop with a 16" screen and an FX5600 Go.

Of course I could just move to the arctic circle, it should stay cool enough to work then....
#8
12/09/2004 (11:49 am)
Mark, sounds like you have some bad memory. I have the one with the 17" screen and an ATI Radeon 7500 Mobile, and it plays games fine. Try different memory.
#9
12/09/2004 (12:00 pm)
I have a P4 3.0 with a GF FX6800 GT and the temps stay nice and cool... except when I fire up Doom 3.

- Brett
#10
12/10/2004 (6:35 am)
This is a scary one...

Cranked open stock Torque. Fiddled with game for 10mins. Cpu - 51, GPU - 62
Cranked open GUI Editor (f10) Fiddled with editor for 10mins - Cpu - 53, GPU - 61
Exited mission.
Opened GUI Editor again from main menu, let sit 5-6 mins- Cpu - 55, GPU - 65
Opened PlayGUI in Editor - displaying blank teal GUI - CPU 56, GPU - 68 (!) Did not need to let this sit for long. Temp increase was rapid on the GPU.

First noticed this playing with the GUI editor in the RTS Kit then went to check if stock Torque did the same thing. It does.

Having had the Torque window with the GUI editor open on the same page for the length of time it took me to write this message (3 mins or so), my CPU/GPU temps are back to 48/60.

PS - This is not a system cooling problem, as I'm hard pressed to get -any- other app or game to push my system to 51-53degs on the CPU and 62-63 on the GPU even over long periods of use.

PPS - Tested this again and it took a whole 4mins for CPU to go from 45->54 and GPU to go from 59->68. Also, as a side note when I noticed this with the RTS Kit it was a release build. When I tested this on stock Torque it was debug. So it doesn't seem to make a difference.
#11
12/10/2004 (9:56 am)
How does Doom3 treat your system?
#12
12/10/2004 (10:54 am)
I un-installed it awhile ago, but I certainly don't recall getting anywhere near these sorts of temperatures with it - even back in August when the ambient temperatures would have been higher in the summer. Vampire: Bloodlines doesn't break 60-61 deg on the GPU. But, if there's something particular about D3 that you'd like me to check out to compare I can load it back in and fire it up.
#13
12/10/2004 (12:25 pm)
Huh. Well it comes close. 20mins of Doom3 on Ultra settings 1024x768 brings the GPU to 65deg and the CPU to 51.
#14
12/11/2004 (9:34 pm)
Hey I found the problem.
Change this line (line 37) in the script file in common/client/client.cs:

$prefs::hardware::gpuOverheat = true;

Change this to false. I did it and now my CPU and GPU temps go down when I run Torque!

Guess this setting must have been an oversight. Good thing it was so easy to fix. : )
#15
12/11/2004 (9:37 pm)
BTW, my previous post was a joke to anyone who cannot find that setting.

I mean, seriously, can we really be expected to worry about CPU/GPU temps on hardware? The more features you use of a piece of hardware the more work it does. Hence the hotter it will get. Either reduce the work, get a better engine, or add a better cooling system.
#16
12/11/2004 (10:20 pm)
So I should not only ignore what might be causing this, but also ignore Ben's request to help him find the cause of this? I don't expect you to be worried about CPU/GPU temperatures at all. If it doesn't concern you, well...it doesn't.

But me, I get curious when something acts contrary to my assumptions. I try to find out why. When a plain GUI editor console makes my GPU overheat more than Doom 3 I cannot help but ask myself 'what on earth might be causing this?'.

The temperatures aren't critical at all. The GPU can take over 100C, and the cooling system is fine, so I'm not concerned for my hardware. But when something generates more heat than I expected, you're quite right, I try to find a way to make it do less work and so generate less heat. To achieve this I have to understand what sort of work the program is doing to generate that heat on the GPU. A simple GUI editor shouldn't do this, I wouldn't think.

Maybe it's not Torque. Maybe it's the ATI beta drivers. Maybe it's OpenGL. But anyway...I digress...it could be a lot of things, and that's not really the point. I like mysteries and solving problems. Big ones, small ones, whatever. If this particular one piques your interest keep on reading. If not, well, there are lots of other threads out there to read ;)
#17
12/12/2004 (2:40 am)
@Jarrod

It was the first laptop available from Sony with the diamond black screens, and it seems to happen to all of them: I got mine, and two other people I know got them too, and theirs do it as well, so I'm pretty sure it's a design flaw. It could be a bad batch of video RAM, but that's unlikely.
#18
12/12/2004 (3:45 am)
Ultra on Doom3 will make the temperature go up, don't worry.
#19
12/12/2004 (5:07 am)
Hehe... Was gonna say... FarCry makes my machine spike over 60C, but I guess that's what happens when you put shaders on foliage billboards. :)
#20
12/14/2004 (5:34 pm)
Okay, the GUI editor probably is not optimized and uses lots of calls to send triangles and other shapes through the OpenGL pipeline. Using display lists or similar techniques could reduce the traffic, and sticking to triangles only could speed up rendering. This might reduce the number of "widgets" in the silicon that are being used.

Let us, for discussion, consider a "widget" to be a part of the silicon that handles a specific block of logic in the GPU for some task. Let us also assume the specific widget is not important, but the number of concurrent tasks taking place in the silicon is. If we reduce the number of different widgets, by only using triangles in a scene (a scene being what is on the screen), we should reduce the silicon being used and thus the heat.

Now let us assume that each widget has a heat value, with the most optimized widgets being the lowest heat producers. (Why electrically optimize widgets that are not often used?) Then the code should use the most optimized widgets.

Here is a list of widgets to look at (please, OpenGL gurus, jump in and define widgets that are not mentioned here):

1. data transfer from the CPU to the GPU; heavy transfer should increase the widgets used in both the GPU and CPU
2. only sending data once (display lists?)
3. don't overrun the texture RAM repeatedly; thrashing it will probably cause #1
4. allow settings that reduce the overhead of the game rendering (scalable rendering, like in the Q3A options)
5. produce a warning that says your GPU is hot (get a better card?)

Hopefully that will give you an idea of what might be happening in the card. Also, the more features of the GPU/CPU you use the hotter they will run.