Game Development Community

CPU versus GPU?

by Daniel Buckmaster · in Hardware Issues · 10/24/2009 (2:50 am) · 6 replies

I know roughly what the CPU and GPU are for, but how do they interact when it comes to games? For example, if I have a hotshot CPU, but an old GPU, presumably I would get low framerates - but is the game actually running any slower, for example in the case of remote clients who might be connected to my computer as a server? Would they experience problems if my framerate was low?

Conversely, with a great GPU but a bad CPU, would it be possible to get great frame rates? - but what would happen to the actual performance of the game if the CPU wasn't up to the task?

About the author

Studying mechatronic engineering and computer science at the University of Sydney. Game development is probably my most time-consuming hobby!


#1
10/24/2009 (5:37 pm)
It depends.

An older game with fewer fancy shaders (or none at all) will tax the CPU more than the GPU, and might not even require all that much CPU either.

If you can't find a level of graphics settings where the GPU can keep up, the game logic might also run slower. The game thinks it has to render a certain number of frames, and uses the time between screen refreshes to move players, run AI, and update the mouse (hardware mouse pointer FTW!).

If your GPU is shit-hot and your CPU should be in a museum, your GPU is barely doing anything while still rendering whatever it gets instantly. But the game logic is lagging behind, so you can still get perceived framerate lag even at high fps.

The more moving objects on screen, the more work for both the CPU and the GPU.

So if your system is slow one way or another AND the master server for the match, your players would suffer. Their symptoms might be different than yours, and could range from ghost objects or rubber-banding to disconnections.
#2
10/24/2009 (9:56 pm)
It's preferable to have a good CPU and a low-spec GPU, but in real life you need a balance.
#3
10/25/2009 (6:19 am)
You need a good balance of both; how much of each really depends on the game itself. Some are very graphically demanding (high poly counts, shaders, dynamic shadows, lots of textures, etc.) and others are very CPU-intensive (very detailed AI, physics, etc.).

If, as you mention, you have a great CPU and a bad GPU, then you'll quite probably find Torque spending a lot of time in the "SwapBuffers" area if you use the profiler (that's the name in TGEA; I can't remember the name in TGE). Basically that means the game is waiting for the GPU to catch up before moving on to the next frame, so yes, any clients connected to you will lag too, since the game as a whole runs slower.

If that's the case, you're better off running a dedicated server on your PC and then connecting to it as a client as well, especially if you have a dual-core or better CPU. Typically, modern games are more GPU-intensive than CPU-intensive, so that's where I'd put the extra investment.

#4
10/25/2009 (6:29 am)
Okay, thanks all for the info. I had guessed as much - I just got thinking about this with the MW2 dedicated server debacle. People were talking about the crappiness of P2P hosting (which apparently now means client/server with one player as the server).

Quote: If that's the case, you're better off running a dedicated server on your PC and then connecting to it as a client as well, especially if you have a dual-core or better CPU.
So in this case, the instance of Torque running as the server wouldn't get stuck in SwapBuffers, even if your client instance might? That makes sense, and I'll keep it in mind.

Quote: Typically, modern games are more GPU-intensive than CPU-intensive, so that's where I'd put the extra investment.
Since you mentioned it, I did just get a 9800GT... hopefully that'll hold up for a while ;P.
#5
10/25/2009 (8:20 am)
Quote:I just got thinking about this with the MW2 dedicated server debacle

>:(
MW2: no dedicated servers, no mods, and $10+ more (translation: console version)... No further comment.
#6
10/25/2009 (9:11 am)
Play ArmA 2 instead ;). Bohemia Interactive knows what PC gamers like.