Game Development Community

1.7.2 Not reading my PC correctly

by Nicolai Dutka · in Torque Game Builder · 04/15/2008 (11:15 am) · 18 replies

I've been developing a game on my desktop for a few weeks now and it is coming along really nicely. I tried out the project on my laptop and it ran like complete crap. All I am doing is changing a picture 30x per second:

schedule(33,0,runThis)

And it chops REALLY badly: it takes 10 seconds to play 1 second of images and only shows TWO of them (getting stuck on each).

I thought I'd check out the console log and found that TGB is reading the laptop as an "Unknown Pentium Pro II/III 1.4ghz"... Uh... my laptop is a Pentium M 1.98ghz...

So wtf is going on? How can I fix this? I need this game to be able to run on lower end machines and honestly, updating an image 30x/second... my cell phone can do that...

I can play Elder Scrolls 4: Oblivion on this laptop (low settings, but it works fine and looks decent), but I can't run a simple 2d top-down shooter? I mean, even TGE 1.5.2 runs my 3d games just fine on the laptop, why is TGB any different?

What can I do??

PS: Once I get past the menu with the changing images (flip book style :P ), the rest of the game runs fine. I have stuff scheduled to run up to 100x/second, with multiple ones on screen at a time. So why does it only run crappy when changing JPGs?
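For context, the loop being described is roughly this. This is a reconstruction, not the actual code -- `runThis`, `$frameCount`, and the control name `MenuBitmap` are placeholder names, and the if-guard is the one the poster mentions later in the thread:

```torquescript
// Flip-book menu loop: swap the bitmap on a GUI control every 33 ms (~30 fps).
// MenuBitmap and $frameCount are placeholder names, not from the real project.
function runThis()
{
   $frameCount++;
   if ($frameCount < 30)  // if-guard so the loop stops after ~1 second
   {
      MenuBitmap.setBitmap("~/data/menu/frame" @ $frameCount @ ".jpg");
      schedule(33, 0, runThis);
   }
}
```

Every tick here re-enters the script interpreter and pushes a fresh bitmap at the renderer, which is the pattern the replies below are reacting to.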

#1
04/15/2008 (11:32 am)
I just went and tested the game on my kid's PC (P4 1.8Ghz) and it reads it properly (console says P4 1.8ghz) but the display is completely screwed. All I see are a bunch of colored blocks. Like when you put in an old NES game and the dust particles cause the graphics to mess up. I know their video card isn't all that great, but seriously, how good of a video card is needed to display a jpg???

I brought the game upstairs and tested on my mom's old PC. It's a P3 1ghz with almost nothing for a video card. The game would not even come on. It just crashes right away. BUT, the console does say it is reading the CPU correctly as a P3 1ghz.

So does TGB have issues reading the Pentium M, or is it just MY laptop it is having a problem with?
#2
04/15/2008 (12:42 pm)
Well, I just noticed it's not reading my desktop correctly either, yet it still runs just fine on that. It says for the desktop:
"Unknown Pentium Pro II/III 3.05ghz"

This also isn't true because I am running a Pentium QX6850 Quad Core, so it should be seeing FOUR processors at 3.05ghz each. I assume TGB doesn't support multiple cores then, but this still does not explain why my laptop won't show JPGs at 30 frames/sec...
#3
04/17/2008 (9:58 am)
Are you updating the GUI this way? If so, why not use a Theora control instead?
#4
04/17/2008 (10:52 am)
I did switch to using a Theora control a few days ago and it is working ok, but not great.

I get a white flash at the beginning of the video which seems to last longer depending on how much system resources are available and how powerful the PC is. I masked it for the most part, but still see it sometimes.
#5
04/17/2008 (10:59 am)
schedule(33,0,runThis)

This is a really, really bad idea. Any time you are driving such high-frequency updates from script, you are going to run into performance issues--it takes a large number of CPU cycles to set up and tear down parser execution stacks. That's especially wasteful when many of these high-frequency updates are already available to you on various object types, driven by the engine (C++) instead of script.

We can't really tell what exactly you are wanting to do, but as David mentions, there are probably much better techniques for what you want to do, so if you could explain a bit about what your goal is, we can probably point to a better technique :)
#6
04/17/2008 (11:04 am)
That schedule is set under an if statement which controls how many times it can run, so it won't go for more than 1 second.

I am making a fully 3D menu in Torque 2D using frame-by-frame renders from 3ds Max.

I did go ahead and change to using a Theora control. Performance is a million times better, but quality isn't nearly as good. Please see my previous post about white flashes.

Also note that no matter how high quality I render the video in Max, the "AVI to OGG" converter ALWAYS reduces the quality. Specifically, I lose my "super black" background for a more faded black, and the rest of my colors come out a little blurry and pixelated.
#7
04/17/2008 (11:07 am)
Oh yeah, I've used a "schedule(10,0,runThis)" before for updating objects (visibility, position, size, etc) and even my mom's old crappy PC can run it just fine. It only seems to be a problem when the schedules actually do "setBitmap" because it has to load the 30 bitmaps into memory per second. If I reduce the image quality, it works ok, but then looks like crap.

In a give and take world, Theora is working best....
In a perfect world, my method would be best for quality....
#8
04/17/2008 (11:13 am)
Based on what you've said so far, my guess would be that your performance issue is sending the textures to the video card. It sounds as if you are kicking up 30 different textures a second from script, and based on your statements of "quality", I'm guessing those are pretty big textures (what is their size? 1024x1024? 256x256?).

A better strategy might be to push the textures (after optimizing them for size of course) into a texture sheet, then reading it into the engine as an animated sprite.
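A minimal sketch of that approach. Caveat: the datablock and field names below (a `t2dImageMapDatablock` in CELL mode feeding a `t2dAnimationDatablock`) are from memory of TGB 1.x and may not match your version exactly, so treat this as an outline and check it against your local docs:

```torquescript
// Pack the frames into one image map sliced into cells, then let the
// engine drive playback instead of per-frame script calls.
new t2dImageMapDatablock(menuFramesImageMap)
{
   imageName = "~/data/menu/frameSheet.png";
   imageMode = "CELL";
   cellWidth  = 256;   // per-frame cell size -- whatever fits your sheet
   cellHeight = 192;
};

new t2dAnimationDatablock(menuAnimation)
{
   imageMap        = "menuFramesImageMap";
   animationFrames = "0 1 2 3 4 5";  // cell indices, in playback order
   animationTime   = 0.2;            // seconds for the whole sequence
   animationCycle  = false;          // play once, flip-book style
};

%sprite = new t2dAnimatedSprite();
%sprite.playAnimation(menuAnimation);
```

The win is that frame advancement happens in C++, with no script call and no texture upload per frame.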
#9
04/17/2008 (11:24 am)
Exactly right Stephen, they are 800x600 jpg's reduced to 60% quality for a total of 50kb each. They HAVE to be full screen images, so making them smaller and stretching them to fit looks REALLY bad.

I don't know anything about a texture sheet, unless that is the same thing as a cell image? If so, what is the maximum size I can use and what is the maximum size you'd recommend using?

I did think about adding the images one at a time into a datablock and making them into a linked image, but that seemed very time consuming and redundant in code when I am working with 450 images....

If I could get around the 'white flash' and if I could get the 'AVI to OGG' converter to do better quality files, I think that would be the best route.

Let's not forget, 450 jpg's, even at 50kb each, totals over 22mb whereas the OGG files total more like 2mb...
#10
04/17/2008 (11:26 am)
I also tried adding the 450 images individually to the GUI file... don't try that. :P

I figured it would force the images to preload into video memory so they would display smoothly and quickly, but it ended up making the game freeze when you turn it on.
#11
04/17/2008 (11:35 am)
Yeah, with that number of textures, you absolutely without a doubt don't want to be pushing them to your video card via script, and also don't want to be pushing them individually.

I can't help much with the AVI to OGG converter stuff (never messed with it myself), but it would probably be your best bet. Alternatively, if you have a source code license (I assume you do not), you could integrate a Flash/SWF interface and let that library take care of that side of things.

Please take this in the right way: you are definitely aiming quite high (3D Gui, in a 2D engine--wow!), and I'm reasonably confident there is a solution that will work for you in the long run, but it may take quite a bit of effort to get things perfect for you.
#12
04/17/2008 (11:39 am)
I absolutely AM aiming high. This project is supposed to display a culmination of everything I have learned at Brown College in the last 4 years. It is my final project (I graduate in June!!) and I want it to be nothing short of my very best.

I actually really do have the menu looking pretty good at this point, especially considering the challenge. I think I will keep what I have (Theora and OGG), but look for better quality converters. Can the Theora play any other file types than OGG?

Thanks for all the help! Next stop: Networking!
#13
04/17/2008 (11:45 am)
I thought it was interesting to note this and I am wondering why:

When I put the 450 images (totalling 22mb) in the GUI file, my RAM usage goes to a little over 500mb. Why would it take 500mb of RAM to load 22mb of images into my video card? (RAM stays at 500mb+ until I shut off the game. Using OGG, the game totals under 34mb of RAM. I have no idea how much video memory is being used in either case...)
#14
04/17/2008 (11:51 am)
Well, for one, video cards want textures in powers of 2, so your 800 by 600 textures are getting inflated to 1024x1024 (that right there explains the ~450mb of memory use, plus/minus).

In addition, if I remember correctly, TGB is also generating mips for your textures (I could be wrong on this, has been a while since I looked at it) so that the video card can reduce the texture as appropriate, which also adds to the actual texture size being stored in video memory, and then sent to the video card.
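Rough numbers to make that concrete, assuming uncompressed 32-bit RGBA (TGB may well store these in a 16-bit format, which would halve everything here):

```torquescript
// Each 800x600 JPG decompresses to raw pixels once loaded:
%rawBytes    = 800 * 600 * 4;        // ~1.83mb per frame
// Padded up to the nearest power of two for the video card:
%paddedBytes = 1024 * 1024 * 4;      // 4mb per frame
// A full mip chain adds roughly another third on top:
%mippedBytes = %paddedBytes * 4 / 3; // ~5.3mb per frame

// 450 frames comes to ~820mb raw at 32-bit (~410mb at 16-bit,
// which is in the same ballpark as the 500mb+ observed), and
// considerably more once padded and mipped.
%totalMb = %rawBytes * 450 / 1048576;
echo("450 raw frames: " @ %totalMb @ " mb");
```

So the 22mb figure is the compressed JPG size on disk; in memory the frames are raw pixel data, which is where the hundreds of megabytes come from.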
#15
04/17/2008 (11:53 am)
Ok cool. That all makes sense.

Now... I need to get better quality OGG files.

Can Theora controls play any other file types?
#16
04/17/2008 (1:54 pm)
Can we get some screenshots of what you're trying to do? I've tried doing 3D menus in 2D before, in Shockwave back in '99 and in Doom 3's GUI system. Depending on what you're doing, there could be very efficient ways to do it, but normally it's a lot more work than just playing back video frames.
#17
04/17/2008 (1:58 pm)
They are also not compressed images when loaded into memory.

Theora cannot play other formats.
#18
04/17/2008 (2:03 pm)
I was going to use 3d shapes and script the gui, but the "Hair & Fur" in Max will not export with the shape file.

I'll provide a clip of my video when I get some time, might take a day or two.