FSAA and AF in Win32 TGE 1.5.2?
by Jason Parker · in Torque Game Engine · 01/06/2009 (7:00 pm) · 33 replies
I've spent several hours searching for functions to enable FSAA, either in the client scripts or from the console. The only function I can find is setFSAA(), and when I tried to use it I got the error "Unable to find function setFSAA()".
I would like to add FSAA and Anisotropic Filtering controls to my graphics options GUI. I can't be the first person to want to do this even though users can force both in their graphics drivers. Fact is, most of my target users don't even install their own graphics drivers, much less know how to access those functions.
I've also tried the function to enable Anisotropic Filtering, and absolutely nothing happens.
While combing the source code for "FSAA", I noticed that the console function setFSAA() sits right after a comment saying the following group is Mac-specific.
Am I going to have to add in FSAA and AF controls from the ground up, or are my search skills and eyes failing me?
#22
01/09/2009 (10:43 pm)
Found a new fullscreen bug. If you have both AF and AA turned off while in fullscreen, turn on AF and hit Apply.
Now look at the AA list. It's no longer populated.
Tested in windowed mode and couldn't reproduce the problem.
I'll try to take a look at it myself and will post if I find anything. Just be warned, this will be my first serious debug of anything more than a thousand lines of code or so... (Granted, your work on the previous bug should already narrow it down.)
At GG: What are the rules on posting code for download like the above? I have my own domain and hosting. Would a non-public link with the file index hidden do? Would a folder with a blank index.html suffice? I could also modify the folder permissions to control access.
#23
01/10/2009 (12:54 am)
It's late here, so I won't be able to take a look at this until tomorrow, but if you can, check the console after it happens. See if it is correctly printing the available modes after WGL_ARB_multisample. If so, it could be a bug in the consoleDlg.cs file; if not, it is likely a bug in the code checking for the modes.
#24
01/10/2009 (1:22 am)
When this occurs, no modes are listed under multisample in the console. Guess that means it's in the code. I'll see if I can figure anything out.

EDIT:
Here's the clip from the console log, BTW. You'll see that multisample is being enabled, but no modes are listed.
OpenGL Init: Enabled Extensions
  ARB_multitexture (Max Texture Units: 4)
  EXT_blend_color
  EXT_blend_minmax
  EXT_compiled_vertex_array
  NV_vertex_array_range
  EXT_texture_env_combine
  EXT_packed_pixels
  EXT_fog_coord
  ARB_texture_compression
  EXT_texture_compression_s3tc
  (ARB|EXT)_texture_env_add
  EXT_texture_filter_anisotropic (Max anisotropy: 16)
  WGL_EXT_swap_control
  WGL_ARB_pixel_format
  WGL_ARB_multisample
OpenGL Init: Disabled Extensions
  EXT_paletted_texture
  3DFX_texture_compression_FXT1
Update:
Just noticed that it's only breaking down under 16-bit color mode, which also led me to notice that it's resetting to 16-bit randomly between restarts of the engine. I'll have to test further, but this resetting to 16-bit leads me to believe the following code may be the issue... please correct me if I'm wrong:
// A 32bit color format to get the number of results down
float fAttribFList[] = {0,0};
int iAttribIList[] =
{
WGL_DRAW_TO_WINDOW_ARB,GL_TRUE,
WGL_SUPPORT_OPENGL_ARB,GL_TRUE,
WGL_ACCELERATION_ARB,WGL_FULL_ACCELERATION_ARB,
WGL_COLOR_BITS_ARB,32,
WGL_ALPHA_BITS_ARB,0,
WGL_DEPTH_BITS_ARB,16,
WGL_STENCIL_BITS_ARB,8,
WGL_DOUBLE_BUFFER_ARB,GL_TRUE,
WGL_SAMPLE_BUFFERS_ARB,GL_TRUE,
0,0
};

As the comment suggests, a 32-bit color format appears to be assumed when retrieving the available multisample modes.
#25
01/10/2009 (1:57 am)
Okay, I don't know how wglChoosePixelFormatARB works, and my wireless mouse batteries just reached critical while I was searching for documentation on it, so I could figure out how to make this work under 16-bit too (though I don't know why anyone would want to use 16-bit color in a game these days...).

I'll have to continue after letting the mouse charge a bit, since it has a built-in battery.
I'll check back here to see if anyone got to it before me after the mouse has a decent charge.
#26
01/10/2009 (8:47 am)
Here are the modifications I made for the Unix platform, with MSAA_V2 added beforehand:

[PART ONE]
Modification in engine\platformX86UNIX\platformGL.h:
inline bool dglDoesSupportMultisample()
{
// OK, we set it to true by default and test it later!
return true;
}
inline int dglGetMSAAModes()
{
return 11; // we say we can do 1/2/4 ;) whether it actually works is handled later
}

Add support in engine\platformX86UNIX\x86UNIXGL.cc:
// Set some console variables:
Con::setBoolVariable( "$FogCoordSupported", gGLState.suppFogCoord );
Con::setBoolVariable( "$TextureCompressionSupported", gGLState.suppTextureCompression );
Con::setBoolVariable( "$AnisotropySupported", gGLState.suppTexAnisotropic );
Con::setBoolVariable( "$PalettedTextureSupported", gGLState.suppPalettedTexture );
Con::setBoolVariable( "$SwapIntervalSupported", gGLState.suppSwapInterval );
[b]
Con::setBoolVariable( "$MultisampleSupported", 1 ); //Hack to 1 !
[/b]
if (!gGLState.suppPalettedTexture && Con::getBoolVariable("$pref::OpenGL::forcePalettedTexture",false))
{
Con::setBoolVariable("$pref::OpenGL::forcePalettedTexture", false);
Con::setBoolVariable("$pref::OpenGL::force16BitTexture", true);
}

Add FSAA in engine\platformX86UNIX\x86UNIXOGLVideo.cc:

First, check if init failed and reset fsaa:
bool InitOpenGL()
{
DisplayDevice::init();
// Get the video settings from the prefs:
const char* resString = Con::getVariable( "$pref::Video::resolution" );
char* tempBuf = new char[dStrlen( resString ) + 1];
dStrcpy( tempBuf, resString );
char* temp = dStrtok( tempBuf, " x\0" );
U32 width = ( temp ? dAtoi( temp ) : 800 );
temp = dStrtok( NULL, " x\0" );
U32 height = ( temp ? dAtoi( temp ) : 600 );
temp = dStrtok( NULL, "\0" );
U32 bpp = ( temp ? dAtoi( temp ) : 16 );
delete [] tempBuf;
bool fullScreen = Con::getBoolVariable( "$pref::Video::fullScreen" );
// the only supported video device in unix is OpenGL
if ( !Video::setDevice( "OpenGL", width, height, bpp, fullScreen ) )
{
Con::errorf("Unable to create default OpenGL mode: %d %d %d %d",
width, height, bpp, fullScreen);
[b]
SDL_GL_SetAttribute( SDL_GL_MULTISAMPLEBUFFERS, 0 ) ;
SDL_GL_SetAttribute( SDL_GL_MULTISAMPLESAMPLES, 0 ) ;
Con::setIntVariable("$pref::OpenGL::multisample",0);
[/b]
// if we can't create the default, attempt to create a "safe" window
if ( !Video::setDevice( "OpenGL", 640, 480, 16, true ) )
{
DisplayErrorAlert("Could not find a compatible OpenGL display " \
"resolution. Please check your driver configuration.");
return false;
}
}
return true;
}
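For reference, the resolution-string parsing at the top of InitOpenGL() can be reproduced as a standalone sketch. This is an illustration, not engine code: parseResolution is a hypothetical helper name, and the standard strtok/atoi stand in for the engine's dStrtok/dAtoi wrappers.

```cpp
#include <cassert>
#include <cstdlib>
#include <cstring>

struct Resolution { unsigned width, height, bpp; };

// Hypothetical standalone version of the parsing above: split a
// "$pref::Video::resolution" string such as "800 600 32" (or "800 x 600 x 32")
// on spaces and 'x', falling back to 800x600 at 16 bpp when a field is
// missing, just as the dStrtok/dAtoi code does.
Resolution parseResolution(const char* resString)
{
    char* buf = new char[strlen(resString) + 1];
    strcpy(buf, resString);

    Resolution r;
    char* temp = strtok(buf, " x");
    r.width  = temp ? (unsigned)atoi(temp) : 800;
    temp = strtok(NULL, " x");
    r.height = temp ? (unsigned)atoi(temp) : 600;
    temp = strtok(NULL, " x");
    r.bpp    = temp ? (unsigned)atoi(temp) : 16;

    delete [] buf;
    return r;
}
```

Because " x" is the delimiter set, both the space-separated form the engine writes back and the "800 x 600 x 32" form parse identically.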
#27
01/10/2009 (8:47 am)
[PART TWO]

Now again in engine\platformX86UNIX\x86UNIXOGLVideo.cc:
bool OpenGLDevice::setScreenMode( U32 width, U32 height, U32 bpp,
bool fullScreen, bool forceIt, bool repaint )
{
[b]
S32 fsaa = Con::getIntVariable("$pref::OpenGL::multisample");
[/b]
// load resolutions, this is done lazily so that we can check the setting
// of smCanSwitchBitDepth, which may be overridden by console
if (mResolutionList.size()==0)
loadResolutions();
if (mResolutionList.size()==0)
{
Con::printf("No resolutions available!");
return false;
}
if (bpp == 0)
{
// bpp comes in as "0" when it is set to "Default"
bpp = x86UNIXState->getDesktopBpp();
}
if (height == 0 || width == 0)
{
// paranoia check. set it to the default to prevent crashing
width = 800;
height = 600;
}
U32 desktopDepth = x86UNIXState->getDesktopBpp();
// if we can't switch bit depths and the requested bpp is not equal to
// the desktop bpp, set bpp to the desktop bpp
if (!smCanSwitchBitDepth &&
bpp != desktopDepth)
{
bpp = desktopDepth;
}
bool IsInList = false;
Resolution NewResolution( width, height, bpp );
// See if the desired resolution is in the list
if ( mResolutionList.size() )
{
for ( int i = 0; i < mResolutionList.size(); i++ )
{
if ( width == mResolutionList[i].w
&& height == mResolutionList[i].h
&& bpp == mResolutionList[i].bpp )
{
IsInList = true;
break;
}
}
if ( !IsInList )
{
Con::printf( "Selected resolution not available: %d %d %d",
width, height, bpp);
return false;
}
}
else
{
AssertFatal( false, "No resolution list found!!" );
}
// Here if we found a matching resolution in the list
bool needResurrect = false;
if (x86UNIXState->windowCreated())
{
Con::printf( "Killing the texture manager..." );
Game->textureKill();
needResurrect = true;
}
// Set the desired GL Attributes
SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
// JMQ: NVIDIA 2802+ doesn't like this setting for stencil size
// SDL_GL_SetAttribute(SDL_GL_STENCIL_SIZE, 8);
SDL_GL_SetAttribute(SDL_GL_ALPHA_SIZE, 0);
SDL_GL_SetAttribute(SDL_GL_ACCUM_RED_SIZE, 0);
SDL_GL_SetAttribute(SDL_GL_ACCUM_GREEN_SIZE, 0);
SDL_GL_SetAttribute(SDL_GL_ACCUM_BLUE_SIZE, 0);
SDL_GL_SetAttribute(SDL_GL_ACCUM_ALPHA_SIZE, 0);
// SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 24);
// SDL_GL_SetAttribute(SDL_GL_RED_SIZE, 5);
// SDL_GL_SetAttribute(SDL_GL_GREEN_SIZE, 6);
// SDL_GL_SetAttribute(SDL_GL_BLUE_SIZE, 5);
U32 flags = SDL_OPENGL;
if (fullScreen)
flags |= SDL_FULLSCREEN;
Con::printf( "Setting screen mode to %dx%dx%d (%s)...", width, height,
bpp, ( fullScreen ? "fs" : "w" ) );
[b]
// FSAA
if(fsaa > 0)
{
SDL_GL_SetAttribute( SDL_GL_MULTISAMPLEBUFFERS, 1 ) ;
SDL_GL_SetAttribute( SDL_GL_MULTISAMPLESAMPLES, fsaa ) ;
}
[/b]
// set the new video mode
if (SDL_SetVideoMode(width, height, bpp, flags) == NULL)
{
Con::printf("Unable to set SDL Video Mode: %s", SDL_GetError());
return false;
}
[b]
// Check if FSAA failed or not
if(fsaa > 0)
{
GLint buffers;
GLint samples;
glGetIntegerv( GL_SAMPLE_BUFFERS_ARB, & buffers ) ;
glGetIntegerv( GL_SAMPLES_ARB, & samples ) ;
// Don't fail because of this, but issue a warning.
if ( ! buffers || (samples != fsaa)) {
Con::printf("Warning, FSAA setting failed! (Result: buffers: %i, samples: %i)\n", (int)buffers, (int)samples);
Con::setIntVariable("$pref::OpenGL::multisample",samples);
}
}
[/b]
PrintGLAttributes();
// clear screen here to prevent buffer garbage from being displayed when
// video mode is switched
glClearColor(0.0, 0.0, 0.0, 0.0);
glClear(GL_COLOR_BUFFER_BIT);
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
if ( needResurrect )
{
// Reload the textures:
Con::printf( "Resurrecting the texture manager..." );
Game->textureResurrect();
}
if ( gGLState.suppSwapInterval )
setVerticalSync( !Con::getBoolVariable( "$pref::Video::disableVerticalSync" ) );
// reset the window in platform state
SDL_SysWMinfo sysinfo;
SDL_VERSION(&sysinfo.version);
if (SDL_GetWMInfo(&sysinfo) == 0)
{
Con::printf("Unable to set SDL Video Mode: %s", SDL_GetError());
return false;
}
x86UNIXState->setWindow(sysinfo.info.x11.window);
// set various other parameters
x86UNIXState->setWindowCreated(true);
smCurrentRes = NewResolution;
Platform::setWindowSize ( width, height );
smIsFullScreen = fullScreen;
Con::setBoolVariable( "$pref::Video::fullScreen", smIsFullScreen );
char tempBuf[15];
dSprintf( tempBuf, sizeof( tempBuf ), "%d %d %d",
smCurrentRes.w, smCurrentRes.h, smCurrentRes.bpp );
Con::setVariable( "$pref::Video::resolution", tempBuf );
// post a TORQUE_SETVIDEOMODE user event
SDL_Event event;
event.type = SDL_USEREVENT;
event.user.code = TORQUE_SETVIDEOMODE;
event.user.data1 = NULL;
event.user.data2 = NULL;
SDL_PushEvent(&event);
// reset the caption
SDL_WM_SetCaption(x86UNIXState->getWindowName(), NULL);
// repaint
if ( repaint )
Con::evaluate( "resetCanvas();" );
return true;
}

Finally, in engine\platformX86UNIX\gl_types.h:
#define GL_ALIASED_POINT_SIZE_RANGE 0x846D
#define GL_ALIASED_LINE_WIDTH_RANGE 0x846E
[b]
/* multisample */
#define GL_MULTISAMPLE 0x809D
#define GL_SAMPLE_ALPHA_TO_COVERAGE 0x809E
#define GL_SAMPLE_ALPHA_TO_ONE 0x809F
#define GL_SAMPLE_COVERAGE 0x80A0
#define GL_SAMPLE_BUFFERS 0x80A8
#define GL_SAMPLES 0x80A9
#define GL_SAMPLE_COVERAGE_VALUE 0x80AA
#define GL_SAMPLE_COVERAGE_INVERT 0x80AB
#define GL_MULTISAMPLE_BIT 0x20000000
[/b]
/*
 * GL_ARB_multisample
 */
#ifndef GL_ARB_multisample
#28
01/10/2009 (10:06 am)
Good job, Jason. You nailed it. Change:

// A 32bit color format to get the number of results down
float fAttribFList[] = {0,0};
int iAttribIList[] =
{
WGL_DRAW_TO_WINDOW_ARB,GL_TRUE,
WGL_SUPPORT_OPENGL_ARB,GL_TRUE,
WGL_ACCELERATION_ARB,WGL_FULL_ACCELERATION_ARB,
WGL_COLOR_BITS_ARB,32,
WGL_ALPHA_BITS_ARB,0,
WGL_DEPTH_BITS_ARB,16,
WGL_STENCIL_BITS_ARB,8,
WGL_DOUBLE_BUFFER_ARB,GL_TRUE,
WGL_SAMPLE_BUFFERS_ARB,GL_TRUE,
0,0
};

to:

// A basic format to get the number of results down
float fAttribFList[] = {0,0};
int iAttribIList[] =
{
WGL_DRAW_TO_WINDOW_ARB,GL_TRUE,
WGL_SUPPORT_OPENGL_ARB,GL_TRUE,
WGL_ACCELERATION_ARB,WGL_FULL_ACCELERATION_ARB,
WGL_ALPHA_BITS_ARB,0,
WGL_DEPTH_BITS_ARB,16,
WGL_STENCIL_BITS_ARB,8,
WGL_DOUBLE_BUFFER_ARB,GL_TRUE,
WGL_SAMPLE_BUFFERS_ARB,GL_TRUE,
0,0
};

I just removed the line "WGL_COLOR_BITS_ARB,32," and changed the comment.

A little explanation: when the desktop is switched into 16-bit mode, dwglChoosePixelFormatARB will no longer return 32-bit pixel formats. The way this function works is that you define a set of requirements and it returns every pixel format that at least matches those requirements, or none. By removing WGL_COLOR_BITS_ARB we basically told it that we don't care what the color format is.
Also, I think a fix in consoleDlg.cs can solve the problem of automatically switching to 16-bit when you set fullscreen.
Quote:
Here are the modifications I made for the Unix platform, with MSAA_V2 added beforehand:

Are you able to switch SDL_GL_MULTISAMPLESAMPLES modes successfully? I am currently working on a full Unix implementation, but SDL seems to ignore changes to SDL_GL_MULTISAMPLESAMPLES after the first time it is set.
#29
01/10/2009 (10:15 am)
Thanks. I never did find that documentation, as I got wrapped up trying to modify consoleDlg.cs to fix the 16-bit problem as you suggest. While doing that I ran into a crash that I thought was related to the new source changes for AA and AF, but after spending a few hours debugging, I doubt it.

The crash was quite specific to going from 640x480 windowed to anything fullscreen. I could not reproduce it with any other windowed resolution. I also can't reproduce it on the demo I'm working on after merging your code with it, even starting at 640x480.

EDIT: Spoke too soon, managed to make my demo crash the same way now. Heh.
#30
01/10/2009 (10:33 am)
Quote:
Are you able to switch SDL_GL_MULTISAMPLESAMPLES modes successfully? I am currently working on a full unix implementation but SDL seems to ignore changes to SDL_GL_MULTISAMPLESAMPLES after the first time it is set.
I'm not sure it switches while the program is running.
#31
02/06/2009 (3:51 am)
Just an update for anyone who cares:

I still haven't figured out the crash I mentioned in post #29. It only occurs when going from 640x480 windowed mode to fullscreen at any resolution, so I really haven't considered it a high priority.
As for general testing, I've had users testing on a variety of PCs. It appears to be working properly on a wide range of cards going back to an original GeForce 2. I also had a couple of integrated chipsets tested, and the results were as expected. One oddity occurred, but I've determined it must be a driver issue: a user had his MSAA list populate up to 32x! When he chose 32x, his framerate averaged 5 or lower on his GeForce 8600, and if he chose it and applied at the same time as changing resolution, MSAA wouldn't turn on at all.
Thanks to everyone for their help in getting this implemented and making this a great post.
#32
02/06/2009 (9:20 am)
Thanks, great resource. Has anyone integrated it with the MK (Modernization Kit) resource?
#33
02/06/2009 (11:28 am)
Try this:

In gDynamicTexture.h, at around line 220, put this at the end of inline void DynamicTexture::initDT():
mSize.x = 0; mSize.y = 0;
In gDynamicTexture.cc, around line 28, change:
else if(eventCode==TextureManager::CacheResurrected)
{
GBitmap *bmp = new GBitmap( ts->mSize.x, ts->mSize.y, false, GBitmap::RGBA );
ts->mTextureHandle = new TextureHandle( NULL, bmp, BitmapKeepTexture, true );
}

to:
else if(eventCode==TextureManager::CacheResurrected)
{
if (!(ts->mSize.x == 0 && ts->mSize.y == 0))
{
GBitmap *bmp = new GBitmap( ts->mSize.x, ts->mSize.y, false, GBitmap::RGBA );
ts->mTextureHandle = new TextureHandle( NULL, bmp, BitmapKeepTexture, true );
}
}

That will fix the crash in post #29.
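The fix boils down to a single predicate. A minimal restatement (shouldResurrect is a hypothetical name, not an engine function): on CacheResurrected, only recreate the backing bitmap for a texture that was actually given a size, since a 0x0 DynamicTexture is what crashed the 640x480-windowed-to-fullscreen switch.

```cpp
#include <cassert>

struct Size { int x, y; };

// Mirror of the guard added above: a DynamicTexture whose size was never
// initialized (still 0x0) must not have a GBitmap recreated for it when the
// texture cache is resurrected after a video mode change.
bool shouldResurrect(const Size& s)
{
    return !(s.x == 0 && s.y == 0);
}
```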
Torque 3D Owner Jason Parker