Game Development Community

Accessing the Backbuffer efficiently

by William Todd Scott · in Torque Game Engine Advanced · 11/02/2006 (6:40 pm) · 5 replies

Hi,

I am integrating a post-processing system into TGEA and have a question on accessing the backbuffer.

Specifically, what I would like to do is set the backbuffer as a texture to sample, without copying it or creating another render target. I noticed that the DRL code deals with this issue by creating a new render target, and the refraction code copies the backbuffer over to another texture.

What I would like to do is let the engine render as usual, then set the backbuffer as a texture to sample, call pushActiveRenderSurfaces() to set a different surface as the target, and then start post processing.

So, my question is: can I do this? I don't see any way of getting a handle to the backbuffer so that I can set it as a texture. I definitely do NOT want to copy the backbuffer, because that is too slow. I could simply set my own texture as the target before the engine renders everything and use that (as in the DRL code), but I don't think that should be necessary.

I am thinking of adding a method to gfxRenderDevice() to get a pointer to the backbuffer by returning the gfxTextureObject pointer in mCurrentRTData. Does this seem reasonable?

Thanks
Todd

#1
11/03/2006 (12:15 pm)
TGEA already copies the backbuffer every frame. Call GFX->getSfxBackBuffer() to get hold of it.

That copy is used for refraction effects, and you can access it in your custom materials by setting a texture layer to "$backbuff".
#2
11/03/2006 (12:46 pm)
Hi Manoel,

I thought that copy was only done if we had materials that needed it. After reading your post I went back and, sure enough, that copy is being done every frame, whether it's needed or not. Thank you for clearing up my understanding of that bit of the code.

I am not going to use that buffer, because I can see a day when it gets replaced with something more efficient (e.g. only copying when a material actually needs it).

After more digging (and your help), I am going to do the same thing that the DRL code does, which is to allocate a render target for the post-processing framework. I believe that this is the most straightforward thing to do given the code base.

The downside is that I don't believe another render target actually needs to be allocated (for the system I am talking about or for the DRL system). If RenderDevice were extended with a getLastRenderTarget() function that returns a GFXTexHandle to the current backbuffer, then we could sample that last render target without allocating another one, or copying it when a copy isn't necessary. (That function could also call pushActiveRenderTargets() so that nobody can read from and write to the same target at once, or correct usage could be left to the programmer.)

Thanks for the help (with this and the many other silly questions I have posted).
Todd
#3
11/04/2006 (2:15 pm)
Well, ideally all post-processing effects should be written against the same post-process pipeline so that they all share common buffers where possible. I don't know if that's anywhere on the horizon, or even in the current code, but I am reasonably sure I saw something along those lines while digging through the code for some of our post-processing stuff.
#4
11/04/2006 (2:44 pm)
Hey Phil,

That is what I am suggesting, but done in a way that makes the buffer easily accessible, so that modifications to stock code are less intrusive. The codebase already has the ability to push/pop the render target stack; adding a getLastRenderTarget() function would complete the functionality.

I'm just thinking out loud here. The code is really solid, and the more I dig into it, the more convinced I become that this functionality would be straightforward to implement.

Todd
#5
11/06/2006 (9:45 am)
Anyway, you'll need to use another texture, either by making a copy of the backbuffer or rendering directly to a texture. There's no way you can use the backbuffer as a texture and draw to it at the same time.

The most you can do is get a pointer to the backbuffer surface. I'm not entirely sure you can set up a texture object using that surface (it might be possible), but even if you can, you would be unable to use it as a texture unless you're drawing to another render target, since Direct3D forbids the same buffer being used as a texture and as the render target at the same time.

I was going to roll my own post processing effects by drawing the scene to a texture instead of the backbuffer, but I ended up using the refraction buffer without problems.