Game Development Community

Wetness

by Joshua Horns · in Torque 3D Professional · 08/04/2009 (7:49 am) · 125 replies

Rumor has it the wetness effect won't be in the release, so I'd like to think about making one of my own.

To that end I'd like to view the wetness demo that was displayed a few months back but it appears to have been pulled. Is there any way to put that up on the site?

** NM found it on Gerhard's blog.
#1
08/04/2009 (8:56 am)
If you want a good starting point, pretty much ignore what he goes over on his blog regarding MRT's, and start with PostEffects. In a PostEffect you have access to the world-space normal, position, and lighting information. This actually gives you all you need to do the same effect.

An object appears "wet" (usually) due to a color change and a specularity change. The light buffer contains specular information for all objects (specular is masked/exponentiated in the forward pass), as well as light color. The color of a specular highlight on water depends mostly on the light/water color, not so much on the color of the material the water is on.

The last bit of the effect in the video is the scrolling texture to simulate water running off surfaces. In order to do this, construct a texture co-ordinate using the XY parts of the world space position, then animate it using the 'accumTime' shader constant, and computing a "downhill" vector.
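A rough CPU-side sketch of that co-ordinate construction (plain Python standing in for the shader; tile_scale and flow_speed are made-up illustrative parameters, not engine constants):

```python
def scroll_uv(world_pos, downhill, accum_time, tile_scale=0.25, flow_speed=0.1):
    """Build a texture co-ordinate from the XY of a world-space position,
    then scroll it along the (assumed unit-length) downhill XY over time."""
    u = world_pos[0] * tile_scale + downhill[0] * flow_speed * accum_time
    v = world_pos[1] * tile_scale + downhill[1] * flow_speed * accum_time
    # Wrap into [0, 1) the way a repeating sampler would.
    return (u % 1.0, v % 1.0)
```

In the actual shader the same math runs per pixel, with 'accumTime' bound as a shader constant.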

I am computing "downhill" by doing:
side = cross(normal, (0, 0, -1))
downhill = cross(normal, side)
Which works in most cases but seems to be reversed on others. I haven't looked further into this to see if it's the result of inverse UV mapping, or what.
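In plain Python (just to check the math, not engine code), that construction looks like this; the comment notes the sign issue mentioned above:

```python
import math

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def normalize(v):
    l = math.sqrt(v[0] ** 2 + v[1] ** 2 + v[2] ** 2)
    return (v[0] / l, v[1] / l, v[2] / l)

def downhill_dir(normal):
    # side is horizontal: perpendicular to both the normal and gravity.
    side = cross(normal, (0.0, 0.0, -1.0))
    # The result lies in the surface's tangent plane; with this operand
    # order it can come out pointing uphill on some surfaces, which may
    # be the reversal noted above -- swapping the operands flips the sign.
    return normalize(cross(normal, side))
```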

The other issue with this is that I have found I don't actually want the detail normal of the surface, I just want the polygon normal. Having the bump-map data really kind of messes things up. The answer may be to downsample the normals in the GBuffer, I haven't poked at it in a little bit.
#2
08/04/2009 (9:20 am)
Thanks for the primer!

Quote:
1. side = cross(normal, (0, 0, -1))
2. downhill = cross(normal, side)

If the normal and down vector (0,0,-1) were co-linear this would break (normal is pointing straight up or down). Other than that it should work... very similar to how a camera frame of reference is created with view and up vectors, but you know that I'm sure. I like it.
#3
08/04/2009 (9:59 am)
Quote:If the normal and down vector (0,0,-1) were co-linear this would break (normal is pointing straight up or down). Other than that it should work... very similar to how a camera frame of reference is created with view and up vectors, but you know that I'm sure. I like it.

True... but if you had a co-linear normal, that would mean the ground is flat, and it shouldn't have the water "scrolling" anyway (no gravity pulling it downhill)?

Of course, that might not give as good an effect, and it wouldn't handle flat ground that leads into a downhill section where the water would flow together, I suppose...
#4
08/04/2009 (10:53 am)
You are both right. You can avoid that case with a dot() check if you really want to, but like Bryan said, it doesn't much matter.
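A sketch of that guard (plain Python; the eps threshold is an arbitrary illustrative value):

```python
import math

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def safe_downhill(normal, eps=1e-4):
    down = (0.0, 0.0, -1.0)
    # dot(normal, down) near +/-1 means the normal is colinear with
    # gravity, i.e. flat ground: no downhill direction, skip the effect.
    d = sum(a * b for a, b in zip(normal, down))
    if abs(d) > 1.0 - eps:
        return None
    side = cross(normal, down)
    dh = cross(normal, side)
    l = math.sqrt(sum(c * c for c in dh))
    return tuple(c / l for c in dh)
```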

This is a very early shot, taken before I was creating texture co-ordinates from world-space position, which is why things "move" when the camera moves. The implementation with tcoords derived from WS position doesn't do that.

As you can see...it's mostly right. It needs more work, but I think that the concept is solid.
#5
08/04/2009 (1:04 pm)
@Josh and Pat
If either of you get this working, please post it as a resource. I think it would be a fantastic addition to the great resources already here!
#6
08/05/2009 (3:40 am)
I have a couple more questions.

Quote:In a PostEffect you have access to the world-space normal, position, and lighting information.


struct PFXVertToPix
{
   float4 hpos       : POSITION;
   float2 uv0        : TEXCOORD0;   
   float2 uv1        : TEXCOORD1;    
   float2 uv2        : TEXCOORD2;
   float2 uv3        : TEXCOORD3;
   float3 wsEyeRay   : TEXCOORD4;
};

Where are the rtparams defined and which are mapped to these?

Does texture[#] = #SomeBuffer resolve to an rtParam?

And does Texture[#] = $SomeMap resolve to a sampler?

Where are things like #prepass, #lightinfo defined... how do I know what the texture buffers are that I can include? Is there a list? Is there a "specBuffer"?

I'm guessing things like #diffuseMap and #specularMap (don't remember the exact terms) correspond to the editor's mapping options. Then you have $bumpMap, $refractMap...

To sum up this question (it's early so forgive me if it's not coherent):

How do I access the normal, specular ... etc buffer data from the postEffect? Is there a file or some place in the code where the various #"buffers" are defined or explained?
#7
08/05/2009 (9:55 am)
The #prepass buffer is created by the PrePassRenderMgr, and the #lightinfo buffer is created by the AdvancedLightBinMgr.

To get the normal, you need to bind "#prepass" to a texture, #include "shadergen:/autogenConditioners.h", and then use 'prepassUncondition' in the shader. To get specular information, you need "#lightinfo" and 'lightinfoUncondition'.

This functionality really isn't described anywhere. There are several examples in PostFx though.
#8
08/05/2009 (10:36 am)
Thanks again for the quick response, that helps explain things.

Quote:This functionality really isn't described anywhere. There are several examples in PostFx though

I seem to maintain other people's code for a living, so when I'm off work I get a little lazy about researching exactly how the code underneath operates. (Praying that these sorts of things are documented at release.)

Again, big help. Thanks for the clarification.
#9
08/05/2009 (11:47 am)
Yeah the whole conditioner/unconditioner and named-buffer system is kind of complicated, but it really makes a lot of sense once you get it.

I need to clean up my shader and post effect, and then I will post them for reference. I tried downsampling the G-buffer for normals, and that (unfortunately) resulted in what I thought it would... low-resolution, high-frequency normal data. I need to think about it some more, but if there isn't a way to either downsample or extrapolate the surface normal (and not the detail normal), then I think this will take another render pass, or MRT. I think a low-res render pass is preferable to MRT, but that really depends on scene complexity, target requirements, and other factors.

What I am thinking about now is using the normal to sample from a low-res cube/paraboloid map which essentially serves to "fit" the normals into a few different discrete areas.
#10
08/05/2009 (12:27 pm)
Maybe if you took a gaussian distribution of the detail normals and the normals of adjacent fragments, the same way blurring is done, you can get a reasonable approximation of the surface normals.
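Something like this, maybe (a 1-D strip of normals standing in for a row of G-buffer texels; the kernel weights are illustrative):

```python
import math

def blur_normals(normals, kernel=(0.25, 0.5, 0.25)):
    """Weighted average of each normal with its neighbours, then
    renormalise, the same way a separable blur pass would."""
    out = []
    r = len(kernel) // 2
    for i in range(len(normals)):
        acc = [0.0, 0.0, 0.0]
        for k, w in enumerate(kernel):
            j = min(max(i + k - r, 0), len(normals) - 1)  # clamp at edges
            for c in range(3):
                acc[c] += w * normals[j][c]
        l = math.sqrt(sum(c * c for c in acc))
        out.append(tuple(c / l for c in acc))
    return out
```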
#11
08/05/2009 (1:27 pm)
Have you tried first derivative of gbuffer world positions (ddx, ddy) as a crude approximation to world normal?

Caveat: I have flu, so my brain is not working well and this might be a silly idea...
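Sketched on the CPU with explicit neighbour differences (in the shader, ddx/ddy would provide the two deltas per pixel):

```python
import math

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def face_normal(p, p_right, p_down):
    """Approximate the surface normal from a pixel's world position and
    its +x / +y screen neighbours, mimicking cross(ddx(pos), ddy(pos))."""
    dx = tuple(b - a for a, b in zip(p, p_right))
    dy = tuple(b - a for a, b in zip(p, p_down))
    n = cross(dx, dy)
    l = math.sqrt(sum(c * c for c in n))
    return tuple(c / l for c in n)
```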

#12
08/05/2009 (7:18 pm)
I really enjoyed this discussion even though it is way over my head!

@Pat from the little bit I could follow, the problem you expressed in post #9 may be related to the normal data's frequency exceeding what the mesh can represent, i.e. wavelengths shorter than twice the mesh edge length.

This is a link to a very thorough treatment of shader water effects from GPU Gems:

http.developer.nvidia.com/GPUGems/gpugems_ch01.html

From GPU Gems:
"1.3.3 Edge-Length Filtering
If you are already familiar with signal-processing theory, then you readily appreciate that the shortest wavelength we can use to undulate our mesh depends on how finely tessellated the mesh is. From the Nyquist theorem, we need our vertices to be separated by at most half the shortest wavelength we are using. If that doesn't seem obvious, refer to Figure 1-8, which gives an intuitive, if nonrigorous, explanation of the concept. As long as the edges of the triangles in the mesh are short compared to the wavelengths in our height function, the surface will look good. When the edge lengths get as long as, or longer than, half the shortest wavelengths in our function, we see objectionable artifacts.


Figure 1-8 Matching Wave Frequencies to Tessellation

One reasonable and common approach is to decide in advance what the shortest wavelength in our height function will be and then tessellate the mesh so that all edges are somewhat shorter than that wavelength. In this work we take another approach: We look at the edge lengths in the neighborhood of a vertex and then filter out waves that are "too short" to be represented well in that neighborhood.

This technique has two immediate benefits. First, any wavelengths can be fed into the vertex processing unit without undesirable artifacts, regardless of the tessellation of the mesh. This allows the simulation to generate wavelengths solely based on the current weather conditions. Any wavelengths too small to undulate the mesh are filtered out with an attenuation value that goes from 1 when the wavelength is 4 times the edge length, to 0 when the wavelength is twice the edge length. Second, the mesh need not be uniformly tessellated. More triangles may be devoted to areas of interest. Those areas of less importance, with fewer triangles and longer edges, will be flatter, but they will have no objectionable artifacts. An example would be modeling a cove heavily tessellated near the shore, using larger and larger triangles as the water extends out to the horizon."
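The attenuation ramp described in that last paragraph can be sketched as follows (a plain linear ramp; the quote does not specify the exact falloff curve, so that part is an assumption):

```python
def wave_attenuation(wavelength, edge_length):
    """1.0 when wavelength >= 4 * edge_length, 0.0 at or below the
    Nyquist limit of 2 * edge_length, linear in between."""
    lo, hi = 2.0 * edge_length, 4.0 * edge_length
    t = (wavelength - lo) / (hi - lo)
    return min(max(t, 0.0), 1.0)
```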



#13
08/05/2009 (8:31 pm)
Dean I think you may be on to something. Up until now I have been thinking about this from the perspective of making the existing Gbuffer normal work for this purpose; you are proposing constructing a new approximation using world-space position. I like this idea; I will need to mess with it.

Anuther1: I think that is solving a different problem, but is still interesting.
#14
08/06/2009 (3:17 am)
Clearly I don't understand the problem.

Here is what I think it is:
You have a normal in world space (normal is from a bump map, not surface).
You want a normal in world space (surface normal, not from bump map).

Once you have that, in order to scroll the texture Down( in world space) you want to determine the down vector relative to the object surface.

Am I even close?
#15
08/06/2009 (6:19 am)
Quote:Have you tried first derivative of gbuffer world positions (ddx, ddy) as a crude approximation to world normal?

Bear in mind I have no idea how the GBuffer (deferred shading?) works when reading what I'm about to write =).

Is ddx, ddy data that's held in one of the G-Buffer render targets? If so, is there a ddz? Does it store partial derivatives with respect to x, y, z in order to describe the positions of a surface?

If that's the case are you saying you are using partial derivatives to determine a gradient on the surface?

If you can determine the gradient on a 3 dimensional surface... I don't think you need to bother with a normal. You have your down vector.
#16
08/06/2009 (10:05 am)
Joshua,
No, you have it right. What we have is a normal in world space, from a bump map, and we are trying to determine the down vector relative to the object surface. The problem is that, I think, the normal stored in the G-buffer is a red herring. The data is very close to what we want... but it is only close, and I am not sure any amount of processing (downsampling, Gaussian blurring, etc.) will get us the result we want.

ddx/ddy are shader instructions, other than that, you have it right. I think Dean is on to something with ignoring the per-pixel normal entirely, and approximating downhill other ways.
#17
08/06/2009 (11:39 am)
Pat,
What about ignoring the G-buffer normal and deriving surface slopes by sampling from the depth instead?
#18
08/06/2009 (12:16 pm)
Quote:Pat,
What about ignoring the G-buffer normal and deriving surface slopes by sampling from the depth instead?

GRAH.. I was going to go home and try this after reading how to extrapolate position from the depth of a G-Buffer and then report back. Good one Manoel.
#19
08/06/2009 (5:31 pm)
bleh... no luck so far
#20
08/07/2009 (1:57 am)
What do you mean it won't be in the release?

The effects (like the wetness demo that was shown) are one of the reasons I even bought this engine, so they had better be in at release (not beta).