Physically Based Rendering
by Pierre DragoFire Hay · in Torque 3D Professional · 03/05/2014 (4:39 pm) · 156 replies
I've been slowly reading up on this and wonder if anyone else has looked into this or done any work with it and T3D yet?
For those who don't know what Physically Based Rendering (or, as some call it, Physically Based Shading) is, here's a topic over at RSI covering what physically based rendering is; it covers most aspects of PBR without getting too technical.
What is PBR, also sometimes referred to as PBS or BRDF?
The most basic explanation of PBR/PBS/BRDF (physically based renderer / physically based shader / bidirectional reflectance distribution function) is that it is the bit of shader code describing how a surface reacts to light. Generally, it is responsible for calculating the specular highlights and diffuse characteristics of the surface material. These are mathematical approximations of how surfaces react to light in the real world. In computer graphics, we try to model the physical world as accurately as possible, but we are constrained by computation. For this reason, the mathematical efficiency of BRDFs is very important. Some of the better-known BRDFs - Blinn, Phong and Lambert, for instance - are well known for this reason: they are computationally inexpensive to calculate and intuitive to adjust. However, they trade accuracy for efficiency. If we concern ourselves with more expensive and more accurate models, we uncover a second layer of shading models: Oren-Nayar, Cook-Torrance, Ashikhmin-Shirley, etc.
There is no single model that fits every situation, but some are better than others. The Cook-Torrance model has been shown to be a top performer when compared against actual acquired BRDF data. Of course, with the good comes the bad, and Cook-Torrance is one of the most expensive models to compute. But for overall results it is hard to beat. So this is our target: a nice implementation of the Cook-Torrance reflectance model. This may be an issue when using the Cook-Torrance model on mobile and console hardware due to its cost, so it's a question of which model to use.
What's required?
A base PBR implementation consists of three things:
Gamma-correct rendering
- Shading inputs (textures, light colors, vertex colors, etc.) are naturally authored, previewed and (often) stored with nonlinear (gamma) encoding
- The final frame buffer also uses nonlinear encoding
- This is done for good reasons: perceptual uniformity (efficient use of bits) and legacy reasons (tools, file formats, hardware)
Support for HDR values
- Realistic rendering requires handling values much higher than display white (1.0)
- Before shading: light intensities, lightmaps, environment maps
- Shading produces highlights that affect bloom, fog, DoF, motion blur, etc.
- Cheap solutions exist
Good tone mapping (ideally filmic)
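As a rough sketch of the first and third requirements, here's the kind of conversion code involved. This is illustrative only, not Torque 3D code: the sRGB decode/encode formulas are the standard ones, and the simple Reinhard curve stands in for whatever tone map an engine actually uses. Shading math happens in linear space, between the decode and the encode.

```cpp
#include <cmath>

// sRGB decode: texture values are stored nonlinearly; shading must happen
// in linear space, so inputs are linearized first.
float srgbToLinear(float c) {
    return (c <= 0.04045f) ? c / 12.92f
                           : std::pow((c + 0.055f) / 1.055f, 2.4f);
}

// sRGB encode: convert the final linear result back for the nonlinear
// frame buffer.
float linearToSrgb(float c) {
    return (c <= 0.0031308f) ? c * 12.92f
                             : 1.055f * std::pow(c, 1.0f / 2.4f) - 0.055f;
}

// Simple Reinhard tone map: compresses HDR values (which may exceed
// display white, 1.0) into displayable range before gamma encoding.
float toneMapReinhard(float hdr) {
    return hdr / (1.0f + hdr);
}
```

Round-tripping srgbToLinear/linearToSrgb is lossless to within float precision; the point is that all lighting math happens between the two conversions.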
Difference between textures and materials.
There are a few fundamental distinctions between textures and materials. For example, you cannot apply a texture directly to a static mesh or BSP geometry. Textures have to be part of a material. The material is what you would use to texture your environment and apply to Static Meshes.
Here are the differences between a texture and a material:
A texture is a single file, a 2D static image. It is usually a diffuse, specular or normal map that you would create in Photoshop or GIMP as a TGA, TIFF, BMP or PNG file. These can be manipulated photographs, hand-painted textures, or textures baked in an application such as xNormal.

Materials are made up of various textures combined inside a material editor (in-engine or a 3rd-party editor). Materials combine textures and material expressions into a network of nodes. The final result is a material you can apply to your BSP geometry and Static Meshes. Materials are what you see rendered in-game.
It should be noted that the specular map of a material is connected to the refractive index, and as such describes a physical property. The shading model then varies this reflectance based on view angle and surface roughness. Thus surface roughness is the value adjusted to create variety, and specular should not be varied for a given material.
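To make the refractive-index connection concrete, here's a hedged sketch (function names are mine): reflectance at normal incidence, F0, falls out of the index of refraction, and the shading model then raises it toward 1.0 at grazing angles, for example with Schlick's well-known approximation.

```cpp
#include <cmath>

// Reflectance at normal incidence from the index of refraction
// (air-to-surface). For IOR = 1.5, a typical dielectric, this gives
// F0 = 0.04, i.e. the familiar ~4% reflectance.
float f0FromIOR(float ior) {
    float r = (ior - 1.0f) / (ior + 1.0f);
    return r * r;
}

// Schlick's approximation: reflectance rises toward 1.0 at grazing
// angles (cosTheta -> 0), which is the view-angle variation the
// shading model supplies on top of the fixed F0.
float fresnelSchlick(float f0, float cosTheta) {
    return f0 + (1.0f - f0) * std::pow(1.0f - cosTheta, 5.0f);
}
```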

A video on PBR for artists:
PBR in Substance by Allegorithmic
Here are a few examples of PBR in action:
tri-Ace Technical Demo Trailer 2011 "Physically-based Rendering"
Star Citizen Avenger PBR
Useful Links:
Pixar writing a Cook-Torrance surface Shader
Pixar writing a BRDF template
Pixar writing a BRDF template part2
Pixar CookTorrance Slim Template
Shading course 2012
Shading course 2013
Physically Based Lighting in Black-Ops
Shader code for physically based lighting
Basic Theory of Physically-Based Rendering
Paprika Render
Mitsuba Render
PBRT Org.
Unreal PBR
Houdini Shaders for physically based rendering (PBR)
D3DBook:(Lighting) Cook-Torrance
Specular BRDF Reference by Brian Karis from Epic Games
Books of Interest:
Physically Based Rendering, From Theory to Implementation
ShaderX7
The RenderMan Shading Language Guide
Edit: Updated with new information and links.
#122
06/17/2014 (11:06 am)
Azaezel, I'm assuming you are referring to the dynamic cubemap?
Also, do you think Lukas' SSGI shader may help in the lighting department?
www.garagegames.com/community/forums/viewthread/136081
It works but isn't complete; at least there would be real-time GI without rewriting it. Just an idea...
#123
06/17/2014 (11:22 am)
@Kory: sort of. It's a head-fake any way you slice it, but that particular subsystem does do up scene based cubemaps on a schedule. For IBL, that may translate into a set of prefab bakes for most stuff, with a side-order of actual dynamics for extremely reflective surfaces depending on what kind of hit both have. (Might not be that bad a hit given water, but not going to go and say it'd all be fine using dynamic over everything without a side by side comparison.)
#124
06/17/2014 (12:31 pm)
Not sure if this is relevant to the answer you're looking for in regards to lighting, but the Godot engine has recently added a GI solution in its latest builds. It says it bakes light information to a compacted octree that can then be read from for updates. Just curious whether what they did with their solution could somehow be adapted to the current problem. The source is MIT-licensed, so it couldn't hurt to look it over to see if it's useful. Here's the link to the post:
http://www.godotengine.org/forum/viewtopic.php?f=7&t=665
#125
07/09/2014 (2:02 pm)
Just came across an interesting page relating to shaders for PBR: Shaders for physically based rendering (PBR)
#127
07/09/2014 (4:44 pm)
Should note, having talked with the guys at Marmoset a bit, we're likely to go with the metalness approach there. It's basically the practice of packing your diffuse and specular into the same file, with a greyscale map (typically more like a binary on/off) that determines to what degree the texture sheet is used in the diffuse or specular end of the equation.
#128
07/09/2014 (4:50 pm)
Those of you who happened to purchase Houdini through GarageGames back when it was released may not know that version 10.0 of Houdini includes PBR functions. I believe the only issue was with importing files into Torque 3D; some of the files weren't accepted for some reason. I'll see if I can locate the reasons. I'm still looking into how to add PBR code to Torque 3D.
#129
07/09/2014 (5:15 pm)
To get the most benefit from physically-based shaders, the basic game engine requirements are:
Gamma-correct rendering
- Shading inputs (textures, light colors, vertex colors, etc.) naturally authored, previewed and (often) stored with nonlinear (gamma) encoding
- Final frame buffer also uses nonlinear encoding
- This is done for good reasons:
  1. Perceptually uniform(ish) = efficient use of bits
  2. Legacy reasons (tools, file formats, hardware)
Support for HDR values
- Realistic rendering requires handling values much higher than display white (1.0)
- Before shading: light intensities, lightmaps, environment maps
- Shading produces highlights that affect bloom, fog, DoF, motion blur, etc.
- Cheap solutions exist
Good tone mapping (ideally filmic)
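For the "good tone mapping (ideally filmic)" item, one widely used filmic curve is John Hable's "Uncharted 2" operator; the constants below are his published fit, not values from this thread, and the white point of 11.2 is likewise his choice.

```cpp
#include <cmath>

// Hable's filmic curve: a shoulder that rolls highlights off gently and a
// toe that crushes blacks film-style. Constants are Hable's published fit.
float hableCurve(float x) {
    const float A = 0.15f, B = 0.50f, C = 0.10f,
                D = 0.20f, E = 0.02f, F = 0.30f;
    return ((x * (A * x + C * B) + D * E) / (x * (A * x + B) + D * F)) - E / F;
}

// Normalize so the chosen white point maps exactly to display white (1.0).
float filmicToneMap(float hdr) {
    const float whitePoint = 11.2f;  // scene value mapped to display white
    return hableCurve(hdr) / hableCurve(whitePoint);
}
```

Unlike plain Reinhard, this curve keeps mid-tone contrast while still compressing the HDR highlights the requirements list calls out.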
#131
07/09/2014 (5:21 pm)
Just happened to come across shader code for physically based lighting. I'll have a good read through it later, but from a quick skim it may be what we've been looking for to get the ball rolling. Interested in what everyone thinks!
#132
07/09/2014 (5:35 pm)
@Pierre: finishing up www.garagegames.com/community/forums/viewthread/137917 first. Need the color data held on to for the calcs.
#133

07/09/2014 (5:42 pm)
Another one to add to the shader code for physically based lighting is D3DBook (Lighting) Cook-Torrance; both pages have code and detailed information.
The image above shows the Beckmann distribution (top) and the Gaussian distribution (bottom) with roughness values of 0.2 (left), 0.4, 0.6, 0.8 and 1.0 (right). While the Gaussian form is noticeably different from the Beckmann form, it is worth noting that there is an arbitrary constant controlling the Gaussian distribution. Experimenting with this coefficient can produce closer or more visually acceptable results.

Example of roughness:

The two key inputs into the model are reflectance at normal incidence and roughness. The image above serves to show their relationship and the types of results that can be achieved.
#134
07/09/2014 (5:56 pm)
@Azaezel Nice work.
This might help: Specular BRDF Reference by Brian Karis from Epic Games. While working on a new shading model for UE4, he tried a few different options for the specular BRDF; specifically, different terms for the Cook-Torrance microfacet specular BRDF.
Have a read.
#135
07/10/2014 (10:04 pm)
I've been advised by a couple of PBR developers to look at these articles:
- LEADR mapping, an extension of the Beckmann reflection model with a correct shadowing/masking function. They also provide a sketch for a physically based engine using envmaps: info here
- The ABC and SGD models. This is not so much for the models themselves (they think these are not proper BRDF models, since they are not derived from first principles), but because they show that measured data mostly does not follow the Beckmann model (the facet distributions, if they describe something real, are not Gaussian): here and here (one is behind a paywall).
- Real-time shading with filtered importance sampling is a way to use ray-shooting-style light simulation in shader-based renderers. It might be slow compared to what engines like UE have under the hood: here
Here's what they said after reading what we've talked about so far:
Quote: Thank you for your link; it is interesting to read about the development of PBR within a game engine. From what I understand, you are currently trying to gauge what route you should take. PBR is the combination of three things (in my own understanding): physically based materials (derived from first principles), physically based lighting, and a light transport algorithm. Miss one of them and your rendering engine is not a PBR one. All this is very simple for ray tracing nerds like me, since light simulation (up to the geometrical optics simplification) can be achieved by shooting rays. For real-time rendering using the GPU pipeline, it seems way more complicated (to me). A physically based engine is no longer easy to implement, since shooting rays is not possible (at least not in the amount you would need).
#136
07/10/2014 (11:09 pm)
They advised sticking with the Blinn-Phong variant, given that it and Cook-Torrance give yet more variation in outcome for folks to account for. For reflectance/IBL, I did a small write-up in the linked thread, but to expand a little: at present, every material defined has a cubemap entry which eventually feeds itself to a shaderConst (what's passed along to the pre-defined or ShaderGen-created HLSL). For the unfamiliar, this consists of 6 maps, typically conceived of as North, East, South, West, Up, Down, wrapped around an object and blended in some form with the diffuse properties. This is what I was referring to as an old head-fake earlier.
As an example, a colorized one: http://i.imgur.com/0k9ycNj.jpg
For phase 2 (the PBR work), we're thinking at a minimum we'll go with an object that holds that value for nearby assets to reference, so that they can adapt to their placement in a level in that regard. Depending on GPU load, this may or may not be dynamically updated constantly, but it will derive from the same functionality that the present dynamic cubemapping has with regard to taking a snapshot of a scene, even if we only take that snapshot and cache it for re-use on level load. The one consultant who swung by IRC also mentioned something along the lines of adapting the MIP maps for quicker referencing of differing levels of diffusion, if I recall correctly. Something to look into once further actual work has progressed.
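The MIP-map idea the consultant mentioned can be sketched as follows: a prefiltered environment cubemap stores increasingly blurred copies of the scene in its mip chain, and surface roughness selects the mip. The linear mapping here is an assumption (engines often use a perceptual remap), and the function name is mine:

```cpp
#include <algorithm>

// Mip 0 holds the sharpest (mirror) reflection; the last mip the most
// diffuse. Roughness picks a (fractional) level so the sampler can
// trilinearly blend between two pre-blurred copies.
float mipFromRoughness(float roughness, int mipCount) {
    float r = std::clamp(roughness, 0.0f, 1.0f);
    return r * static_cast<float>(mipCount - 1);
}
```

The payoff is exactly the "quicker referencing" described: differing levels of diffusion come for free from hardware mip sampling instead of per-pixel filtering.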
#137
07/11/2014 (4:56 am)
@Azaezel From what I understand from ShaderX7, Cook-Torrance is the better method; Blinn-Phong has been around a while and does a good job of emulating it.
I'll post tomorrow what I mean.
#138
07/11/2014 (4:31 pm)
Review of Blinn-Phong: One of the earliest and most ubiquitous shading models is due to Bui Tuong Phong, together with its modification by Jim Blinn ["Illumination for Computer Generated Pictures", 1975; "Models of Light Reflection for Computer Synthesized Pictures", 1977]. It is an empirical model (i.e., "made up") for a point on a surface lit by a number of discrete point light sources. The spectral intensity of this point, as seen by a viewer, is decomposed into four components; in order of increasing computational cost, these are the ambient, diffuse, and specular components, plus an emissive component (ignored here).
They are characterized as follows:
- The ambient component models a uniform field as a crude approximation for the combined effect of all indirect light. Ambient illumination is assumed to have equal intensity from all directions.
- The diffuse component assumes a Lambertian response to direct illumination from a discrete light source. This response is simply proportional to the dot product between surface normal and light direction.
- The specular component models the effect of mirror images (highlights) of discrete light sources via a simple formula. The fuzziness of these mirror images can be adjusted by an additional parameter that controls the appearance of surface roughness. At this point, the models differ in their approach: while the Phong model generates perfect reflections of fuzzed light sources, the Blinn model generates fuzzy reflections of perfect point lights. Both models are idealizations, but in practice the Blinn model more often matches observation.
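The components above can be sketched in a few lines. This is a scalar-intensity illustration with the ambient term omitted and names of my choosing, not engine code; the specular term uses Blinn's half vector H = normalize(L + V):

```cpp
#include <array>
#include <cmath>
#include <algorithm>

using Vec3 = std::array<float, 3>;

static float dot3(const Vec3& a, const Vec3& b) {
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2];
}

static Vec3 normalize3(const Vec3& v) {
    float len = std::sqrt(dot3(v, v));
    return { v[0]/len, v[1]/len, v[2]/len };
}

// Blinn-Phong as characterized above: a Lambertian diffuse term
// (proportional to N.L) plus a specular term built on the half vector.
// 'shininess' is the roughness-controlling exponent.
float blinnPhong(const Vec3& n, const Vec3& l, const Vec3& v,
                 float kd, float ks, float shininess) {
    float ndotl = std::max(dot3(n, l), 0.0f);
    Vec3 h = normalize3({ l[0]+v[0], l[1]+v[1], l[2]+v[2] });
    float ndoth = std::max(dot3(n, h), 0.0f);
    return kd * ndotl + ks * std::pow(ndoth, shininess);
}
```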
#139
07/11/2014 (4:59 pm)
Review of Cook-Torrance: The Cook-Torrance shading model was developed rigorously from the theory of microfacets ["A Reflectance Model for Computer Graphics", 1981]. This model puts the specular component on a physical basis, while the diffuse and ambient components are the same as in Blinn-Phong. The full Cook-Torrance model is expensive to compute, but it makes the important contribution of separating the specular component into three factors: a distribution factor accounting for surface roughness, a geometry factor accounting for self-shadowing and occlusion, and the Fresnel term. This separation allows searching for approximations to each of these factors independently.
The following code from D3DBook (Lighting: Cook-Torrance) can be used to create the look-up texture:
HRESULT CreateRoughnessLookupTexture( ID3D10Device* pDevice )
{
    HRESULT hr = S_OK;

    // The dimension value becomes a trade-off between
    // quality and storage requirements
    const UINT LOOKUP_DIMENSION = 512;

    // Describe the texture
    D3D10_TEXTURE2D_DESC texDesc;
    texDesc.ArraySize = 1;
    texDesc.BindFlags = D3D10_BIND_SHADER_RESOURCE;
    texDesc.CPUAccessFlags = 0;
    texDesc.Format = DXGI_FORMAT_R32_FLOAT;
    texDesc.Height = LOOKUP_DIMENSION;
    texDesc.Width = LOOKUP_DIMENSION;
    texDesc.MipLevels = 1;
    texDesc.MiscFlags = 0;
    texDesc.SampleDesc.Count = 1;
    texDesc.SampleDesc.Quality = 0;
    texDesc.Usage = D3D10_USAGE_IMMUTABLE;

    // Generate the initial data
    float* fLookup = new float[ LOOKUP_DIMENSION * LOOKUP_DIMENSION ];
    for( UINT x = 0; x < LOOKUP_DIMENSION; ++x )
    {
        for( UINT y = 0; y < LOOKUP_DIMENSION; ++y )
        {
            // The following fragment is a direct conversion of
            // the code that appears in the HLSL shader
            float NdotH = static_cast< float >( x )
                        / static_cast< float >( LOOKUP_DIMENSION );
            float Roughness = static_cast< float >( y )
                            / static_cast< float >( LOOKUP_DIMENSION );

            // Convert the 0.0..1.0 range to be -1.0..+1.0
            NdotH *= 2.0f;
            NdotH -= 1.0f;

            // Evaluate a Beckmann distribution for this element
            // of the look-up table
            float r_sq = Roughness * Roughness;
            float r_a = 1.0f / ( 4.0f * r_sq * pow( NdotH, 4 ) );
            float r_b = NdotH * NdotH - 1.0f;
            float r_c = r_sq * NdotH * NdotH;

            fLookup[ x + y * LOOKUP_DIMENSION ] = r_a * expf( r_b / r_c );
        }
    }

    D3D10_SUBRESOURCE_DATA initialData;
    initialData.pSysMem = fLookup;
    initialData.SysMemPitch = sizeof( float ) * LOOKUP_DIMENSION;
    initialData.SysMemSlicePitch = 0;

    // Create the actual texture
    hr = pDevice->CreateTexture2D( &texDesc, &initialData, &g_pRoughnessLookUpTex );
    if( FAILED( hr ) )
    {
        ERR_OUT( L"Failed to create look-up texture" );
        SAFE_DELETE_ARRAY( fLookup );
        return hr;
    }

    // Create a view onto the texture
    ID3D10ShaderResourceView* pLookupRV = NULL;
    hr = pDevice->CreateShaderResourceView( g_pRoughnessLookUpTex, NULL, &pLookupRV );
    if( FAILED( hr ) )
    {
        SAFE_RELEASE( pLookupRV );
        SAFE_RELEASE( g_pRoughnessLookUpTex );
        SAFE_DELETE_ARRAY( fLookup );
        return hr;
    }

    // Bind it to the effect variable
    ID3D10EffectShaderResourceVariable* pFXVar
        = g_pEffect->GetVariableByName( "texRoughness" )->AsShaderResource();
    if( !pFXVar->IsValid() )
    {
        SAFE_RELEASE( pLookupRV );
        SAFE_RELEASE( g_pRoughnessLookUpTex );
        SAFE_DELETE_ARRAY( fLookup );
        return hr;
    }
    pFXVar->SetResource( pLookupRV );

    // Clean up any intermediary resources
    SAFE_RELEASE( pLookupRV );
    SAFE_DELETE_ARRAY( fLookup );
    return hr;
}
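To make the three-factor separation concrete, here's a rough CPU-side sketch combining a Beckmann D (the same formula the lookup texture above stores), the min-based geometric attenuation, and Schlick's approximation standing in for the full Fresnel term (a common substitution, my choice here, not from the book's code):

```cpp
#include <cmath>
#include <algorithm>

// D: Beckmann distribution, identical to what the lookup texture stores:
// exp((NdotH^2 - 1) / (m^2 * NdotH^2)) / (4 * m^2 * NdotH^4)
float beckmannD(float ndoth, float m) {
    float m2 = m * m, c2 = ndoth * ndoth;
    return std::exp((c2 - 1.0f) / (m2 * c2)) / (4.0f * m2 * c2 * c2);
}

// G: Cook-Torrance geometric attenuation (self-shadowing/masking).
float geomAttenuation(float ndoth, float ndotv, float ndotl, float vdoth) {
    float g = 2.0f * ndoth / vdoth;
    return std::min(1.0f, std::min(g * ndotv, g * ndotl));
}

// F: Schlick's approximation as a stand-in for the full Fresnel term.
float fresnelSchlick(float f0, float vdoth) {
    return f0 + (1.0f - f0) * std::pow(1.0f - vdoth, 5.0f);
}

// Specular term: D * G * F / (4 * N.V * N.L). Note the normalization
// varies between references; the Pixar shader later in this thread
// divides by pi * N.V instead.
float cookTorranceSpecular(float ndoth, float ndotv, float ndotl,
                           float vdoth, float m, float f0) {
    float d = beckmannD(ndoth, m);
    float g = geomAttenuation(ndoth, ndotv, ndotl, vdoth);
    float f = fresnelSchlick(f0, vdoth);
    return (d * g * f) / (4.0f * ndotv * ndotl);
}
```

Because the factors are separate functions, each can be swapped for a cheaper approximation independently, which is exactly the contribution of the model described above.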
#140
07/11/2014 (5:16 pm)
Another good page on Cook-Torrance is at Pixar. This code is part of the Cook-Torrance shader from Pixar:
// Copyright (c) 2007 PIXAR. All rights reserved. This program or
// documentation contains proprietary confidential information and trade
// secrets of PIXAR. Reverse engineering of object code is prohibited.
// Use of copyright notice is precautionary and does not imply
// publication.
//
// RESTRICTED RIGHTS NOTICE
//
// Use, duplication, or disclosure by the Government is subject to the
// following restrictions: For civilian agencies, subparagraphs (a) through
// (d) of the Commercial Computer Software--Restricted Rights clause at
// 52.227-19 of the FAR; and, for units of the Department of Defense, DoD
// Supplement to the FAR, clause 52.227-7013 (c)(1)(ii), Rights in
// Technical Data and Computer Software.
//
// Pixar Animation Studios
// 1200 Park Avenue
// Emeryville, CA 94608
//
//------------------------------------------------------------------------------------------------------//
//------------------------------------------------------------------------------------------------------//
// SCRIPT: CookTorrance.sl
// AUTHOR: Scott Eaton
// DATE: July 3, 2007
//
// DESCRIPTION: A simple implementation of the Cook-Torrance
// shading model describe in:
// A Reflectance Model for Computer Graphics
// R. L. Cook, K. E. Torrance, ACM Transactions on Graphics 1982
//
//------------------------------------------------------------------------------------------------------//
//------------------------------------------------------------------------------------------------------//
surface CookTorrance(
    float Ka = 1;
    float Ks = .8;
    float Kd = .8;
    float IOR = 1.3;
    float roughness = .2;
    color opacity = 1;
    color specularColor = 1;
    color diffuseColor = (.6, .6, .6);
    float gaussConstant = 100; )
{
    // the things we need:
    // normalized normal and vector to eye
    normal Nn = normalize(N);
    vector Vn = normalize(-I);

    float F, Ktransmit;
    float m = roughness;
    fresnel( normalize(I), Nn, 1/IOR, F, Ktransmit );

    color cook = 0;
    float NdotV = Nn.Vn;

    illuminance( P, Nn, PI/2 )
    {
        // half-angle vector
        vector Ln = normalize(L);
        vector H = normalize(Vn+Ln);
        float NdotH = Nn.H;
        float NdotL = Nn.Ln;
        float VdotH = Vn.H;
        float D;
        float alpha = acos(NdotH);

        // microfacet distribution
        D = gaussConstant*exp(-(alpha*alpha)/(m*m));

        // geometric attenuation factor
        float G = min(1, min((2*NdotH*NdotV/VdotH), (2*NdotH*NdotL/VdotH)));

        // sum contributions
        cook += Cl*(F*D*G)/(PI*NdotV);
    }
    cook = cook/PI;

    Oi = opacity;
    Ci = (Kd*diffuseColor*diffuse(Nn)+Ks*specularColor*cook) * Oi;
}
Azaezel
Good breakdown. First focus has been extending the shader system via additional options and extending the prepass with additional data accessibility.
We may end up doing a first-rev proof of concept embedding an IBL in a sun for level consistency (again, as a first revision to prove out getting additional data from a set object), but if someone wishes to look into http://www.garagegames.com/community/blogs/view/21569/1#comment-183443 with an eye towards decoupling the requirements from ShapeBase and its derivatives, that'd likely speed things along on that score.
Knocked those out along the way, or at least took a general poke at it. Realtime GI may take fresh code.
We'll have a more thorough report soon(tm).