“Next Gen” Effects ?
CryEngine 2: Shading Overview
• Supports shader models 2.0 up to 4.0
• Completely dynamic lighting and shadows
• Up to 4 point light sources per-pass
• Wide range of known and DIY shading models
• Some other fancy features
• A mix of deferred and multi-pass rendering approaches
• Average of 2K drawcalls per frame (~2M tris)
Water and Underwater Rendering
• Intro water rendering video
Water and Underwater Rendering
• Rendering believable looking water
• Underwater light-scattering [1]
• Water surface animation & tessellation
• Reflections/Refraction
• Shore/Foam
• Caustics and God-rays
• Camera and objects interaction with water
• Particles
• How to do all of this efficiently in a very complex and open-ended world like Crysis?
No more flat water !
• 3D waves
• Used statistical Tessendorf animation model [2]
• Computed on the CPU for a 64x64 grid
• Results uploaded into an FP32 texture
• Vertex displacement on the GPU (sketched below)
• Lower HW specs use a sum of sine waves instead
• Additionally, 4 moving normal-map layers
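A minimal sketch of the displacement step described above (not the shipped shader; all names and constants are illustrative). The DX10 path fetches the CPU-generated FP32 displacement map, the low-spec path falls back to a small sum of sines:

// Hedged sketch: displace a water-grid vertex
float4 DisplaceWaterVertex( float4 vPos, sampler2D texDisplacement,
                            float2 vGridUV, bool bLowSpec, float fTime )
{
  // DX10 path: one bilinear fetch from the 64x64 FP32 displacement map
  float3 vDispl = tex2Dlod( texDisplacement, float4( vGridUV, 0, 0 ) ).xyz;

  if( bLowSpec )
  {
    // Low-spec fallback: cheap sum of sine waves instead of the FFT result
    vDispl = float3( 0, 0, 0.15 * sin( vPos.x * 0.5 + fTime )
                         + 0.10 * sin( vPos.y * 0.9 + fTime * 1.3 ) );
  }

  vPos.xyz += vDispl;
  return vPos;
}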
Surface Tessellation
• Screen-space grid projection
• Extreme detail nearby
• Problems:
• Screen edges
• Aliasing at distance
• Dropped this approach in the end
Surface Tessellation
• Camera-aligned grid
• Keeps detail nearby and in front of the camera
• Problems:
• Camera roll
• Balancing nearby vs. far-away detail
• Kept this approach in the end
Surface from a top perspective
Physics Interaction
• CPU animation is shared with physics/game code
• For low-spec machines, the same “math” as in the vertex shader is done on the CPU
• Physics samples the water plane that best fits each object
Reflection
Per frame, we had an avg of 2K drawcalls (~ 2M tris)
• This really needed to be cheap – and look good
• Problem: Reflections added about 1500 drawcalls
• Draw call minimization
• Final average of 300 drawcalls for reflections
• Total internal reflection also simulated
• Half screen resolution RT
Reflection Update Trickery
Update Dependencies
• Time
• Camera orientation/position difference from
previous camera orientation/position
• Surface visibility ratio using HW occlusion queries
• Multi-GPU systems need extra care to avoid an out-of-sync reflection texture
Anisotropic Reflection
• Blur the final reflection texture vertically (sketched below)
• Also helps minimize reflection aliasing
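A minimal sketch of the vertical blur over the half-resolution reflection RT (the tap count and weights are illustrative, not the shipped kernel):

half4 BlurReflectionVertically( sampler2D texReflection, float2 tc, float fTexelHeight )
{
  // 5-tap vertical kernel; stretches highlights vertically for an anisotropic look
  const float fOffsets[5] = { -2, -1, 0, 1, 2 };
  const float fWeights[5] = { 0.1, 0.2, 0.4, 0.2, 0.1 };   // sums to 1
  half4 cSum = 0;
  [unroll]
  for( int i = 0; i < 5; i++ )
    cSum += tex2D( texReflection, tc + float2( 0, fOffsets[i] * fTexelHeight ) ) * fWeights[i];
  return cSum;
}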
Refraction
• No need to render scene again [3]
• Use current back-buffer as input texture
• Mask out everything above water surface
• Water depth > World depth = leaking
• Don’t allow refraction texture offset for this case
• Chromatic dispersion approximation for a more interesting look
• Offsets scaled differently for R, G and B (see the sketch below)
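A hedged sketch of the refraction lookup described above (names and dispersion constants are illustrative): offset into the back-buffer, fall back to the unperturbed coordinate where the offset would sample geometry in front of the water, and scale the offset per channel:

half3 RefractBackBuffer( sampler2D texBackBuffer, sampler2D texDepth,
                         float2 tc, float2 vBumpOffset, float fWaterDepth )
{
  const float3 vDispersion = float3( 1.0, 1.02, 1.04 );  // per-channel offset scale
  half3 cRefr = 0;
  [unroll]
  for( int i = 0; i < 3; i++ )
  {
    float2 tcRefr = tc + vBumpOffset * vDispersion[i];
    // World depth closer than the water depth = object above the surface:
    // don't allow the refraction offset in that case (avoids leaking)
    if( tex2D( texDepth, tcRefr ).x < fWaterDepth )
      tcRefr = tc;
    half4 cSample = tex2D( texBackBuffer, tcRefr );
    cRefr[i] = cSample[i];
  }
  return cRefr;
}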
Refraction Masking
Chromatic Dispersion
Procedural Caustics
• Extra rendering pass, can handle
opaque/transparent
• Based on real sun direction projection
• Procedurally composed using 3 layers (a rough sketch follows below)
• Chromatic dispersion approximation
• Darken slightly to simulate wet surface
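A rough sketch of the caustics composite described above; the layer count matches the slide, but the projection, texture, scales and dispersion offsets are illustrative assumptions:

half3 ProceduralCausticsSketch( sampler2D texCaustics, float3 vWorldPos,
                                float3 vSunDir, float fTime )
{
  // Project the world position along the (real) sun direction onto a plane
  float2 tcProj = vWorldPos.xy + vSunDir.xy * vWorldPos.z;

  const float2 vDisp = float2( 0.002, 0.004 );   // chromatic dispersion offsets
  half3 cCaustics = 0;
  [unroll]
  for( int i = 0; i < 3; i++ )   // 3 layers at different scales/speeds
  {
    float2 tc = tcProj * ( 0.25 * ( i + 1 ) ) + fTime * 0.02 * ( i + 1 );
    cCaustics.r += tex2D( texCaustics, tc ).x;
    cCaustics.g += tex2D( texCaustics, tc + vDisp.xx ).x;
    cCaustics.b += tex2D( texCaustics, tc + vDisp.xy ).x;
  }
  return cCaustics / 3;   // darken slightly afterwards to simulate a wet surface
}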
Procedural Caustics
Shore and Foam
• Soft water intersection
• Shore blend based on the distance between surface depth and world depth
• Wave foam blend based on the distance between the current wave height and the maximum height (see the sketch below)
• Foam is composed of 2 moving layers, with offsets perturbed by the water bump map
• Acceptable quality
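A minimal sketch of the two blend factors described above (the scale constants are illustrative):

half2 ShoreFoamBlend( float fWorldDepth, float fSurfaceDepth,
                      float fWaveHeight, float fMaxWaveHeight )
{
  // Soft intersection: strongest right at the water line, fading with distance
  half fShore = 1 - saturate( ( fWorldDepth - fSurfaceDepth ) * 4.0 );
  // Foam: strongest where the animated height approaches the maximum height
  half fFoam = saturate( 1 - ( fMaxWaveHeight - fWaveHeight ) * 2.0 );
  return half2( fShore, fFoam );   // x = shore blend, y = foam blend
}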
Shore and Foam
Underwater God-Rays [4]
• Essentially the same procedural caustics shader
• Based on real sun direction projection
• Projected into multiple planes in front of camera
• Rendered into an RT 4x smaller than the screen
• Finally added to the frame-buffer
Light Scattering + Caustics + God-Rays
(exaggerated for picture)
Underwater God-Rays
Camera/Particles Interaction
• How to handle the case where the camera intersects an animated water surface?
• Water droplet effect when coming out of the water
• A subtle distortion used when inside the water
• Water particles similar to soft-particles
Things for the Future
• Rolling waves didn’t make it into the final game
• Special animated water decal geometry
• Water splashes
• Surface interaction with shoreline
• Dynamic surface interaction
• Maybe in a future project? :)
Frozen Surfaces
• Intro frozen surfaces video
Frozen Surfaces
• Huge Headache
• Couldn’t find previous research on the subject
• Unique Alien Frozen World: How to make it ?
• Should it look realistic ?
• Or an “artistic” flash frozen world ?
• Make everything Frozen ?
• Dynamically ?
• Custom frozen assets ?
• Reuse assets ?
• Took us 4 iterations to reach the final result
Lessons learned
Final iteration
• Main focus was to make it visually interesting
• Used real world images as reference this time
• Minimize the amount of artist work as much as possible
• Impossible to make every single kind of object look realistically frozen (and good) with a single unified approach
• 1 week before hitting feature lock/alpha (gulp)
Putting it all together
• Accumulated snow on top
• Blend in snow depending on WS/OS normal z
• Frozen water droplets accumulated on side
• 2 layers using different UVs and bump offset scales to give the impression of volume
• 3D Perlin noise used for blending variation
• 3 octaves and offset warping to avoid repetitive patterns
• Glittering (sketched below)
• Uses a 2D texture with random noise vectors
• pow( saturate( dot( noise, viewVector ) ), big value )
• If the result > threshold, multiply by a big value for HDR to kick in
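A hedged sketch of the glitter term; the exponent, threshold and HDR boost below are illustrative stand-ins for the slide's "big value":

half Glitter( sampler2D texNoiseVec, float2 tc, float3 vView )
{
  // Random noise vectors stored in a 2D texture, remapped to [-1, 1]
  float3 vNoise = tex2D( texNoiseVec, tc ).xyz * 2 - 1;
  half fGlint = pow( saturate( dot( vNoise, normalize( vView ) ) ), 64.0 );
  if( fGlint > 0.5 )        // only the sharpest hits glitter
    fGlint *= 100.0;        // boost into HDR range so bloom kicks in
  return fGlint;
}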
Procedural Frozen
• Reused assets
• Dynamic freezing possible, with a nice transition
• Didn’t give artists more control than required
• Artists just press a button to enable frozen
• Relatively cheap rendering cost
• Visually interesting results
• Only looks believable under good lighting
conditions
Post-Effects
• Intro post-effects video
Post-Effects Overview
Post Effects Mantra:
• Final rendering “make-up”
• Minimal aliasing (for very high specs)
• Never sacrifice quality for speed
• Unless you’re doing really crazy expensive stuff!
• Make it as subtle as possible
• But not too subtle, or else the average gamer will not notice it
Camera Motion Blur (CMB)
(LDR vs. HDR comparison images: the LDR version loses the bright streaks and washes out details)
Screen-space velocities
• Render a sphere around camera
• Use previous/current camera transformation to
compute delta vector
• Lerp between previous/current transformation by a
shutter speed ratio ( n / frame delta ), to get correct
previous camera matrix
• From previous/current positions compute velocity vector
• We could already accumulate N samples along the velocity direction
• But that would give constant blurring everywhere
Velocity Mask
• Used depth to generate velocity mask
• We let camera rotation override this mask
• Depth is used to mask out nearby geometry
• If current pixel depth < nearby threshold write 0
• Value used for blending out blur from first person
arms/weapons
• The velocity mask is later used to scale the motion-blur velocity offsets
• The blur amount now scales with distance (see the mask sketch below)
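A small sketch of the depth-based mask described above (thresholds illustrative); nearby pixels such as the first-person arms get 0, distant pixels approach 1, and camera rotation can override the mask:

half CameraVelocityMask( float fDepth, float fNearThreshold, half fRotationAmount )
{
  // Mask out nearby geometry, let the blur amount grow with distance
  half fMask = ( fDepth < fNearThreshold ) ? 0 : saturate( fDepth * 0.05 );
  // Camera rotation overrides the mask so rotation blurs everything
  return max( fMask, saturate( fRotationAmount ) );
}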
CMB Vertex Shader Sample
// Sphere rendered around the camera: move the vertex to its world-space position
vPos.xyz += vWorldViewPos.xyz;
// Transform with the current and the previous view-projection matrices
float4 vNewPos = mul( mViewProj, vPos );
float4 vPrevPos = mul( mViewProjPrev, vPos );
OUT.HPosition = vNewPos;
// Output both sets of projective screen-space texture coordinates
OUT.vCurr = HPosToScreenTC( vNewPos );
OUT.vPrev = HPosToScreenTC( vPrevPos );
CMB Pixel Shader Sample
half4 cMidCurr = tex2Dproj( screenMap, IN.vCurr );
half fDepth = tex2Dproj( depthMap, IN.vCurr ).x * NearFarClipDist.y;
float2 vStart = IN.vCurr.xy / IN.vCurr.w;
float2 vPrev = ( IN.vPrev.xy / IN.vPrev.w ) * fScale;
float2 vCurr = vStart * fScale;
float2 vStep = vPrev - vCurr;
float4 accum = 0;
[unroll]
for( float s = -1.0; s < 1.0; s += fWeightStep )
{
  float2 tcFinal = vCurr.xy - vStep.xy * s;
  // Apply depth scaling/masking (velocity mask stored in the alpha channel)
  half fDepthMask = tex2D( screenMap, tcFinal ).w;
  tcFinal += vStep.xy * ( s - s * fDepthMask );
  accum += tex2D( screenMap, tcFinal );
}
accum *= fWeight;
// Remove remaining scene bleeding from 1st person hands
Improving quality
• Iterative sampling approach
• First pass uses 8 samples
• Ping-pong results
• The second pass uses the blurred results, which gives 8 x 8 samples (virtually 64)
• 3rd pass = 512 samples, 4th = 4096, etc.
• High quality at relatively low cost
Iterative quality improvement
Optimization strategies
• If there is no movement, skip camera motion blur entirely
• Compare previous camera transformation with current
• Estimate required sample count based on camera
translation/orientation velocity
• If sample count below certain threshold, skip
• Adjust pass/sample count accordingly
• This gave a nice performance boost
• Average case at 1280x1024 runs at ~ 1 ms on a G80
Object Motion Blur (OMB)
(LDR vs. HDR comparison images: in LDR the bright streaks are gone and details washed out; HDR keeps the bright streaks and sharper details)
DX9 HW Skinning limitation
• 256 vertex constant registers limit
• Our characters have an average of 70 bones per
drawcall
• Each bone uses 2 registers = 140 registers
• For motion blur we need previous frame bones
transformations
• 2 x 140 = 280 registers, bummer..
• Decided on a DX10-only solution
Step 1: Velocity pass
• Render screen-space velocity, surface depth and instance ID into an FP16 RT
• If no movement, skip rigid/skinned geometry
• Compare previous transformation matrix with current
• Compare previous bone transformations with current
• If the per-pixel velocity is below a certain threshold, write 0 to the RT
• Can use this data for optimizing further
Why is dilation needed?
Step 2: Velocity Mask
• Used an 8x smaller render target
• Apply a Gaussian blur to the velocity length
• The result is a reasonably fast estimate of the screen-space area needed for dilation and motion blurring
• The mask is used to efficiently skip pixels during the dilation passes
Step 3: Velocity Dilation
• Edge dilation
• Done using separate vertical and horizontal offsets
• 4 passes total (2 horizontal, 2 vertical)
• If the center sample already has velocity, or the velocity mask is 0, skip processing entirely
• Dilate if world depth > surface depth
Dilation shader sample
float4 vCenterVZID = tex2D( tex0, tc.xy );               // center velocity/depth/ID
float fCenterDepth = GetResampledDepth( tex1, tc.xy );
float fOffsetScale = tex2D( tex2, tc.xy ).x;             // velocity mask
// If the velocity mask is 0, or the center already has velocity, skip dilation
if( fOffsetScale == 0 || dot( vCenterVZID.xy, vCenterVZID.xy ) )
{
  OUT.Color = vCenterVZID;
  return OUT;
}
float4 Dilated = 0;   // holds the dilated result (declaration added for completeness)
[unroll]
for( int n = 0; n < nOffsets; n++ )
{
  float2 tcLookup = tc.xy + vOffsets[n].xy * vScrSizeRecip;
  float4 vCurrVZID = tex2Dlod( tex0, float4( tcLookup, 0, 0 ) );
  // Dilate only if the neighbor is moving, lies in front of the center
  // (world depth > surface depth) and we haven't already picked a value
  float fDepthCmp = saturate( fCenterDepth - vCurrVZID.z );
  fDepthCmp *= dot( vCurrVZID.xy, vCurrVZID.xy );
  fDepthCmp *= Dilated.z == 0;
  if( fDepthCmp )
    Dilated = vCurrVZID;
}
OUT.Color = Dilated;  // output the dilated velocity/depth/ID
Velocity Dilation
(comparison images: original, after Step 1, after Step 2)
Final Step: Motion Blurring
• Accumulate N samples along velocity direction
• Can use surface ID to avoid leaking
• Extra lookup..
• Clamp velocity to acceptable range
• Very important to avoid ugly results, especially with jerky
animations
• Divide the accumulated result by the sample count and lerp between blurred/non-blurred
• Using the alpha channel as a blend mask
OMB Pixel Shader Sample
float4 cScreen = tex2Dlod( tex0, float4( tc.xy, 0, 0 ) );    // scene color
float4 cVelocity = tex2Dlod( tex1, float4( tc.xy, 0, 0 ) );  // dilated velocity
OUT.Color = cScreen;
// Early out if the velocity is below the blur threshold
if( dot( cVelocity.xy, cVelocity.xy ) < fThreshold )
  return OUT;
float4 cAccum = 0;
float fLen = length( cVelocity.xy );
if( fLen ) cVelocity.xy /= fLen;
cVelocity.xy *= min( fLen, vMaxRange ); // clamp velocity to an acceptable range
[unroll]
for( float i = 0; i < nSamples; i++ )
{
  float2 tcMB = cVelocity.xy * ( ( i * fRecipSamples ) - 0.5 ) + tc;
  float4 cCurr = tex2Dlod( tex0, float4( tcMB, 0, 0 ) );
  // Alpha carries the blend mask; push any non-zero mask to 1
  cAccum += float4( cCurr.xyz, saturate( 10000 * cCurr.w ) );
}
if( cAccum.w )
{
  // Blend the blurred result with the scene using the accumulated mask
  cAccum *= fRecipSamples;
  OUT.Color = lerp( cScreen, cAccum, saturate( cAccum.w * 2 ) );
}
(images: blend mask, and the result blended with the scene)
Object Motion Blur
• Object motion blur with per-pixel dilation
• Not perfect, but good-quality results for most cases
• Geometry independent
• Problematic with alpha blending
• Future improvements / challenges
• Self-motion blurring
• Multiple overlapping geometry + motion blurring
• Could use an adaptive sample count for blurring
Sun Shafts [5]
aka Crepuscular Rays/God Rays/Sun beams/Ropes of Maui…
Screen Space Sun Shafts
• Generate a depth mask
• Mask = 1 - normalized scene depth
• Perform a local radial blur on the depth mask (sketched below)
• Compute the sun position in screen space
• Blur vector = sun position - current pixel position
• Iterative quality improvement
• Used 3 passes (virtually 512 samples)
• RGB = sun-shafts, A = volumetric fog shadow approximation
• Compose with the scene
• Sun-shafts = additive blending
• Fog-shadows = soft-blend [5]
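A minimal sketch of one radial blur pass over the depth mask (8 taps; names illustrative). Running the pass again on its own output gives the 8 → 64 → 512 virtual sample counts listed above:

half4 RadialBlurPass( sampler2D texMask, float2 tc, float2 vSunPosSS, float fBlurScale )
{
  // Blur vector = sun position - current pixel position
  float2 vStep = ( vSunPosSS - tc ) * fBlurScale * 0.125;
  half4 cAccum = 0;
  [unroll]
  for( int i = 0; i < 8; i++ )
    cAccum += tex2D( texMask, tc + vStep * i );
  return cAccum * 0.125;
}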
Depth Mask Radial Blurring
(comparison images: depth mask, 8 samples, 64 samples, 512 samples)
Sun Shafts: Results
Color Grading
• Artist control for final image touches
• Depending on “time of day” in game
• Night mood != Day mood, etc
• Usual saturation, contrast, brightness controls and
color levels like in Far Cry [6]
• Image sharpening through extrapolation [9]
• Selective color correction
• Limited to a single color correction
• Photo Filter
• Grain filter
Image Sharpening
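Sharpening by extrapolation works by extrapolating the frame away from a blurred copy of itself. A minimal sketch (names illustrative), not the shipped filter:

half3 SharpenByExtrapolation( sampler2D texScreen, sampler2D texBlurred,
                              float2 tc, float fSharpen )
{
  half3 cOrig = tex2D( texScreen, tc ).xyz;
  half3 cBlur = tex2D( texBlurred, tc ).xyz;
  // fSharpen = 1 returns the original; values > 1 extrapolate and sharpen
  return lerp( cBlur, cOrig, fSharpen );
}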
Selective Color Correction
• Color range based on Euclidean distance
• ColorRange = saturate( 1 - length( src - col.xyz ) );
• Color correction done in CMYK color space [8]
• c = lerp( c, clamp( c + dst_c, -1, 1 ), ColorRange );
• Finally blend between the original and the corrected color
• Orig = lerp( Orig, CMYKtoRGB( c ), ColorRange );
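The lines above, gathered into one hedged sketch with naive RGB/CMYK helpers (the shipped conversions may differ):

float4 RGBtoCMYK( float3 c )
{
  float k = 1 - max( c.r, max( c.g, c.b ) );
  float3 cmy = ( 1 - c - k ) / max( 1 - k, 1e-4 );
  return float4( cmy, k );
}

float3 CMYKtoRGB( float4 c )
{
  return ( 1 - c.xyz ) * ( 1 - c.w );
}

float3 SelectiveColorCorrect( float3 cScreen, float3 cSelect, float4 vShiftCMYK )
{
  float fRange = saturate( 1 - length( cScreen - cSelect ) );   // color range mask
  float4 c = RGBtoCMYK( cScreen );
  c = lerp( c, clamp( c + vShiftCMYK, -1, 1 ), fRange );        // shift in CMYK
  return lerp( cScreen, CMYKtoRGB( c ), fRange );               // blend with the original
}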
Photo Filter
• Blend entire image into a different color mood
• Artist defines “mood color”
• cMood = lerp(0, cMood, saturate( fLum * 2.0 ) );
• cMood = lerp(cMood, 1, saturate( fLum - 0.5 ) * 2.0 );
• Final color is a blend between mood color and
backbuffer based on luminance and user ratio
• final= lerp(cScreen, cMood , saturate( fLum * fRatio));
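The photo-filter lines above, gathered into one function (fLum is the pixel luminance, fRatio the artist-controlled amount):

half3 PhotoFilter( half3 cScreen, half3 cMoodColor, half fLum, half fRatio )
{
  half3 cMood = lerp( (half3)0, cMoodColor, saturate( fLum * 2.0 ) );  // darks toward mood color
  cMood = lerp( cMood, (half3)1, saturate( fLum - 0.5 ) * 2.0 );       // brights toward white
  return lerp( cScreen, cMood, saturate( fLum * fRatio ) );            // blend with the back-buffer
}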
Conclusion
• Awesome time to be working in real time computer
graphics
• Hollywood movie image quality still far away
• Some challenges before we can reach it
• Need much more GPU power (Crysis is GPU bound)
• Order independent alpha blending – for real world cases
• Good reflection/refraction
We still need to get rid of aliasing – Everywhere
Special Thanks
• To entire Crytek team
• All the work presented here wouldn’t have been possible without your support :)
References
• [1] Wenzel, “Real-time Atmospheric Effects”, GDC 2007
• [2] Tessendorf, “Simulating Ocean Water”, 1999
• [3] Sousa, “Generic Refraction Simulation”, GPU Gems 2, 2005
• [4] Jensen et al., “Deep Water Animation and Rendering”, 2001
• [5] Nishita et al., “A Fast Rendering Method for Shafts of Light in Outdoor Scene”, 2006
• [6] Gruschel, “Blend Modes”, 2006
• [7] Bjorke, “Color Controls”, GPU Gems 1
• [8] Green, “OpenGL Shader Tricks”, 2003
• [9] Ford et al., “Colour Space Conversions”, 1998
• [10] Haeberli et al., “Image Processing by Interpolation and Extrapolation”, 1994
Questions ?
Tiago@Crytek.de
Crytek is hiring !
http://www.crytek.com/jobs/
Editor's Notes

  1. Introduce myself – state title, etc Agenda: Introduction Shading Overview - Water / Underwater rendering - Frozen surfaces Post-effects Overview - Camera and per-object motion blur - Screen space sun-shafts - Artist directed color grading Conclusion
  2. - I personally think we’re still at the dawn of what can be done, and especially photorealism is still very far away for real time – but we’re getting closer What is a Next-Gen effect ? - Something never done before or done at a higher quality bar than existing games - Harmonious composition of art and technology - ”Wow” factor Creating an effect - “Don’t reinvent the wheel” - Visual references and concept art help a lot - Might take several iterations
  3. CryEngine 2: Shading Overview Shader model 2.0 minimum, up to 4.0 Completely dynamic lighting and shadows Up to 4 point light sources per-pass Wide range of known shading models used: Blinn, Phong, Cook-Torrance, Ward, Kajiya-Kay … Plus custom DIY shading: Skin, Eyes, Cloth, Glass, Vegetation, Frost, Ice … Other fancy features: Parallax Occlusion Mapping, Detail Bump Mapping, Screen Space Ambient Occlusion, 64bit HDR rendering … Deferred mix with multi-pass rendering approach Z-Pass, Opaque-Pass, Transparent/Refractive passes, Material layers, Shadows passes … Average of 2k drawcalls per frame ~ 2 million triangles per frame
  4. Show water intro vid – talk about subjects as video goes
  5. 3D waves = no more flat water! Used statistical Tessendorf animation model Waterworld, Titanic, Pirates of the Caribbean 3… Good tileable results for a variety of usages: calm, choppy … Computed on CPU for a 64x64 grid Resolution was enough for us and very fast to compute Pirates of the Caribbean 3 used 2k x 2k Upload results into a R32G32B32A32F texture 64x64 size Vertex displacement on GPU 1 bilinear fetch on DX10 (4 fetches for DX9) Lower HW specs used simple sin waves sum Additionally normal map translation – 4 normal map layers High frequency / low frequency Animated along water flow direction
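  A hedged vertex-shader sketch of the GPU displacement step; names like texDisplacement and vGridToUV are illustrative, not the shipped shader:
    // texDisplacement holds the CPU-simulated 64x64 Tessendorf grid
    // (xyz = displacement, uploaded as an FP32 texture)
    sampler2D texDisplacement;
    float4x4 matWorldViewProj;
    float2   vGridToUV;      // scale from grid-space position to texture UV

    struct VS_OUT { float4 Pos : POSITION; };

    VS_OUT WaterVS(float4 vPos : POSITION)
    {
      VS_OUT OUT;
      float2 tc = vPos.xz * vGridToUV;

      // One vertex-texture fetch on DX10-class HW; on DX9 this is point
      // filtered, hence the 4 fetches + manual bilinear mentioned above
      float3 vDisp = tex2Dlod(texDisplacement, float4(tc, 0, 0)).xyz;

      OUT.Pos = mul(float4(vPos.xyz + vDisp, 1), matWorldViewProj);
      return OUT;
    }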
  6. So, in order to have nice water surface animation, the first step is to have good water tessellation Our first try: Screen Space Grid Projection 100k triangle grid, generated only once Project grid on water plane (GPU) Good: Very detailed nearby Bad: Unacceptable aliasing (shading and position) at distance and with slow camera movements In an FPS it’s extremely distracting Problematic on screen edges Had to attenuate displacement (xyz) on edges Which limited displacement size Dropped this approach in the end
  7. Camera-aligned grid 100k triangle grid, generated only once Grid edges extruded into horizon Attenuate displacement at grid edges Idea/Purpose was to keep most triangles nearby and in front of camera Minimizes aliasing with camera rotation Eliminated translation aliasing through snapping Problematic with camera roll Trouble balancing nearby vs. far away detail But, kept this approach in the end
  8. Picture showing the surface from a top view. Btw: Lines are from degenerate triangles used for connecting triangle strips
  9. Physics Interaction With the animation computed on the CPU we can directly share results with the physics engine For the lower-spec case, we did the exact same math as the vertex shader on the CPU For each object in the ocean, the physics engine samples the ocean below the object and computes a plane that represents the ocean surface best This allowed player/objects/vehicles to be affected by water surface animation
  10. Reflection Overview Per frame, we had an avg of 2K drawcalls (~ 2M tris) This really needed to be cheap, but still look good Problem: Reflections added about 1500 drawcalls Decreased the drawcall count by: Decreasing view distance Set near plane to nearest terrain sector Reflection update ratio trickery Final average of 300 drawcalls for reflections Underwater total internal reflection is also simulated Phenomenon that happens within a medium like water/glass where rays of light get completely reflected – happens when the angle of incidence is greater than a certain limiting angle (= critical angle) But limited due to player complaints about lack of visibility Reflection texture is half screen resolution
  11. Reflection Update Trickery Update Dependencies Time Camera orientation/position difference from previous camera orientation/position Surface mesh visibility ratio using HW occlusion queries If barely visible, decrease update ratio significantly Multi-GPU systems need extra care to avoid out of sync reflection texture Update only for main GPU, use same reflection camera for the following N frames (N way Multi-GPU)
  12. Blur final reflection texture vertically Blur ratio = camera distance to water plane (nearest = blurriest/biggest offset scale) As an extra, it also helps minimize reflection aliasing
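  A possible HLSL sketch of the vertical reflection blur; the 5-tap kernel, weights and distance scaling are assumptions:
    sampler2D texReflection;
    float  fCameraToWaterDist;  // camera distance to the water plane
    float  fBlurScale;          // tweakable maximum blur strength
    float2 vTexelSize;          // 1 / reflection RT resolution

    float4 ReflBlurPS(float2 tc : TEXCOORD0) : COLOR0
    {
      // Nearest to the surface = biggest vertical offset scale
      float fOffset = vTexelSize.y * fBlurScale / max(fCameraToWaterDist, 1.0);

      // Simple 5-tap vertical blur, weights sum to 1
      float4 cAccum = tex2D(texReflection, tc) * 0.4;
      cAccum += tex2D(texReflection, tc + float2(0, fOffset)) * 0.15;
      cAccum += tex2D(texReflection, tc - float2(0, fOffset)) * 0.15;
      cAccum += tex2D(texReflection, tc + float2(0, 2 * fOffset)) * 0.15;
      cAccum += tex2D(texReflection, tc - float2(0, 2 * fOffset)) * 0.15;
      return cAccum;
    }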
  13. Refraction Use current backbuffer as input texture - No need to render scene again - Mask out everything above water surface instead Check “Generic Refraction Simulation”, GPU Gems 2 Extended the Refraction Masking concept with a more general/simpler approach using depth instead - Water depth > World depth, then don’t allow refraction texture offset - Using chromatic dispersion means doing this 3 times - Alpha-blend works since we store 2 render lists, before/after water Chromatic dispersion used for an interesting look Scaled offset for R, G and B differently Incidentally reflection/refraction done in HDR
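  A minimal sketch of the depth-based refraction masking with per-channel chromatic offsets; texBackBuffer, texSceneDepth, vRefrOffset and vDispersionScale are assumed names, not the shipped shader:
    sampler2D texBackBuffer;    // current back-buffer used as refraction source
    sampler2D texSceneDepth;    // linear world depth
    float  fWaterDepth;         // linear depth of the water pixel being shaded
    float2 vRefrOffset;         // bump-based refraction offset for this pixel
    float3 vDispersionScale;    // per-channel offset scale, e.g. (1.0, 1.05, 1.1)

    float4 RefractionPS(float2 tc : TEXCOORD0) : COLOR0
    {
      float3 cRefr;
      [unroll]
      for (int i = 0; i < 3; i++)   // R, G and B with slightly different offsets
      {
        float2 tcRefr = tc + vRefrOffset * vDispersionScale[i];

        // If the refracted sample lies above the water surface (water depth
        // greater than world depth) it would leak - fall back to the
        // unperturbed coordinate in that case
        float fWorldDepth = tex2D(texSceneDepth, tcRefr).r;
        if (fWaterDepth > fWorldDepth)
          tcRefr = tc;

        float4 cSample = tex2D(texBackBuffer, tcRefr);
        cRefr[i] = cSample[i];
      }
      return float4(cRefr, 1);
    }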
  14. Procedural Caustics Very important for a realistic water look Extra rendering pass, can handle opaque/transparent – could have done it in a deferred way, but we lacked some data like per-pixel normals to give an interesting look Based on real sun direction projection Procedurally composed using 3 layers Chromatic dispersion used Darken slightly to simulate wet surface Stencil pre-pass helps a little bit with performance
  15. Shore and Foam Very important to make water look believable Soft water intersection Shore blended based on surface depth distance to world depth Waves foam blended based on current height distance to maximum height Foam is composed of 2 moving layers with offsets perturbation by water bump Acceptable quality
  16. Camera Interaction How to handle case where camera intersects an animated water surface ? Reflections are planar => Reflection plane adjusted based on water height at camera position Fog is computed for a plane, not for animated surface => Did same thing for fog plane Water surface soft-intersection with camera Water droplets effect when coming out of water volume When inside water volume used a subtle distortion Helps feeling inside water volume And also helps disguising fact that shadows are not refracted Water particles similar to soft-particles Soft-particles = use particle depth and world depth to soften intersection with opaque geometry For water particles we also do soft-intersection with water plane Fake specular highlights
  17. Show intro video (walking around, weapon frost, glittering, etc) while talking about subject Video with ICE level, show disabling/enabling frozen layer
  18. Now we’ll go into the first case study: the frozen surfaces - Also known as The Biggest Headache – ever. - This part of the lecture is kind of a mini post-mortem of the frozen surfaces in Crysis, and hopefully will be helpful for someone trying to make a frozen world. - So, this was a big challenge, partially due to not having found any previous research on the subject, at least I did not find any relevant information - And because we had no clear idea/direction Should it look realistic ? Or an “artistic” flash frozen world ? Make everything Frozen ? Should we be able to make any object look frozen ? Dynamically ? Custom frozen assets ? Reuse assets ? Several iterations until final result (4 ?)
  19. 1st try ! Artists made custom frozen textures/assets A lot of work for artists No dynamic freezing possibility Resource hog to have frozen and non-frozen on same level Not very good results
  20. 2nd try: We made a specific shader for simulating frozen surfaces Reused assets Dynamic freezing possibility and with nice transition Artists wanted too many parameters exposed Which resulted in a lot of work for artists Expensive rendering wise, extra drawcall, blending, too many parameters = too many constants Not very good results Show 2nd gen video
  21. For the 3rd iteration we tried to get the best of both worlds Reused most assets Dynamic freezing possibility Cheaper than the fully procedural approach Some cases looked nice – for example the palm tree leaves Artists wanted too many parameters / too much control Which resulted in a lot of work for artists Although better than previous tries, results were still visually uninteresting
  22. Until we reached our final iteration, the one shown in the video previously
  23. What we learned from previous approaches Main focus was now to make it visually interesting Used real world images as reference this time Minimize the artists’ amount of work as much as possible Impossible to make every single kind of object look realistically frozen (and good) with a single unified approach Additionally we had 1 week before hitting feature lock/alpha milestone (gulp) – to make this happen
  24. These are actually the real references we used for the 4th and final generation. From references we can see: Accumulated snow on top Frozen water droplets accumulated on side Subtle reflection We need variation and no visible patterns Glittering
  25. Putting it all together This final iteration ended up being the simplest concept-wise Accumulated snow on top Blend in snow depending on WS/OS normal z Frozen water droplets accumulated on side 2 layers using different uv and offset bump scales to give the impression of volume Normal map texture used was really a water droplets texture btw 3D Perlin noise used for blending variation (using volumetric texture) 3 octaves and offsets warping to avoid repetitive patterns Glittering Used a 2D texture with random noise vectors pow( saturate( dot(noise, viewVector) ), big value) If result > threshold, multiply by big value for HDR to kick in – this trick only works nicely because we do blooming and image rescaling carefully to minimize any aliasing – therefore when we have a single pixel on screen we don’t see any shimmering
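  A hedged HLSL sketch of the glittering trick described above; texture/parameter names, the exponent and the threshold handling are assumptions:
    sampler2D texNoise;    // 2D texture with random unit vectors in RGB
    float3 vView;          // normalized view vector
    float  fGlitterPow;    // "big value" exponent, e.g. 64
    float  fThreshold;     // sparkle cut-off
    float  fHDRBoost;      // multiplier to push sparkles into HDR range

    // Returns a scalar glitter term to add to the (HDR) specular result
    float ComputeGlitter(float2 tc)
    {
      // Fetch a random direction and compare it against the view vector
      float3 vNoise = tex2D(texNoise, tc).xyz * 2 - 1;
      float fGlitter = pow(saturate(dot(vNoise, vView)), fGlitterPow);

      // Only the strongest hits survive, boosted so bloom/HDR kicks in
      return (fGlitter > fThreshold) ? fGlitter * fHDRBoost : 0;
    }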
  26. Procedural Frozen Reused assets Dynamic freezing possibility and with nice transition Don’t give artists more control than they really need Artists just press button to enable frozen If required they can still make final material tweaks Relatively cheap, rendering cost wise – frozen layer replaces general pass for case with no frost transition Visually interesting results Only looks believable under good lighting conditions
  27. Post-Effects Overview Post Effects Mantra: Final rendering “make-up” As little aliasing as possible for this hardware generation (for very-hi specs) = No shimmering, no color banding, no visible sampling artifacts, etc For example, like I mentioned in the Frost Glitter case, for blooming we went the “extra mile” to make sure no shimmering was visible at all Never sacrifice quality for speed Unless you’re doing really crazy expensive stuff ! Find instead cheaper ways of doing the same work, for less cost Make it as subtle as possible We’ve all seen Uber-Glow in a lot of games – or overuse of Brown/Desaturation for color palettes But no subtler than that - or else the average gamer will not notice it
  28. Motion Blur: we have 2 techniques, one for camera motion blur and another for object motion blur. We’ll start first with camera motion blur Done in HDR for very hi specs Render a sphere around camera Used scene-depth for velocity scaling Mask out nearby geometry (first person arms/weapons) Maximizing quality through iterative sampling Optimization strategies
  29. Screen-space velocities Render a sphere around camera Use previous/current camera transformation for computing previous sphere vertices positions But we need framerate independent motion blur results Lerp between previous/current transformation by a shutter speed ratio ( n / frame delta ), to get correct previous camera matrix – this is done on CPU From previous/current positions compute velocity vector Can already accumulate N samples along velocity direction And divide accumulation by sample count But will get constant blurring everywhere
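  A sketch of the screen-space velocity setup for the camera sphere, under stated assumptions (matrix names are illustrative; the shutter-speed lerp of the previous matrix is assumed to have happened on the CPU as described above):
    float4x4 matViewProjCurr;   // current camera view-projection
    float4x4 matViewProjPrev;   // previous camera view-projection, already
                                // lerped on the CPU by the shutter-speed ratio

    struct VS_OUT
    {
      float4 Pos     : POSITION;
      float4 PosCurr : TEXCOORD0;
      float4 PosPrev : TEXCOORD1;
    };

    VS_OUT CamMBSphereVS(float4 vPos : POSITION)   // sphere around the camera
    {
      VS_OUT OUT;
      OUT.Pos     = mul(vPos, matViewProjCurr);
      OUT.PosCurr = OUT.Pos;
      OUT.PosPrev = mul(vPos, matViewProjPrev);
      return OUT;
    }

    // In the pixel shader: project both positions and take the difference
    float2 ComputeVelocity(float4 posCurr, float4 posPrev)
    {
      float2 tcCurr = posCurr.xy / posCurr.w;
      float2 tcPrev = posPrev.xy / posPrev.w;
      return (tcCurr - tcPrev) * 0.5;   // clip -> texture space scale (y flip omitted for brevity)
    }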
  30. Velocity Mask Used depth to generate velocity mask Stored in scene RT alpha channel We let camera rotation override this mask For fast movements it feels more “cinematic” Depth is used to mask out nearby geometry If current pixel depth < nearby threshold write 0 Value used for blending out blur from first person arms/weapons Velocity mask is used later as a scale for motion blurring velocity offsets Blurring amount now scales with distance
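  A small sketch of a depth-based velocity mask along these lines; the parameter names and the exact order between depth masking and the camera-rotation override are assumptions:
    sampler2D texSceneDepth;   // normalized linear scene depth
    float fNearbyThreshold;    // depth below which first-person arms/weapons live
    float fCamRotationAmount;  // 0..1, how much the camera rotated this frame

    float ComputeVelocityMask(float2 tc)
    {
      float fDepth = tex2D(texSceneDepth, tc).r;

      // Mask out nearby geometry (first-person arms/weapons)
      if (fDepth < fNearbyThreshold)
        return 0;

      // Blur amount scales with distance; camera rotation overrides the
      // mask for a more "cinematic" feel on fast movements
      return max(fDepth, fCamRotationAmount);
    }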
  31. Optimization strategies If no movement, skip camera motion blur entirely Compare previous camera transformation with current Estimate required sample count based on camera translation/orientation velocity If sample count is below a certain threshold, skip Adjust pass/sample count accordingly This gave a nice performance boost Average case at 1280x1024 runs at ~ 1 ms on a G80 vs. 5 ms without Final note regarding camera motion blur Beware that most hardcore gamers dislike motion blur and will disable it Giving gamers the option to control motion blur strength (shutter speed) helps Crysis has this since patch 1
  32. Done in HDR DX9 limitations for characters Render Velocity/Depth/ID into an RT Velocity Mask Per-pixel dilation Motion blur using final dilated velocities Quality improvement through iterative sampling Further improvements
  33. DX9 HW Skinning limitation Limit of 256 vertex constant registers Big problem when doing HW skinning Our characters have an average of 70 bones per drawcall Each bone uses 2 registers = 140 registers For motion blur we need previous frame bone transformations So… 2 x 140 = 280 registers And let’s not forget all other data needed = Bummer Decided on a DX10-only solution Could have skipped problematic skinned geometry, but would not look consistent Or making it work on DX9 would have significantly increased the drawcall count (subdivide mesh) Could have used special mesh geometry/shell with fewer bones = Cumbersome
  34. Step 1: Velocity pass Render screen space velocity, surface depth and instance ID into an R16G16B16A16F RT In our case this was a shared screen-size RT that later gets resized into a 2x smaller RT With a forward rendering approach, it is not a good idea to do this for every single object in the world So if no movement, skip rigid/skinned geometry Compare previous transformation matrix with current Compare previous bone transformations with current, this is a simple abs( dot(prev, curr) ) accumulation If per-pixel velocity is below a certain threshold write 0 to RT When the time comes to use the RT, just skip pixel processing entirely
  35. More natural / visually appealing result Dilation possibilities ? Dilate geometry itself Check “OpenGL Shader Tricks” Could just apply a simple blur to the velocity map Very fast – not very good results though Do dilation per pixel as a post-process
  36. Final Step: Motion Blurring Accumulate 8 samples along velocity direction Blend mask in alpha channel Can use surface ID to avoid leaking Extra lookup Not really noticeable during high speed gaming – but might be very important for some specific cases/games – with some kind of bullet time or for some slow-motion cutscene Clamp velocity to an acceptable range Very important to avoid ugly results, especially with jerky animations Divide accumulated results by sample count and lerp between blurred/non-blurred Using the alpha channel blend mask
  37. This is the final blend mask Only tagged pixels get processed further
  38. Object Motion Blur Object motion blur with per-pixel dilation Not perfect, but high quality results for most cases Geometry “independent” – no need for geometry expansion Problematic with alpha blending – how to handle ? Similar to the refraction problem. We used nearest alpha blended surface, but stuff behind will not work. Could go crazy, and after every alpha blended surface re-use the backbuffer and do motion blur. Will never be 100% with <= DX10. ~ 2.5 ms at 1280x1024 on a G80 Future improvements / challenges - Self-motion blurring - Multiple overlapping geometry + motion blurring - How to handle alpha blended geometry nicely ? - Could use adaptive sample count for blurring, like in the camera motion blur case (low speed = fewer samples, high speed = more samples) - If there is still time, show video frame by frame using VirtualDub – show artifacts with fast camera movement due to masking and overlapping geometry due to not doing dilation on occluding geometry
  39. Sun Shafts aka Crepuscular Rays/God Rays/Sun beams/Ropes of Maui/ GodLight / Rays of Buddha/Light Shafts…
  40. Very simple trick Generate depth mask Math Perform local radial blur on depth mask Compute sun position in screen space Blur vector = sun position - current pixel position Use blur vector for blurring a copy of the backbuffer RGB = sun-shafts, A = volumetric shadows approx. Iterative quality improvement Used 3 passes (512 samples) Compose with scene Correct way would be computing how much this pixel is in shadow and attenuating fog density - but this cheap blending approximation already delivered nice results
  41. Sun Shafts Works reasonably well if the sun is onscreen or nearby Screen edges problematic: - to minimize this, strength is attenuated based on the view angle with the sun direction - additionally could attenuate edges
  42. Color Grading Artist control for final image touches Depending on “time of day” in game Night mood != Day mood != Rainy mood, etc Usual saturation, contrast, brightness controls and color levels like in Far Cry Image sharpening through extrapolation Selective color correction Limited to a single color correction Photo Filter Grain filter Simple random noise blended in by a small ratio
  43. Image sharpening through extrapolation Simply lerp between the low-res blurred image and the original image by a ratio bigger than 1 Sharp = lerp(blurred, orig, bigger than 1 ratio)
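  The extrapolation sharpening in HLSL form, a minimal sketch assuming a pre-blurred copy of the frame is available as texBlurred:
    sampler2D texScreen;       // original full-res image
    sampler2D texBlurred;      // low-res blurred copy of the image
    float fSharpenRatio;       // > 1 extrapolates, e.g. 1.2

    float4 SharpenPS(float2 tc : TEXCOORD0) : COLOR0
    {
      float4 cOrig = tex2D(texScreen, tc);
      float4 cBlur = tex2D(texBlurred, tc);
      // lerp with a ratio > 1 extrapolates past the original,
      // pushing it away from the blurred version = sharpening
      return lerp(cBlur, cOrig, fSharpenRatio);
    }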
  44. Selective Color Correction This was very useful for us, since it allowed artists to fix individual colors at the end of Crysis without having to re-tweak lighting (which takes much much longer) Color range based on Euclidean distance ColorRange = saturate(1 - length( src - col.xyz) ); Color correction done in CMYK color space Similar to Photoshop = Happy artist c = lerp( c, clamp(c + dst_c, -1, 1), ColorRange); Finally blend between original and corrected color Orig = lerp( Orig, CMYKtoRGB( c), ColorRange);
  45. Photo Filter (need real images) Blend entire image into a different color mood – so artists were able to give a warmer/colder color mood – again without having to re-tweak the entire lighting settings Artist defines “mood color” Final mood color is based on luminance cMood = lerp(0, cMood, saturate( fLum * 2.0 ) ); cMood = lerp(cMood, 1, saturate( fLum - 0.5 ) * 2.0 ); Final color is a blend between mood color and backbuffer based on luminance and user ratio final = lerp(cScreen, cMood, saturate( fLum * fRatio)); Example: Left – original, right – 50% mix with an orange-ish color – colors feel much warmer
  46. Conclusion Awesome time to be working in real time computer graphics Hollywood movie image quality still far away Some challenges before we can reach it Need much more GPU power (Crysis is GPU bound) Order independent alpha blending – for real world cases Good reflection/refraction: cube maps only take us so far, we need a generalized solution, imagine a river flowing down a mountain, impossible to make it look good with cubemaps/impossible with planar reflection. Good GI: IL / AO / SSS We still need to get rid of aliasing – Everywhere – no popping, no ugly sharp edges, no shimmering, no shadow aliasing on self-shadowing, etc, etc – lots of room for improvement here – We don’t see any of this when watching a movie or in real life right ? I guess probably everyone has their own private list