Rendering in games


Rendering

3D scene → rendering → Image

= the screen buffer (a 2D array of pixels)

Rendering in games

Real-time

(20 or) 30 or 60 FPS

Algorithm:

rasterization based rendering

Hardware based

pipelined architecture, parallel

Rendering primitives:

mostly triangles (lines and points possible too)

Complexity:

Linear with number of primitives


Rendering:

rasterization of triangles

v0 = (x0, y0, z0)

v1 = (x1, y1, z1)

v2 = (x2, y2, z2)

GPU pipeline (shown: OpenGL 2.0)


GPU pipeline – simplified

GPU pipeline – simplified more


GPU pipeline – simplified even more

3D vertices (v0, v1, v2) → transform → 2D triangle on screen (v0, v1, v2) → rasterizer → "fragments" → process → final pixels

Rasterization based rendering:

stages

Per vertex: (vertex shader)

skinning (from rest pose to current pose)

transform (from object space to screen space)

Per triangle: (rasterizer)

rasterization

interpolation of per-vertex data

Per fragment: (fragment shader)

lighting (from normal + lights + material to RGB)

texturing

alpha kill

Per fragment: (output combiners)

depth test

alpha blend
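The stages above can be sketched on the CPU. Below is a minimal, illustrative Python sketch (real GPUs run these stages in a massively parallel pipeline; the function names and the edge-function inside test are illustrative choices, not any specific API):

```python
# Minimal CPU sketch of the rasterization stages above.
# All names are illustrative; real GPUs run these stages in parallel.

def edge(a, b, p):
    # Signed area of triangle (a, b, p): >= 0 when p lies on the
    # left of edge a->b (counter-clockwise winding).
    return (b[0]-a[0])*(p[1]-a[1]) - (b[1]-a[1])*(p[0]-a[0])

def rasterize(v0, v1, v2, width, height):
    """Per-triangle stage: emit (x, y, z) fragments covered by the
    2D screen triangle, with z interpolated from the vertices."""
    area = edge(v0, v1, v2)
    frags = []
    for y in range(height):
        for x in range(width):
            p = (x + 0.5, y + 0.5)  # pixel center
            w0, w1, w2 = edge(v1, v2, p), edge(v2, v0, p), edge(v0, v1, p)
            if w0 >= 0 and w1 >= 0 and w2 >= 0:  # inside test
                # barycentric interpolation of per-vertex data (here: z)
                z = (w0*v0[2] + w1*v1[2] + w2*v2[2]) / area
                frags.append((x, y, z))
    return frags

def depth_test(frags, width, height):
    """Per-fragment output combiner: keep the closest fragment
    per pixel in a z-buffer."""
    zbuf = [[float('inf')] * width for _ in range(height)]
    for x, y, z in frags:
        if z < zbuf[y][x]:
            zbuf[y][x] = z
    return zbuf
```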


Rasterization-Based Rendering

3D vertices → per-vertex stage → 2D triangle on screen (v0, v1, v2) → per-triangle stage → "fragments" → per-fragment stage → final pixels

The per-vertex and per-fragment stages are PROGRAMMABLE:

A user-defined

"Vertex Shader"

(or vertex program)

A user-defined

"Fragment Shader"

(or pixel program)


Shading languages

High level:

GLSL - OpenGL Shading Language (by Khronos)

HLSL - High Level Shader Language (Direct3D, by Microsoft)

CG - C for Graphics (by Nvidia)

Low level:

ARB Shader Program

(an “assembler” for GPU -- deprecated)

In Unity

(and, similarly, in many game engines)

Meshes have a “mesh renderer” component

includes several flags and settings and…

Mesh renderers have a “material” component

includes flags, material parameter settings, textures and…

Materials include a “shader”

determines which settings/textures are available in the material

can be one of the many “standard shaders”

can be a customized shader: use “shader-lab”


In Unity: ShaderLab

A text file defining shaders

and describing how the engine should use them

Defines

A set of shaders to link (vertex, fragment…)

in CG language

Fallback shaders

(a “plan B” for when the running HW does not support the default shader)

Connection of material parameters / textures …

(visible to scripts / Unity GUI)

…to shader uniforms

(basically, global constants usable in shaders)

Rendering effects:

lighting


Local lighting

(figure: LIGHT, OBJECT, EYE: reflection (BRDF))

Lighting

Material parameters

(data modelling the «material»)

Illuminant

(data modelling lighting environment)

Geometric data

(e.g. normal, tangent dirs,

pos viewer)

LIGHTING MODEL

final R, G, B

( the lighting equation )


Lighting equations

Many different equations…

Lambertian

Blinn-Phong

Beckmann

Heidrich–Seidel

Cook–Torrance

Ward (anisotropic)

add Fresnel effects

Varying levels of:

complexity

realism

(some are physically based, some are… just tricks)

material parameters allowed

richness of effects

(Lambertian and Blinn-Phong: the simplest, most commonly used)

to learn more, see Computer Graphics course!

Lighting equations:

most basic solutions

Diffuse (aka Lambertian)

physically based

only dull materials

only material parameter:

base color

(aka albedo, aka “diffuse” color)

Specular (aka Blinn-Phong)

just a trick

add simulated reflections (highlights)

additional material parameters:

specular intensity (or, color)

specular exponent (aka glossiness)
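These two basic equations can be sketched as follows (a minimal Python sketch; the vector helpers and parameter names are illustrative, and vectors are assumed to be unit-length tuples):

```python
# Sketch of the two basic lighting equations, evaluated per fragment
# (or per vertex). Names are illustrative.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    length = sum(x * x for x in v) ** 0.5
    return tuple(x / length for x in v)

def lambertian(base_color, normal, light_dir):
    """Diffuse: physically based; only parameter is the base color."""
    d = max(0.0, dot(normal, light_dir))
    return tuple(c * d for c in base_color)

def blinn_phong(base_color, spec_intensity, glossiness,
                normal, light_dir, view_dir):
    """Diffuse term + simulated highlight (a trick, not physical)."""
    diffuse = lambertian(base_color, normal, light_dir)
    # half vector between light and view directions
    half = normalize(tuple(l + v for l, v in zip(light_dir, view_dir)))
    spec = spec_intensity * max(0.0, dot(normal, half)) ** glossiness
    return tuple(c + spec for c in diffuse)
```

Note how the specular term adds two material parameters (intensity and exponent) on top of the single base color of the diffuse term.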


Lighting:

per-pixel VS per-vertex

Per pixel = more quality

computation in the fragment shader

interpolate lighting input

material params can be in textures (more variations)

Per vertex = more efficiency

computation in the vertex shader

interpolate lighting output

material params must be in vertices (few variations)

Usually: mixed!

partly per vertex (e.g. diffuse, local light dir computation)

partly per pixel (e.g. specular, env. map, shadow-map)

Many effects require per-pixel:

normal mapping

parallax mapping …

Lighting

Material parameters

(data modelling the «material»)

Illuminant

(data modelling lighting environment)

Geometric data

(e.g. normal, tangent dirs, pos of viewer)

LIGHTING MODEL

final R, G, B

( the lighting equation )

Illuminant

(data modelling lighting environment)


Illumination environments:

discrete

a finite set of “light sources”

few of them (usually 1-4)

each sitting in a node of the scene graph

types:

point light sources

with: position

spot-lights

with: position,

orientation, width (angle)

directional light sources

with: orientation

extra attributes:

color / intensity

(other minor attributes)

Illumination environments:

densely sampled

From each direction (on the sphere) a light intensity / color

Asset to store that:

“Environment map”

(or “Reflection Probe”)

(figure: lat-long environment map; θ in [-90°, 90°], φ in [-180°, 180°])


Typical issue with lights in games: too many of them

Each light has a cost:

compute a term in the Lighting Equation (for each vertex or fragment !)

access all its parameters in the shaders…

maybe: compute its shadows (!!!)

1..4 lights: ok

20+ lights: not ok.

But, potentially needed?

physically speaking,

a light source has infinite range of effect

Typical issue with lights in games: too many of them

Solution: light proxies

full quality for the (e.g.) 4 most relevant lights

how to pick them? (per object)

the closest ones

the brightest ones

the dynamic ones (as opposed to static)

for them: shadows, full per-pixel…

approximate the other lights

no shadows, per vertex…

aggregate them in Env map / light probes

populate the scene with ad-hoc light probes

just ignore the least relevant ones

give lights an artificially finite radius of effect
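The “pick the (e.g.) 4 most relevant lights per object” step can be sketched like this (Python; the relevance metric intensity / distance² is one illustrative choice among many, and the dictionary keys are made-up names):

```python
# Sketch of per-object light selection: the k most relevant lights get
# full quality; the rest get approximated (or ignored).
# "Relevance" here is intensity / squared distance: an illustrative
# choice (closest + brightest); real engines may also favor dynamic
# lights over static ones.

def select_lights(obj_pos, lights, k=4):
    def relevance(light):
        pos, intensity = light["pos"], light["intensity"]
        d2 = sum((a - b) ** 2 for a, b in zip(obj_pos, pos))
        return intensity / max(d2, 1e-6)  # avoid division by zero
    ranked = sorted(lights, key=relevance, reverse=True)
    # (full-quality lights, lights to approximate)
    return ranked[:k], ranked[k:]
```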


Spherical functions

(a common requirement in lighting)

Task: how to store a function f : Ω → Rⁿ

Ω = surface of a sphere

i.e. the set of all unit vectors (directions)

Rⁿ = some vector space

(scalars, colors, vectors…)

Examples:

a (local) lighting environment (at a position p)

f( x ) = how much light comes in p from direction x

the lighting radiance (of a point p)

f( x ) = how much light p reflects toward direction x

local occlusions (for a point p)

f( x ) = is p seen from direction x ? in [0 , 1]

We want efficient storage, synthesis,

computation of f, + ability to interpolate them

Spherical function:

by sampling

Idea: just sample f (i.e. store it as a table)

Step 1: parametrize a sphere into domain A

use a fixed function m: Ω → A (A = typically, a rectangle)

m must be fast, and not too distorted

common choices for m ?

Step 2: regularly sample A (as an image)

Then:

Store f : just store the image

To get f( x ): access A at position m( x ) (use bilinear interpolation or better)

To interpolate between two f:

just cross-fade the two images
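The two steps and the lookup can be sketched like this (Python, using the lat-long parametrization as the choice of m; nearest-texel access is used for brevity where a real implementation would use bilinear interpolation; all names are illustrative):

```python
import math

# Sketch of storing a spherical function f as a sampled image,
# with a latitude-longitude parametrization m (one common choice).

def m(x):
    """Step 1: map a unit direction x to (u, v) in [0,1] (lat-long)."""
    phi = math.atan2(x[1], x[0])                   # longitude
    theta = math.acos(max(-1.0, min(1.0, x[2])))   # colatitude
    return ((phi + math.pi) / (2 * math.pi), theta / math.pi)

def store(f, w, h):
    """Step 2: regularly sample A as a w x h image; texel (i, j)
    holds f evaluated at the direction of its center."""
    img = []
    for j in range(h):
        theta = (j + 0.5) / h * math.pi
        row = []
        for i in range(w):
            phi = (i + 0.5) / w * 2 * math.pi - math.pi
            d = (math.sin(theta) * math.cos(phi),
                 math.sin(theta) * math.sin(phi),
                 math.cos(theta))
            row.append(f(d))
        img.append(row)
    return img

def lookup(img, x):
    """Get f(x): access the image at m(x) (nearest texel for brevity;
    use bilinear interpolation or better in practice)."""
    u, v = m(x)
    h, w = len(img), len(img[0])
    return img[min(int(v * h), h - 1)][min(int(u * w), w - 1)]

def crossfade(img_a, img_b, t):
    """Interpolate between two stored spherical functions."""
    return [[(1 - t) * a + t * b for a, b in zip(ra, rb)]
            for ra, rb in zip(img_a, img_b)]
```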


Spherical function:

with Spherical Harmonics

Local lighting

Material parameters

(data modelling the «material»)

Illuminant

(data modelling lighting environment)

Geometric data

(e.g. normal, tangent dirs, pos of viewer)

LIGHTING MODEL

final R, G, B

( the lighting equation )

Material parameters

(data modelling the «material»)


Material parameters

GPU rendering of a Mesh in a nutshell

(reminder)

Load…

store all data on GPU RAM

Geometry + Attributes

Connectivity

Textures

Shaders

Material Parameters

Rendering Settings

…and Fire!

send the command: “do it” !

THE MESH ASSET

THE MATERIAL ASSET


Terminology

Material parameters

parameters modelling the optical behavior of a physical object

(part of) the input of the lighting equation

Material asset

an abstraction used by game engines

consisting of

a set of textures (e.g. diffuse + specular + normal map)

a set of shaders (e.g. vertex + fragment)

a set of global parameters (e.g. global glossiness)

rendering settings (e.g. back face culling Y/N?)

corresponds to a state of the rendering engine

Authoring

material parameters

Q: which material parameters need to be defined?

A: depends on chosen lighting equation

Idea:

the game engine lets the material artist choose intuitively named material parameters, then picks a lighting equation accordingly

the one best suiting them

“speak material-artist language”


Authoring

material parameters

Popular choice of “intuitive parameters”:

Base color (rgb)

Specularity (scalar)

“Metal-ness” (scalar)

Roughness (scalar)

images: unreal engine 4

PBM

“Physically Based Materials”

Basically, just a buzzword 

Meanings:

1. use accurate material parameters

physically plausible

maybe measured

instead of: made up and tuned by intuition (by the material artist)

2. keep each lighting element separated (e.g. its own texture)

use fewer shortcuts than usual

e.g. use:

base color: one texture

baked AO: another texture (“Geometry Term”)

instead of:

base color × baked AO: one texture

(AO = Ambient Occlusion, see later)


PBS

“Physically Based Shading”

Basically, just another buzzword 

Meanings:

1. Use PBM

2. Use a more complex Lighting equation, more adherent to reality

E.g.

include HDR (and Gamma-corrected rendering)

include Fresnel effects

energy conserving Lighting equations only

General objective:

Make a material look plausible

under a larger range of lighting environments

(much more challenging than targeting just one or a few!)

Local lighting in brief

Material properties

(data modelling the «material»)

Illuminant

(data modelling lighting environment)

Geometric data

(e.g. normal, tangent dirs, pos of viewer)

LOCAL LIGHTING

final R, G, B

( the lighting equation )

Geometric data

(e.g. normal, tangent dirs, pos of viewer)


Reminder: normals

Per vertex attribute of meshes

Reminder:

Tangent dirs

normal mapping (tangent space) requires tangent dirs

«anisotropic» BRDFs:

require tangent dirs


Material quality: it’s improving fast

(images: indie 2006 vs indie 2010)


Local lighting in brief

LOCAL LIGHTING

final R, G, B

( the lighting equation )

Material properties

(data modelling the «material»)

Illuminant

(data modelling lighting environment)

Geometric data

(e.g. normal, tangent dirs, pos of viewer)

Lighting equation:

how

Computed in the fragment shader

most game engines support a subset as default ones

any custom one can be programmed in shaders!

Material + geometry parameters stored :

in textures (highest freq variations)

in vertex attributes (smooth variations)

as “material assets” parameter (no variation)

for example: where are diffuse color, specular color, normals, tangent dirs typically stored?


How to feed parameters to the lighting equation

Hard wired choice of the game engine

WYSIWYG game tools

E.g. in Unreal Engine 4

Multi-pass rendering: basic mechanism

Pass 1:

the resulting screen-buffer is stored in a texture

(not sent on the screen)

Pass 2:

the final rendering uses the screen-buffer as a texture

The buffer is

write only in pass 1

read only in pass 2

The two passes can be completely different

different settings, points of view, resolution…

Sometimes: more passes than 2

Sometimes: pass 1 produces more than 1 buffer in parallel


Multi-pass rendering: examples

Many custom effects like:

Mirrors:

Pass 1: produces what is seen in a mirror

Pass 2: the mirror surface is textured with it

An animated painting (think harry potter):

Pass 1: produces the painting content

Pass 2: in the main scene, the painting is textured with it

Portals in the “Portal” series (Valve)

We will see a few standard effects requiring Multi-pass rendering (such as: shadow-maps)

One sub-class of multi-pass rendering is Screen space effects

Screen-space effects: basic mechanism

Pass 1:

the scene is rendered

From the main camera point of view

Produces: a RGB buffer

Produces: a depth buffer

… sometimes, other buffers too (“multiple render targets”)

Pass 2:

one big quad is rendered, covering the screen exactly

uses the produced buffer(s) as texture(s)

adding all kinds of effects (e.g.: blur?)

Basically, it’s “post-production”… in real time


Rendering techniques popular in games

Shadowing

shadow mapping

Screen Space Ambient Occlusion

Camera lens effects

Flares

limited Depth Of Field

Motion blur

High Dynamic Range

Non Photorealistic Rendering

contours

toon BRDF

Texture-for-geometry

Bumpmapping

Parallax mapping


Shadow mapping


Shadow mapping

Shadow mapping in a nutshell

Two passes.

1st rendering: camera in light position

produces: depth buffer

called:

the shadowmap

2nd rendering:

camera in final position

for each fragment

access

the shadowmap once to determine

if fragment is reached by light or not
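The per-fragment test of the 2nd rendering can be sketched like this (Python; the fragment is assumed to be already projected into light space, and all names, including the bias parameter, are illustrative):

```python
# Sketch of the per-fragment shadow-map test in the 2nd rendering.
# The shadow map (from the 1st rendering, camera at the light) holds,
# per texel, the depth of the closest surface seen from the light.

def in_shadow(shadow_map, frag_light_uv, frag_light_depth, bias=0.01):
    """One shadow-map access: the fragment is in shadow iff something
    closer to the light was recorded at its texel."""
    u, v = frag_light_uv
    h, w = len(shadow_map), len(shadow_map[0])
    stored = shadow_map[min(int(v * h), h - 1)][min(int(u * w), w - 1)]
    # the small bias fights self-shadowing ("shadow acne") caused by
    # depth quantization
    return frag_light_depth > stored + bias
```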


Shadow mapping in a nutshell

(figure: EYE and LIGHT viewpoints, the SHADOW MAP, and the final SCREEN BUFFER)

Shadow Mapping:

costs

Rendering the shadowmap:

can be kept minimal!

no color buffer writing (no lighting, texturing…)

just: vertex transform, and depth test

optimizations: view-frustum culling

still, it’s a costly extra pass (for each light )

do it only for important lights

can be baked once and reused, for static objects

(yet another good reason to tag them)

requires static lights too


Shadow Mapping:

issues

Shadow-map bit-depth:

quantization artifacts matter! 16 bit is hardly enough

Shadow-map resolution:

aliasing artifacts matter!

remedies: higher res, PCF, multi-res shadow-maps

Screen Space AO


Screen Space AO

OFF

Screen Space AO

ON


Screen Space AO in a nutshell

First pass: standard rendering

produces: rgb image

produces: depth image

Second pass:

screen space technique

for each pixel, look at depth VS its neighbors:

neighbors are in front?

difficult to reach pixel: partly negate ambient light

neighbors are behind?

pixel exposed to ambient light: more ambient light
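The per-pixel neighbor comparison of the second pass can be sketched like this (Python; the 8-sample kernel, the radius and the strength parameter are illustrative choices, not any engine's actual kernel):

```python
# Sketch of the screen-space pass of SSAO: per pixel, compare the
# depth buffer against a few neighbors. Illustrative kernel.

def ssao_factor(depth, x, y, radius=2, strength=0.5):
    """Return an ambient-light multiplier in [1 - strength, 1]:
    neighbors in front of the pixel (smaller depth) mean the pixel is
    hard to reach, so its ambient light is partly negated."""
    h, w = len(depth), len(depth[0])
    occluders = samples = 0
    for dy in (-radius, 0, radius):
        for dx in (-radius, 0, radius):
            nx, ny = x + dx, y + dy
            if (dx or dy) and 0 <= nx < w and 0 <= ny < h:
                samples += 1
                if depth[ny][nx] < depth[y][x]:  # neighbor in front
                    occluders += 1
    return 1.0 - strength * (occluders / samples) if samples else 1.0
```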

(limited) Depth of Field

depth in focus range: sharp

depth out of focus range: blurred


(limited) Depth of Field in a nutshell

First pass: standard rendering

rgb image

depth image

Second pass:

screen space technique:

pixel is inside of focus range? keep it sharp

pixel is outside of focus range? blur it

(blur = average with neighboring pixels; kernel size ~= amount of blur)
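The per-pixel decision of the second pass can be sketched like this (Python, on a 1D grayscale row for brevity; the formula mapping distance-from-focus to kernel size is an illustrative choice):

```python
# Sketch of the screen-space depth-of-field pass: blur each pixel by
# an amount driven by its distance from the focus range.

def dof_pixel(rgb, depth, x, focus_min, focus_max, max_kernel=3):
    d = depth[x]
    if focus_min <= d <= focus_max:
        return rgb[x]                     # in focus: keep it sharp
    # out of focus: average with neighbors; kernel grows with the
    # distance from the focus range (illustrative mapping)
    miss = min(abs(d - focus_min), abs(d - focus_max))
    k = min(max_kernel, 1 + int(miss * 10))
    lo, hi = max(0, x - k), min(len(rgb), x + k + 1)
    window = rgb[lo:hi]
    return sum(window) / len(window)
```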

HDR - High Dynamic Range

(limited Dynamic Range)


HDR - High Dynamic Range in a nutshell

First pass: normal rendering,

BUT use lighting / materials with HDR

pixel values not in [0..1]

e.g. the sun emits light with RGB = [500,500,500]:

>1 = over-exposed ! “whiter than white”

Second pass:

screen space technique:

>1 values bleed into neighbors

i.e.: overexposed pixels lighten neighbors

i.e.: they will be max white (1,1,1), and their light bleeds into neighbors
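The bleeding of over-exposed pixels can be sketched like this (Python, on a 1D grayscale row for brevity; a real bloom pass spreads the excess with a blur kernel over the 2D buffer, and the bleed factor here is an illustrative value):

```python
# Sketch of the HDR second pass: over-exposed pixels (> 1) are clamped
# to max white and their excess light bleeds into neighbors.

def bloom(row, bleed=0.25):
    out = list(row)
    for i, v in enumerate(row):
        if v > 1.0:                 # over-exposed: "whiter than white"
            excess = v - 1.0
            out[i] = 1.0            # clamp to max white
            for j in (i - 1, i + 1):  # excess lightens the neighbors
                if 0 <= j < len(row):
                    out[j] = min(1.0, out[j] + excess * bleed)
    return out
```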

Parallax Mapping

Normal map only


Parallax Mapping

Normal map + Parallax map

Parallax mapping:

in a nutshell

Texture-for-geometry technique

like normal-mapping (used in conjunction with it)

Requires a displacement map:

texel = distance from surface


Motion Blur

NPR rendering /

Toon shading / Cel Shading


NPR rendering:

Toon shading / Cel Shading

Toon shading / Cel Shading in a nutshell

Simulating “toons”

Typically, two effects:

add contour lines

at discontinuity lines of:

1. depth, 2. normals, 3. materials

quantize lighting:

e.g. 2 or 3 tones (light, medium, dark) instead of a continuous interval

it’s a simple variation of lighting equation
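The quantization really is a small change to the lighting equation; a sketch (Python; the tone values are illustrative):

```python
# Sketch of the toon-shading variation of the lighting equation:
# quantize the continuous diffuse term into a few flat tones.

def toon_shade(n_dot_l, tones=(0.2, 0.6, 1.0)):
    """Map diffuse lighting in [0,1] to one of a few discrete tones
    (dark / medium / light) instead of a continuous interval."""
    d = max(0.0, min(1.0, n_dot_l))
    index = min(int(d * len(tones)), len(tones) - 1)
    return tones[index]
```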


NPR rendering:

simulated pixel art

img by Howard Day (2015)

NPR rendering:

simulated pixel art

img by Dukope


NPR rendering:

simulated pixel art

img by Dukope

Multi-pass rendering in Unity (notes)

Very simple to do (as usual)

Steps:

create a Render Texture

a 2D GPU buffer which can be: the output of a rendering, OR the texture of another rendering

create one (secondary) Camera

add the Render Texture as the “Target Texture” of this Camera

that camera won’t output its rendering to the screen

add the Render Texture as a “Texture”, in some material

What happens:

every frame, Unity will:

1st pass: render the texture (from the secondary camera)

2nd pass: use the result in the (final) rendering (from the main camera)


Screen Space effects in Unity (notes)

Very simple to do (as usual)

Steps:

Create a Shader (ShaderLab)

pick an “image effect” shader (just to save initialization work)

Create a Material, which uses the shader

Add a Script to the main camera

add a public Material field to it

assign it the new material (from the GUI)

redefine its “OnRenderImage” method

make it just do one blit operation (see below)

…using the material as a parameter

All ready: the effect can now be coded in the Fragment shader of the Shader

(multiple?) accesses the texture(s), computation of final RGB

Screen Space effects in Unity (notes)

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class CameraScript : MonoBehaviour {

    public Material mat;

    void Update () { }

    void OnRenderImage ( RenderTexture src, RenderTexture dest ) {
        Graphics.Blit (src, dest, mat);
    }
}

“blit” = 2D screen-buffer copy

In Unity, implemented as a full-screen quad rendering
