Video Game Dev 2017/2018 – Univ. Insubria
Marco Tarini – 19/12/2017


(1)

Video Game Dev 2017/2018 Univ. Insubria

Rendering in games

Rendering: 3D scene → [rendering] → Image

(the screen buffer: a 2D array of pixels)

(2)

Rendering in games

Real time

(20 or) 30 or 60 FPS

Hardware based

Pipelined, stream processing

therefore: class of algorithms:

rasterization based

Complexity:

Linear with # of primitives

Linear with # of pixels

This lecture:

a bird's-eye view of…

Graphics Hardware & HW-based rendering

a brief summary

Lighting

Local Lighting

Lighting equations notes

Lighting environments

Materials

Strategies to approximate Global Lighting

Ad-hoc rendering techniques used in games

Multi-pass techniques in general

Screen space techniques in general

A summary of a few common game rendering techniques

(3)

Reminder

(diagram): the CPU (with its ALU), the central RAM, and the disk sit on the main BUS; the video card (scheda video) has its own internal bus connecting its video RAM and the GPU

A triangle

v0 = (x0, y0, z0)

v1 = (x1, y1, z1)

v2 = (x2, y2, z2)

(4)

GPU pipeline – simplified

(5)

GPU pipeline – simplified more

GPU pipeline – simplified even more

(diagram): 3D vertices (e.g. of a mesh) → [transform] → 2D screen triangle → [rasterizer] → fragments (“wannabe pixels”) → [process] → final pixels

(6)

base schema

Per vertex: (vertex shader)

skinning (from rest pose to current pose)

transform (from object space to screen space)

Per triangle: (rasterizer)

rasterization

interpolation of per-vertex data

Per fragment: (fragment shader)

lighting (from normal + lights + material to RGB)

texturing

alpha kill

Per pixel: (after the fragment shader)

depth test

alpha blend
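The per-stage schema above (per vertex → per triangle → per fragment) can be illustrated with a tiny software rasterizer sketch. This is a textbook edge-function formulation written in Python purely for illustration, not actual GPU code; the function name and the 4×4 grid are our own choices:

```python
def rasterize_triangle(v0, v1, v2, width, height):
    # v0..v2: 2D screen-space vertices (x, y); returns the covered pixel coords.
    # Edge-function test: a point is inside the triangle if it lies on the
    # same side of all three edges.
    def edge(a, b, p):
        return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

    if edge(v0, v1, v2) == 0:       # degenerate (zero-area) triangle
        return []
    fragments = []
    for y in range(height):
        for x in range(width):
            p = (x + 0.5, y + 0.5)  # sample at the pixel center
            w0, w1, w2 = edge(v1, v2, p), edge(v2, v0, p), edge(v0, v1, p)
            if (w0 >= 0 and w1 >= 0 and w2 >= 0) or \
               (w0 <= 0 and w1 <= 0 and w2 <= 0):
                fragments.append((x, y))
    return fragments
```

On real hardware the per-vertex and per-fragment stages around this fixed rasterization step are the programmable shaders described below.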

Rasterization-Based Rendering

(diagram): 3D vertices → [per-vertex stage] → 2D triangle on screen → [per-triangle stage] → “fragments” → [per-fragment stage] → final pixels

The per-vertex and per-fragment stages are PROGRAMMABLE!

(7)

Rasterization-Based Rendering

(diagram, continued): the per-vertex stage is the user-defined “Vertex Shader” (or vertex program); the per-fragment stage is the user-defined “Fragment Shader” (or pixel program)

Shading languages

High level:

HLSL (High Level Shader Language, Direct3D, Microsoft)

GLSL (OpenGL Shading Language)

CG (C for Graphics, Nvidia)

Low level:

ARB Shader Program

(the “assembler” of GPUs – now deprecated)

(8)

Local lighting

(diagram): light travels from the LIGHT to the OBJECT, is reflected there (BRDF), and reaches the EYE

(9)

Local lighting ( the lighting equation )

Inputs: material parameters (data modelling the «material»), the Illuminant (data modelling the Lighting Environment), and geometric data (e.g. normal, tangent dirs, viewer position).

Output: the final R, G, B.

Lighting equations

Different equations can be employed…

Lambertian

Blinn-Phong

Beckmann

Heidrich–Seidel

Cook–Torrance

Ward (anisotropic)

+ additional Fresnel effect

Varying levels of:

complexity

realism

(some are physically based, some are… just tricks)

material parameters allowed

richness of effects

to learn more, see the Computer Graphics course!

(10)

Lighting equations: basics

Diffuse factor (aka Lambertian)

physically based

only dull materials

only material parameter:

base color

(aka albedo, aka “diffuse” color)

Specular factor (aka Blinn-Phong)

just a trick

add simulated reflections (highlights)

additional material parameters:

specular intensity (or, color)

specular exponent (aka glossiness)

Lighting equations:

basics

Ambient factor:

assumption:

a bit of light reaches the object from every dir

and is bounced by it in every dir

Pro: simple to compute:

just an additive constant

the ambient light factor, a global

Con: terribly crude approx

(11)

Lighting equations:

basics

Used together:

final = ambient + diffuse (or Lambertian) + specular (or Blinn-Phong)
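The three terms used together can be sketched numerically. This Python sketch assumes one directional light and per-channel RGB; the function name, default albedo, and default parameter values are illustrative, not from the slides:

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def shade(normal, light_dir, view_dir,
          albedo=(0.8, 0.2, 0.2),   # base color (aka albedo, aka "diffuse")
          ambient=0.1,              # global ambient light factor
          spec_intensity=0.5,       # specular intensity
          spec_exp=32.0):           # specular exponent (aka glossiness)
    n, l, v = normalize(normal), normalize(light_dir), normalize(view_dir)
    diff = max(dot(n, l), 0.0)      # Lambertian term: cos of normal/light angle
    if diff > 0.0:
        # Blinn-Phong: the highlight peaks where the half-vector (between
        # light and view directions) aligns with the normal
        h = normalize(tuple(a + b for a, b in zip(l, v)))
        spec = max(dot(n, h), 0.0) ** spec_exp
    else:
        spec = 0.0
    # final = ambient + diffuse + specular, clamped per RGB channel
    return tuple(min(ambient * c + diff * c + spec_intensity * spec, 1.0)
                 for c in albedo)
```

E.g. with light and viewer both along the normal the highlight saturates the red channel; with the light behind the surface only the ambient term survives.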

Local lighting (recap diagram) – focus on the Illuminant (data modelling the Lighting Environment).

(12)

Illumination environments: types

Discrete

a finite set of individual light sources (plus a global ambient factor)

Densely sampled

environment maps:

textures sampling incoming light

Basis functions

a spherical function stored as spherical harmonics coefficients

Also used jointly!

Illumination environments:

discrete

a finite set of individual “light sources”…

few of them (usually 1-16)

each one sitting in a node of the scene-graph

each of a type:

point light sources

have: position

spot-lights

have: position,

orientation, wideness (angle)

directional light sources

have: orientation only

extra per light attributes:

color / intensity

fall-off function (with distance)

max range, and more
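The per-light extra attributes above (intensity, fall-off with distance, max range) can be sketched as a distance-attenuation function. The constant/linear/quadratic coefficients below are illustrative defaults, not from the slides, and the hard cutoff models the physically unjustified radius of effect that engines impose:

```python
def attenuation(distance, k_const=1.0, k_lin=0.09, k_quad=0.032,
                max_range=50.0):
    # Fall-off function for a point/spot light, as a multiplier in [0, 1].
    # Beyond max_range the light contributes nothing (a hard, physically
    # unjustified cutoff that keeps per-pixel light counts bounded).
    if distance > max_range:
        return 0.0
    return 1.0 / (k_const + k_lin * distance + k_quad * distance * distance)
```

The light's color/intensity would be multiplied by this factor inside the lighting equation.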

(13)

Illumination environments:

discrete

a finite set of “light sources”…

...plus, one global “ambient light” factor

models other minor light sources + bounces

light incoming from every direction at every position

multiplier of the ambient term of the lighting equation

examples:

in an overcast outdoor scene: high

(dim shadows, flat-looking lighting:

every photographer’s favorite for portraits!)

in realistic outer space: zero

in any other scene: something in between (e.g. a sunny day, or a torch-lit cave)

Illumination environments:

discrete

Pros:

simple to position / reorient individual light sources

both at design phase, or dynamically (at game exec)

as they just sit in nodes of the scene-graph!

quite faithful models of certain illuminants, e.g.

explosions (positional lights)

car lights (spot-lights)

sun direction (directional light)

relatively easy to compute (hard, soft) shadows for them

(see shadow-map, later)

Cons:

each discrete light requires extra processing … for each pixel!

this is why there’s a hard limit on their number!

this is why they are often given a (physically unjustified) radius of effect

they don’t model well:

area light sources (e.g. from back-lit clouds)

subtler illumination effects

environment to be seen mirrored in (e.g. metal) objects

Often good for the main illuminants of the scene!

(14)

Illumination environments: densely sampled

A light intensity / color from each direction

Asset to store that:

“Environment map” texture

Illumination environments:

densely sampled

Also “sky-map” texture

when it’s only / predominantly the sky to be featured

doubles as textures for “sky boxes”

(15)

Illumination environments:

densely sampled

Environment map: (asset)

a texture with a texel t for each direction d

texel t stores the light coming from direction d

Q: how to find u,v position of t for a given d ?

i.e. how to parametrize (flatten) the unit sphere

Different answers are possible…

latitude/longitude format; mirror-sphere format; cube-map format (ad-hoc HW support!)

(each maps a unit vector to a texel)

Illumination environments:

densely sampled

Latitude/longitude format

(diagram): the horizontal texture axis spans the longitude φ from -180° to 180°; the vertical axis spans the latitude θ from -90° to 90°
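The latitude/longitude parametrization above (direction → texel) can be sketched as follows. The axis conventions (y as the up axis, longitude from atan2) are an assumption for illustration; engines differ on these:

```python
import math

def latlong_uv(direction):
    # direction: unit 3D vector; returns (u, v) texture coords in [0, 1]^2.
    x, y, z = direction
    phi = math.atan2(x, z)                      # longitude in (-pi, pi]
    theta = math.asin(max(-1.0, min(1.0, y)))   # latitude in [-pi/2, pi/2]
    u = phi / (2.0 * math.pi) + 0.5
    v = theta / math.pi + 0.5
    return (u, v)
```

Sampling the environment map at `(u, v)` then yields the light incoming from that direction.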

(16)

Illumination environments: densely sampled

Environment map: (asset)

a texture with a texel t for each direction d

texel t stores the light coming from direction d

Pro:

realistic, complex, detailed, hi-freq, light environments

best result for mirroring (e.g. shiny metal, glass, water) materials

can be captured from reality

Con:

expensive

(storage cost, lighting computation cost)

not ideal for a few predominant light sources

hard for artist to control

hard for the engine to dynamically change

(easy for static environments only)

Illumination environments:

with basis functions

Lighting environment:

a continuous function f : Ω → R

where Ω is the set of all unit vectors (i.e. the surface of the unit sphere), the codomain is R (or R³ for RGB colors), and f(v) = amount of light coming from direction v

Store f through basis functions:

f(v) = a_0 f_0(v) + a_1 f_1(v) + a_2 f_2(v) + a_3 f_3(v) + ⋯

where the a_i are a few scalar values to be stored (in order to model f), and the f_i are fixed spherical “basis” functions (always the same ones)

(17)

Illumination environments: with basis functions

(figure: a lighting environment f decomposed over spherical basis functions)

Illumination environments:

with basis functions

Spherical Harmonics (SH) in brief:

store the Illumination Env as a small number (1, 4, 9, 16…) of scalar weights of as many fixed spherical basis functions.

Pros:

very compact

models well continuous function: smooth environm.

allows for efficient computation of the lighting equation

Cons:

continuous functions ONLY

no hi-freq details, no hard lights

not much variation (unless very many coefficients are used)

Often good for the background lighting!
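Storing and evaluating a lighting environment as SH weights can be sketched with the first two bands (4 coefficients). The constants below are the standard normalization factors of the real spherical harmonics; the function names are our own:

```python
import math

# First 4 real spherical-harmonics basis functions (bands 0 and 1);
# the constants are sqrt(1/(4*pi)) and sqrt(3/(4*pi)).
C0 = 0.5 * math.sqrt(1.0 / math.pi)
C1 = 0.5 * math.sqrt(3.0 / math.pi)

def sh_basis(v):
    # Evaluate the 4 basis functions at unit direction v = (x, y, z).
    x, y, z = v
    return (C0, C1 * y, C1 * z, C1 * x)

def eval_lighting(coeffs, v):
    # f(v) = a0*f0(v) + a1*f1(v) + a2*f2(v) + a3*f3(v):
    # the whole environment is just the 4 stored scalar weights.
    return sum(a * b for a, b in zip(coeffs, sh_basis(v)))
```

Four floats per channel thus encode a smooth, low-frequency environment; 9 or 16 coefficients extend this to higher bands.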

(18)

Local lighting (recap diagram) – focus on the material parameters (data modelling the «material»).

Terminology:

«material» has two meanings

Material Parameters

parameters modelling the local optical behavior of physical object

part of the arguments of the (local) lighting equation

Material Asset

a common abstraction used by game engines

consisting of

a set of textures (e.g. diffuse + specular + normal map)

a set of shaders (e.g. vertex + fragment)

a set of global parameters (e.g. global glossiness)

rendering settings (e.g. back-face culling ON/OFF); the Material Asset corresponds to the state of the rendering engine

(19)

Material Asset: remember how a mesh is rendered (hi-level)

Load…

make sure all data is stored in GPU RAM

Geometry + Attributes

Connectivity

Textures

Shaders

Material Glob. Param.

Rendering Settings

…and Fire!

send the command: “do it” !

THE MESH ASSET

THE MATERIAL ASSET

(20)

Material parameters: which ones?

Q: which set of parameters is a «material»?

A: depends on the chosen lighting equation

remember: regardless of the answer, each parameter can be stored:

per Material Assets, as a global parameter, or

per Vertex of a Mesh, as attributes, or

per Texel of a texture sheet (for maximal freedom)

(material = the arguments of the lighting equation accounting for the physical substance that the surface is made of)

Choice of material parameters:

Chapter 1 (’90s)

4 terms (aka “OpenGL material”, OBJ material)

The Lighting Equation is 4 simple terms:

Ambient + Diffuse + Specular (+ Emission)

The material is… multipliers for each term, therefore:

“Ambient” factor (RGB)

“Diffuse” factor (aka “Base Color”, aka “Albedo”)

“Specular” factor (RGB)

(plus one “Specular Exponent”, aka “glossiness”)

(a scalar param used in the formula for the specular term)

“Emission” factor (RGB)

(only for stuff emitting light – otherwise 0,0,0 )

(usually, separate multipliers for R, G and B; see earlier)

(21)

Choice of material parameters:

Chapter 1 (’90s)

not too expressive, but still used (sometimes):

the MTL file format (companion of the OBJ file format) is basically this.

Choice of material parameters:

Chapter 2 (’00s)

The Lighting Equation becomes more complex

in several different ways

It feeds on more and more parameters…

Such as: Fresnel factor, Anisotropy factors, Reflectivity – for environment maps, …

Authoring materials (task of the material artist):

increasingly complex, and ad-hoc task

Difficult to port one material ...

from one engine to another, from one game to another, from one asset to another

Difficult to guess the right parameters for a given object

especially if it has to look good under widely different lighting conditions!

(22)

Material quality: improving

(example images: indie 2006, indie 2010, Triple-A 2015 – PBM!)

(23)

Choice of material parameters:

Chapter 3 (’10s)

Physically Based Materials (PBM)

an ongoing trend!

General characteristics and objectives:

increased intuitiveness:

provide the Material Artist with a higher-level material description

eases the Material Authoring task!

increased standardization:

makes materials more cross-engine / portable! (almost)

increased generality:

accommodates most established, modern lighting-equation terms and lighting-environment descriptions (e.g. env maps)

increased realism / quality:

more faithful, physically justified model of real-world materials

result: can even be captured from real-world samples!

result: good-looking results under widely different lighting env!

«Physically Based Materials»

(PBM)

Current popular choice of parameters:

Base color (rgb – or “diffuse”, same as old school)

Specularity (scalar)

“Metallicity”(scalar)

Roughness (scalar)

(images: Unreal Engine 4; each parameter swept from 0.0 to 1.0, with METAL=0 and METAL=1)

(24)

«Physically Based Lighting» (PBL)

A lighting model accepting, as input, a PBM

note: this can be achieved with different equations

Also, a lighting model taking fewer shortcuts than otherwise typical

e.g. using:

diffuse color: one texture

baked AO: another texture

instead of:

base color ×baked AO : one texture

Objectives: same as PBM (esp. under “realism”)

Warning: PBM & PBL are, basically, buzzwords

Ambient Occlusion Texture: see later

Much wider expressiveness

(25)

Local lighting in brief (recap diagram) – focus on the geometric data (e.g. normal, tangent dirs, viewer position).

Reminder: normals

Per-vertex attribute of meshes, or stored in normal maps

(26)

(Per-vertex) tangent directions

normal mapping (tangent space): requires tangent dirs

«anisotropic» BRDFs: require tangent dirs

Local lighting in brief (recap diagram).

(27)

Lighting equation: how

Computed in the fragment shader

most game engines support a subset of them as defaults

any custom one can be programmed in shaders!

Material + geometry parameters are stored:

in textures (for highest-frequency variations inside 1 obj)

in vertex attributes (smooth variations inside 1 obj)

as material asset parameters (no variation inside 1 obj)

For example: where are diffuse color, specular color, normals, and tangent dirs typically stored?

How to feed parameters to the lighting equation

Hard wired choice of the game engine

but sometimes, a complex set of choices in the hand of the dev

Specialized WYSIWYG game-tools not uncommon

E.g. in Unreal Engine 4:

(28)

Beyond local lighting

Local lighting = only 3 things count:

light emitter(s)

the infinitesimal part of surface hit by light, i.e.:

its local material

(i.e. how it bounces light) (aka: the BRDF)

its local shape

observer position

Anything else is part of Global lighting

The rest of the scene also affects the results

Global effects are considerably HARDER

(diagram annotations: these inputs are fed as global variables, env textures, values interpolated from vertices, or values sampled from textures)

Global lighting:

two classes of approaches

Strategy 1: use local lighting, but feed it a position dependent light environment

precomputed ones, i.e. baked in preprocessing (once or, every so often)

Strategy 2: a few ad-hoc rendering techniques

basically, rendering algorithms that map well to existing HW pipeline

often, multi-pass techniques

see later for a summarized list

(to be used jointly)

(29)

Global lighting:

two classes of approaches

Strategy 1: use local lighting, but feed it a position dependent light environment

Let’s see three examples:

Precomputed Object-Space Ambient Occlusion

Virtual lights

Light probes

Position dependent light environment 1/3

Partial occlusion of the ambient light term

by some scalar factor in [0,1]:

known as the “Ambient Occlusion” value (AO)

For a point on the mesh, the AO value

depends on the parts of the mesh around that point

Can be precomputed! (for a given static object)

AO is, approximately, independent of the position/orientation of the object in the scene

because it is precomputed for the object in its own space, this is termed Object Space AO

Often, stored in a texture: the AO-map (or AO-texture)

as part of the preprocessing

note: this requires a 1:1 UV mapping

all standard modelling tools can bake it!
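The AO bake described above can be sketched as Monte-Carlo hemisphere sampling. `bake_ao` and its `is_occluded` ray-query callback are our own hypothetical names; real bakers integrate over the hemisphere with cosine weighting against the actual mesh and write the result into the AO-map texels:

```python
import random

def bake_ao(point, normal, is_occluded, samples=64, seed=0):
    # is_occluded(origin, direction) -> bool: does a ray from this surface
    # point hit the mesh? (stand-in for a real ray/mesh intersection query)
    # AO = fraction of the hemisphere above the surface that is NOT blocked.
    rng = random.Random(seed)   # fixed seed: repeatable bake
    free = 0
    for _ in range(samples):
        # rejection-sample a direction in the hemisphere around the normal
        while True:
            d = tuple(rng.uniform(-1.0, 1.0) for _ in range(3))
            n2 = sum(c * c for c in d)
            if 0.0 < n2 <= 1.0 and sum(a * b for a, b in zip(d, normal)) > 0.0:
                break
        if not is_occluded(point, d):
            free += 1
    return free / samples
```

A fully exposed point bakes to 1.0 (ambient term untouched); a fully enclosed point bakes to 0.0 (ambient term cancelled).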

(30)

Position dependent light environment 2/3

Virtual lights

additional discrete light sources (e.g. point lights) that model reflected (bounced) lights

(and not actual light emitters!)

Pro: it’s easy to create / reposition them dynamically (i.e. at rendering time)

Pro: the lighting equation itself is not affected

Con: remember the total number of lights is limited

Con: not very general

it’s only adequate in a small set of cases

Not much used in games (more so in CGI movies)

Position dependent light environment 3/3

Light probes

A light probe is:

a (precomputed) lighting environment to be used around a given (xyz) position of the scene

it’s a continuous lighting environment

i.e. one incoming light from every possible incoming direction

Idea:

preprocessing: disseminate the scene with light probes

at rendering time, for an object currently in (x,y,z), use an interpolation of nearby light probes

Widespread: standard in AAA games

(31)

Position dependent light environment 3/3

Light probes

mathematically, a light probe is:

a (continuous) function

from Omega (set of unit directions) to R (scalars), or RGB (colors)

problem: how to represent (store) them?

most common answer: Spherical Harmonics (SH)!

compact in RAM,

possible/quick to interpolate between N light probes

efficient in lighting computations

or: small (lower res) env maps

strong individual lights done with discrete light sources
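Probe interpolation at render time can be sketched as a weighted blend of the nearby probes' SH coefficient vectors. The inverse-distance weighting below is an illustrative assumption (engines commonly use e.g. tetrahedral interpolation instead), and all names are ours:

```python
def interpolate_probes(position, probes):
    # probes: list of (probe_position, sh_coeffs) pairs.
    # Blend the SH coefficient vectors with inverse-squared-distance weights.
    weights, total = [], 0.0
    for ppos, _ in probes:
        d2 = sum((a - b) ** 2 for a, b in zip(position, ppos))
        w = 1.0 / (d2 + 1e-6)   # epsilon: no divide-by-zero on a probe itself
        weights.append(w)
        total += w
    n = len(probes[0][1])
    return tuple(
        sum(w * coeffs[i] for w, (_, coeffs) in zip(weights, probes)) / total
        for i in range(n))
```

Because SH coefficients interpolate linearly, the blended vector is itself a valid lighting environment for the object at that position.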

Ad-hoc rendering techniques popular in games: a summary

Shadowing:

shadow mapping (with PCF)

Screen Space Ambient Occlusion (SSAO)

Camera lens effects:

flares

limited Depth Of Field (DoF)

motion blur

High Dynamic Range (HDR)

Non-Photorealistic Rendering (NPR):

contours

lighting quantization

Texture-for-geometry:

bump mapping

parallax mapping

(32)

Multi-pass techniques in general

Many of the techniques above are “multiple passes”

Direct rendering (single-pass) – each frame :

The rendering pipeline fills a 2D buffer of pixels

The 2D buffer is sent to the screen

Multi-pass techniques:

1st pass (or a few): one or more internal 2D buffers are filled, but kept in GPU RAM and never shown to the user

final pass: the internal buffer is used as a texture; the resulting buffer is sent to the screen

Screen-space techniques

one type of multi-pass techniques

in the final pass, only one big textured quad is rendered, which covers the entire screen

Multipass rendering techniques in general

Internal buffers

2D array of pixels / texels

the output of one rendering pass (see pipeline, above)

the texture of a subsequent pass (note: read only in that pass!)

each pixel / texel can be:

rgb values (color buffer / map)

depth values (depth buffer / map)

(or other things…)

(33)

Shadow mapping

(34)

Shadow mapping in a nutshell

Two passes.

1st rendering: camera in light position

produces: a depth buffer

called: the shadow map

2nd rendering: camera in final position

for each fragment, access the shadow map once to determine if the fragment is reached by light or not
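The per-fragment test of the 2nd rendering can be sketched as follows. All names are ours; `u, v` stand for the fragment's position projected into the shadow map, and the bias constant is an illustrative value used to fight self-shadowing ("shadow acne"):

```python
def in_shadow(frag_depth_from_light, shadow_map, u, v, bias=0.005):
    # shadow_map: 2D grid of depths as seen from the light (1st pass output).
    # frag_depth_from_light: this fragment's depth in light space (2nd pass).
    stored = shadow_map[v][u]
    # The fragment is lit unless something closer to the light occludes it;
    # the bias avoids a surface shadowing itself due to depth quantization.
    return frag_depth_from_light - bias > stored
```

PCF (percentage-closer filtering) extends this by averaging the boolean test over several neighboring shadow-map texels, softening the shadow edge.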

Shadow mapping in a nutshell

(diagram): the EYE sees the final SCREEN BUFFER; the LIGHT sees the SHADOW MAP

(35)

Shadow Mapping:

issues

Rendering the shadow map:

redo it every time objects move

can be baked once and for all for static objects

(yet another reason to tag them)

Shadow map resolution:

it matters! (aliasing effects)

remedies: PCF (percentage-closer filtering), multi-resolution shadow maps

optional topics (no exam)

Screen Space AO

(36)

Screen Space AO

(comparison images: OFF vs ON)

(37)

Screen Space AO in a nutshell

First pass: standard rendering

produces: rgb image

produces: depth image

Second pass:

screen space technique

for each pixel, look at depth VS its neighbors:

neighbors in front?

difficult to reach pixel: darken ambient

neighbors behind?

pixel exposed to ambient light: keep it lit
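The neighbor-depth comparison above can be sketched as follows. The square kernel, the `strength` factor, and the function name are illustrative choices; real SSAO samples the depth buffer over a 3D hemisphere rather than a flat window:

```python
def ssao_factor(depth, center_x, center_y, radius=1, strength=0.5):
    # depth: 2D grid of per-pixel depths from the first pass.
    # Count how many neighbors are in front of (closer than) this pixel.
    d = depth[center_y][center_x]
    occluders, samples = 0, 0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            if dx == 0 and dy == 0:
                continue
            y, x = center_y + dy, center_x + dx
            if 0 <= y < len(depth) and 0 <= x < len(depth[0]):
                samples += 1
                if depth[y][x] < d:   # neighbor in front: blocks ambient light
                    occluders += 1
    if samples == 0:
        return 1.0
    # 1.0 = fully exposed to ambient light; lower values darken the ambient term
    return 1.0 - strength * (occluders / samples)
```

The returned factor multiplies the ambient term of the lighting equation in the second (screen-space) pass.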

(Limited) Depth of Field

depth in focus range: sharp

depth out of focus range: blurred

(38)

Depth of Field in a nutshell

First pass: standard rendering

rgb image

depth image

Second pass:

screen space technique:

pixel inside of focus range? keep

pixel outside of focus range? blur

(blur = average with neighboring pixels; kernel size ≈ amount of blur)
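The two-branch second pass can be sketched as follows. Grayscale pixel values stand in for RGB, a fixed 3×3 box blur stands in for the variable-size kernel, and the focus-range bounds are illustrative:

```python
def depth_of_field(rgb, depth, x, y, focus_near=2.0, focus_far=5.0):
    # rgb, depth: 2D grids from the first (standard) rendering pass.
    # Pixel inside the focus range: keep it sharp.
    d = depth[y][x]
    if focus_near <= d <= focus_far:
        return rgb[y][x]
    # Pixel outside the focus range: blur (3x3 box average over the image).
    total, count = 0.0, 0
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            ny, nx = y + dy, x + dx
            if 0 <= ny < len(rgb) and 0 <= nx < len(rgb[0]):
                total += rgb[ny][nx]
                count += 1
    return total / count
```

A real implementation scales the kernel size with the distance from the focus range, so farther pixels blur more.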

HDR - High Dynamic Range

(comparison image: limited dynamic range)

(39)

HDR - High Dynamic Range in a nutshell

First pass: normal rendering, BUT use lighting / materials with HDR

pixel values not in [0..1]

e.g. the sun emits light with RGB = [500, 500, 500]:

>1 = “overexposed”! “whiter than white”

Second pass:

screen space technique:

blur image, then clamp in [0,1]

i.e.: overexposed pixels lighten neighbors

i.e.: they will be max white (1,1,1), and their light bleeds into neighbors
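The blur-then-clamp pass above can be sketched as follows (a minimal sketch: grayscale values, a fixed 3×3 box blur as the kernel; real implementations typically blur only the over-exposed part and add it back):

```python
def hdr_to_ldr(hdr, x, y):
    # hdr: 2D grid of (possibly > 1) pixel values from the first pass.
    # Second pass: blur (3x3 box average), then clamp to [0, 1];
    # over-exposed pixels thereby bleed light into their neighbors.
    total, count = 0.0, 0
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            ny, nx = y + dy, x + dx
            if 0 <= ny < len(hdr) and 0 <= nx < len(hdr[0]):
                total += hdr[ny][nx]
                count += 1
    return min(total / count, 1.0)
```

An over-exposed pixel ends up at max white (1.0) while raising the values of the dark pixels around it: the bloom effect.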

Parallax Mapping

Normal map only

(40)

Parallax Mapping

Normal map + Parallax map

Parallax mapping:

in a nutshell

Texture-for-geometry technique

Texture used:

displacement maps

color / rgb map

(41)

Motion Blur

Non-PhotoRealistic Rendering (NPR)

Any rendering technique not aimed at realism

Instead, the objective can be:

imitating a given style (imitative rendering), such as:

cartoons (“toon shading”) ← most popular!

pen-and-ink drawings

pencil sketches

pixel art ← popular in nostalgic retro games (niche)

manga, or western comics ← not uncommon

pastels, oil paintings, crayons …

clarity/readability (illustrative rendering)

usually not for games

(42)

Toon shading / Cel Shading

(example images)

(43)

Toon shading / Cel Shading in a nutshell

Simulating “toons”

Two effects:

add contour lines

lines appearing at discontinuities of:

1. depth, 2. normals, 3. materials

quantize lighting:

e.g. 2 or 3 tones: light, medium, dark instead of continuous

simple variation of lighting equation
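The lighting-quantization variation can be sketched as follows; `toon_quantize` maps a continuous lighting intensity to a few flat tones (3 by default here, an illustrative choice), and would replace the smooth diffuse term in the lighting equation:

```python
def toon_quantize(intensity, tones=3):
    # Quantize a continuous lighting intensity in [0, 1] into `tones`
    # flat levels (e.g. dark / medium / light) instead of a smooth gradient.
    intensity = max(0.0, min(1.0, intensity))
    step = min(int(intensity * tones), tones - 1)
    return step / (tones - 1)
```

Contour lines are then added separately, at the discontinuities of depth, normals, and materials.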

NPR rendering: e.g. simulated pixel art

(img by Howard Day, 2015)

(44)

NPR rendering: simulated pixel art

(img by Dukope)
