Author: Romain Vergne (website)
Please cite my name and add a link to my web page if you use this course
Image synthesis and OpenGL: lighting
Quick links to:
- The problem
- Color
- The rendering equation
- Local lighting
- Types of lights
- Types of reflections
- Computing lighting on the GPU
- Sources
The problem

The illumination at a given point of the surface depends on:
- primary light sources
- secondary light sources
- all the objects in the scene
  - an illuminated object becomes a light source
  - indirect lighting
(Illustration: the Cornell box, 1984)
We will be mainly interested in direct lighting and local illumination in this course.

The illumination at a given point of the surface also depends on:
- The viewpoint
- The surface properties
  - reflection
  - absorption
  - diffusion
- The light properties
  - direction
  - wavelength
  - energy
This involves a lot of dimensions.
Color
Physically, visible light is an electromagnetic wave.
Physiologically, our eyes use 3 types of sensors (cones) to perceive light colors.
In computer science, colors are described with color models:
- Additive models (lighting)
- Subtractive models (painting)
- RGB(A), for Red, Green, Blue (Alpha), is the most well-known additive model
  - Easy to use in graphics applications
  - Difficult to ensure the same color on different screens (gamut)
- CIE 1931 XYZ color space
  - The tristimulus values can be conceptualized as amounts of three primary colors in a trichromatic additive color model.
  - The CIE (Commission Internationale de l'Éclairage) was the first to mathematically define a trichromatic system to represent perceived colors.
  - It is based on the standard (colorimetric) observer and thus allows colors to be represented uniquely in a 3D space.

Color matching functions: spectral sensitivity curves of 3 linear light detectors.
If $I(\lambda)$ is the spectral distribution of a color, then:
$X = \int_{400}^{800} I(\lambda)\,\bar{x}(\lambda)\,d\lambda$
$Y = \int_{400}^{800} I(\lambda)\,\bar{y}(\lambda)\,d\lambda$
$Z = \int_{400}^{800} I(\lambda)\,\bar{z}(\lambda)\,d\lambda$
- CIE Yxy color space
  - Normalizes the XYZ color space and decomposes it into luminance (Y) and chrominance (xy)
- CIE Lab color space
  - Perceptually uniform: the distance between 2 colors = the perceived distance between these colors
  - Used in most graphics applications
  - L = luminance
  - a = chrominance (red-green)
  - b = chrominance (blue-yellow)
- HSV (Hue, Saturation, Value) color space
  - In most image processing software: Photoshop, Gimp, etc.
  - Intuitive for designers
- CMY(K), for Cyan, Magenta, Yellow (and Key - Black)
  - Subtractive model
  - Mainly used for printing
- Etc.: Adobe RGB, sRGB, CIELuv, CIEUvw, YIQ (NTSC), YUV (PAL), HSL, ...

Still, the perception of colors is not well understood and the perfect color space does not exist!
(Example of color contrast: try this excellent demo.)
Coding colors
- Binary representation: 0 or 1
- 8 bits: 0 to 255 grey levels (monochromatic)
- 24 bits: 8 bits per channel (polychromatic)
  - 256 values per channel (usually RGB)
  - 256*256*256 = 16 777 216 colors
  - limited: real scenes span about 8 orders of magnitude in luminance between the sun and the stars
- HDR (High Dynamic Range) images: float or double precision per channel
  - 24 bits per channel = 256^9 colors
  - Can be created using multiple LDR (Low Dynamic Range) images
  - Need to be tone-mapped to be displayed on a screen
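As a rough illustration of the last point, here is a minimal GLSL fragment shader sketch that tone-maps an HDR texture for display. The texture name hdrTex, the Reinhard operator and the gamma value are assumptions made for this example, not part of the course material.

```glsl
#version 330 core
// Hypothetical setup: a floating-point HDR texture 'hdrTex' and interpolated 'uv' coordinates.
uniform sampler2D hdrTex;
in  vec2 uv;
out vec4 fragColor;

void main() {
  vec3 hdr = texture(hdrTex, uv).rgb;   // radiance values, possibly much larger than 1
  vec3 ldr = hdr / (hdr + vec3(1.0));   // simple Reinhard tone mapping to [0,1]
  ldr = pow(ldr, vec3(1.0 / 2.2));      // gamma correction for display
  fragColor = vec4(ldr, 1.0);
}
```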
The rendering equation [Kajiya 1986]
$L(p \to e) = L_e(p \to e) + \int_{\Omega_n} \rho(p, e, \ell)\,(n \cdot \ell)\,L(p \leftarrow \ell)\,d\ell$
- $p$ is a point on the surface
- $e$ is the view direction
- $n$ is the normal of the surface at point $p$
- $\ell$ is the direction of a light in the hemisphere $\Omega_n$
- $L(p \to e)$:
  - outgoing radiance (in $W\,m^{-2}\,sr^{-1}$)
  - how much energy is arriving at the eye / camera
- $L_e(p \to e)$:
  - emitted radiance
  - usually equal to 0 for object surfaces (they do not create energy)
- $L(p \leftarrow \ell)$:
  - incoming radiance
  - incident illumination leaving the light $\ell$ and arriving at the point $p$ of the surface
- $(n \cdot \ell)$:
  - the orientation of the surface
  - dot product between $n$ and $\ell$
- $\rho(p, e, \ell)$:
  - material properties / BRDF (Bidirectional Reflectance Distribution Function)
  - how much energy the surface reflects in the viewing direction $e$ given the incident light $\ell$
Local lighting
General equation
Empirical model for computing the outgoing radiance:
$L(p \to e) = \rho_a L_a + \sum_k \rho(p, e, \ell_k)\,(n \cdot \ell_k)\,L(p \leftarrow \ell_k)$
- $L(p \to e)$: outgoing radiance / light energy / color
- $\rho_a L_a$: ambient lighting (approximates indirect lighting)
- $\sum_k \cdots$: contribution of each light $\ell_k$
- $\rho(p, e, \ell_k)$: BRDF - how the light $\ell_k$ is reflected on top of the surface
- $(n \cdot \ell_k)$: surface orientation (with respect to light $\ell_k$)
- $L(p \leftarrow \ell_k)$: incoming radiance for light $\ell_k$
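A minimal GLSL fragment shader sketch of this empirical sum, assuming two lights given by their (normalized) directions and a constant Lambertian placeholder BRDF; all uniform and varying names are illustrative, not a fixed API.

```glsl
#version 330 core
const int NB_LIGHTS = 2;
uniform vec3 ambientColor;            // rho_a * L_a
uniform vec3 lightDir[NB_LIGHTS];     // l_k, normalized, pointing toward the lights
uniform vec3 lightColor[NB_LIGHTS];   // L(p <- l_k)
in  vec3 vNormal;                     // n, interpolated normal
in  vec3 vEyeDir;                     // e, direction toward the eye
out vec4 fragColor;

// rho(p, e, l_k): constant (Lambertian) placeholder BRDF
vec3 brdf(vec3 n, vec3 e, vec3 l) { return vec3(0.8); }

void main() {
  vec3 n = normalize(vNormal);
  vec3 e = normalize(vEyeDir);
  vec3 L = ambientColor;                              // ambient term rho_a * L_a
  for (int k = 0; k < NB_LIGHTS; ++k) {
    float cosTheta = max(dot(n, lightDir[k]), 0.0);   // (n . l_k), clamped to 0
    L += brdf(n, e, lightDir[k]) * cosTheta * lightColor[k];
  }
  fragColor = vec4(L, 1.0);
}
```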
The simplest model: assign a color to each point of the surface (albedo)
$L(p \to e) = color$
- Reflecting power of the surface
- Independent of the view direction
- Independent of the light direction
- Too simple:
  - adapted to static scenes (photos) but not to dynamic ones (image synthesis)
  - dynamic viewpoint?
  - dynamic lighting variations?
The second simplest model: consider a single dynamic light
$L(p \to e) = \rho(p, e, \ell)\,(n \cdot \ell)\,L(p \leftarrow \ell)$
- Which types of lights?
- Which types of surface reflections?
Types of lights
Infinitesimal lights

Directional light
- Distant sources (sun)
- Environment maps (video games)
- $L(p \leftarrow \ell) = L$

Point light
- position in space $p_\ell$
- near, small sources
- $L(p \leftarrow \ell) = L / r^2$, with $r = \|p - p_\ell\|$ and $\ell = \frac{p - p_\ell}{r}$

Spot light
- position in space
- near, small sources
- defined in a cone
  - direction $s_\ell$
  - exponent $e$
  - a cutoff $c$
- $L(p \leftarrow \ell) = \frac{(s_\ell \cdot \ell)^e\,L}{r^2}$ if $(s_\ell \cdot \ell) > c$ (the point lies inside the cone, $c$ being the cosine of the cutoff angle), and $0$ otherwise
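These three light types can be evaluated in a fragment shader as sketched below; the Light struct and its field names are assumptions chosen for this example, with the cutoff stored as the cosine of the cone half-angle.

```glsl
// Illustrative GLSL helper, to be included in a fragment shader.
struct Light {
  int   type;      // 0 = directional, 1 = point, 2 = spot
  vec3  color;     // L
  vec3  dir;       // directional light direction, or spot cone direction s_l
  vec3  pos;       // point / spot position p_l
  float exponent;  // spot exponent e
  float cutoff;    // spot cutoff c (cosine of the cone half-angle)
};

// Returns the incoming radiance L(p <- l); 'l' is the unit direction toward the light,
// to be used in the (n . l) term of the shading equation.
vec3 incomingRadiance(Light light, vec3 p, out vec3 l) {
  if (light.type == 0) {                       // directional: constant radiance
    l = normalize(-light.dir);
    return light.color;
  }
  float r = length(p - light.pos);
  l = (light.pos - p) / r;                     // toward the light
  vec3 L = light.color / (r * r);              // 1 / r^2 falloff
  if (light.type == 2) {                       // spot: restrict to the cone
    float s = dot(normalize(light.dir), -l);   // cosine of the angle to the cone axis
    L = (s > light.cutoff) ? L * pow(s, light.exponent) : vec3(0.0);
  }
  return L;
}
```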
Area lights
- Defined on volumes / surfaces
- Soft shadows
- Expensive!
Types of reflections
Let's consider a (white) directional light ($L = cst = (1,1,1)$). The model then becomes:
$L(p \to e) = \rho(p, e, \ell)\,(n \cdot \ell)$
The Lambertian model

Lambertian surface: $\rho(p, e, \ell) = cst$
- Light is diffused in every direction
- Independent of the point of view

One light: $L(p \to e) = \rho_d\,(n \cdot \ell)\,L_d$
- $\rho_d$ = constant diffuse material color
- $L_d$ = constant diffuse light color

Multiple lights: $L(p \to e) = \sum_k \rho_d^k\,(n \cdot \ell_k)\,L_d^k$
- sum of the contributions of each light
- lights may have different color coefficients
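A minimal GLSL sketch of the Lambertian term for a single light; diffuseColor, lightDir and lightColor are illustrative uniform names.

```glsl
uniform vec3 diffuseColor;   // rho_d
uniform vec3 lightDir;       // l, normalized, pointing toward the light
uniform vec3 lightColor;     // L_d

// Diffuse term: note that the view direction e does not appear at all.
vec3 lambert(vec3 n) {
  return diffuseColor * max(dot(n, lightDir), 0.0) * lightColor;
}
```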
The mirror and transparent models

Mirror surface: $\rho(p, e, \ell) = cst$ if $(r \cdot e) = 1$, $0$ otherwise
- Reflected light vector $r = 2\,n\,(n \cdot \ell) - \ell$
- Dependent on the point of view
- Useful for environment maps:
  - compute the reflected view vector $r = 2\,n\,(n \cdot e) - e$
  - use it to fetch a color in the map

Transparent surface: $\rho(p, e, \ell) = cst$ if $(r \cdot e) = 1$, $0$ otherwise
- Refracted light vector $r = \eta\,\ell - \left(\eta\,(n \cdot \ell) + \sqrt{1 - \eta^2\,(1 - (n \cdot \ell)^2)}\right) n$, with $\eta$ the ratio of refractive indices
- Dependent on the point of view
- Same principle for environment maps

One light: $L(p \to e) = \rho_s(p, e, \ell)\,(n \cdot \ell)\,L_s$
- $\rho_s$ = constant specular material color
- $L_s$ = constant specular light color

Multiple lights: $L(p \to e) = \sum_k \rho_s^k(p, e, \ell_k)\,(n \cdot \ell_k)\,L_s^k$
- sum of the contributions of each light
- lights may have different color coefficients
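For environment maps, the reflected and refracted view vectors can be obtained with the built-in GLSL functions reflect() and refract(), which correspond to the two formulas above. The sketch below assumes a cube map envMap, an index ratio eta and world-space varyings; these names are illustrative.

```glsl
#version 330 core
uniform samplerCube envMap;  // environment map
uniform float eta;           // ratio of refractive indices (e.g. 1.0 / 1.5 for glass)
in  vec3 vNormal;            // n (world space)
in  vec3 vEyeDir;            // e, direction from the fragment toward the eye
out vec4 fragColor;

void main() {
  vec3 n = normalize(vNormal);
  vec3 e = normalize(vEyeDir);
  vec3 r = reflect(-e, n);                 // r = 2 n (n.e) - e
  vec3 t = refract(-e, n, eta);            // refracted view vector (Snell's law)
  vec3 mirrorColor      = texture(envMap, r).rgb;
  vec3 transparentColor = texture(envMap, t).rgb;
  fragColor = vec4(mix(mirrorColor, transparentColor, 0.5), 1.0);  // arbitrary 50/50 blend
}
```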
Glossy models

Glossy surface: $\rho(p, e, \ell) = \rho_d(p, e, \ell) + \rho_s(p, e, \ell)$
- Usually expressed as a sum of diffuse and specular terms
- $\rho_d = cst$ (as before)
- $\rho_s = (r \cdot e)^e$ (for instance)
The Phong model
$\rho_s = (r \cdot e)^e$
The general formulation of the Phong model is given by a weighted sum of ambient, diffuse and specular terms:
$L(p \to e) = \rho_a L_a + \sum_k \left[ \rho_d^k L_d^k\,(n \cdot \ell_k) + \rho_s^k L_s^k\,(r \cdot e)^e \right]$
where
- $\rho_a$, $\rho_d^k$ and $\rho_s^k$ are the material colors for the ambient, diffuse and specular terms, respectively.
- $L_a$, $L_d^k$ and $L_s^k$ are the light colors for the ambient, diffuse and specular terms, respectively.
The Blinn-Phong Model
Specular term replaced by $\rho_s = (h \cdot n)^e$, with $h = \frac{\ell + e}{\|\ell + e\|}$
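A GLSL sketch of the Phong and Blinn-Phong specular terms for one light; the uniform names and the clamping with max() are assumptions made for this example.

```glsl
uniform vec3  specularColor;  // rho_s
uniform float shininess;      // exponent e
uniform vec3  lightColor;     // L_s

// n, e, l are assumed normalized; l points toward the light.
vec3 specular(vec3 n, vec3 e, vec3 l) {
  vec3  r     = reflect(-l, n);                          // r = 2 n (n.l) - l
  float phong = pow(max(dot(r, e), 0.0), shininess);     // Phong model
  vec3  h     = normalize(l + e);                        // half vector
  float blinn = pow(max(dot(h, n), 0.0), shininess);     // Blinn-Phong model
  return specularColor * blinn * lightColor;             // swap 'blinn' for 'phong' if desired
}
```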
Anisotropy effect
Specular term replaced by $\rho_s = (h \cdot n)^{n_u \cos^2\phi + n_v \sin^2\phi}$
Fresnel effect
Obtained using the Schlick approximation:
$F = R_s + (1 - R_s)(1 - e \cdot h)^5$
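The Schlick approximation translates directly to GLSL; Rs (the reflectance at normal incidence) is an assumed uniform here.

```glsl
uniform vec3 Rs;   // reflectance at normal incidence, e.g. vec3(0.04) for common dielectrics

// e = view direction, h = half vector, both normalized.
vec3 fresnelSchlick(vec3 e, vec3 h) {
  return Rs + (1.0 - Rs) * pow(1.0 - max(dot(e, h), 0.0), 5.0);
}
```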
Other effects?

(Illustrations: diffraction, non-realistic effects, varying material properties)
- Plenty of BRDFs (they are specialized for certain kinds of materials)
  - Lambertian
  - Phong
  - Blinn-Phong
  - Torrance-Sparrow
  - Cook-Torrance
  - Ward
  - Oren-Nayar
  - Ashikhmin-Shirley
  - Lafortune
  - etc.
- And also SVBRDFs (spatially varying BRDFs)
  - material parameters change over the surface (usually using textures)
- And also BSSRDFs (Bidirectional Subsurface Scattering Distribution Functions)
  - specific to translucent materials
- More generally: BxDF, for Bidirectional X Distribution Function
How to compute lighting on the GPU?
Flat shading
Compute lighting per face
- In the vertex shader
- Normals are constant for the vertices of each triangle
- Produces shading discontinuities
Gouraud Shading
Compute lighting per vertex
- In the vertex shader
- Normals are different for each vertex
- Color is computed per vertex and interpolated during rasterization
- Quality / result depends on the tessellation
Question: what is the color at point M?
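A minimal Gouraud-shading vertex shader sketch, computing a single diffuse term per vertex; matrix, attribute and uniform names are assumptions.

```glsl
#version 330 core
layout(location = 0) in vec3 position;
layout(location = 1) in vec3 normal;
uniform mat4 mvpMat;         // projection * view * model
uniform mat3 normalMat;      // normal matrix (inverse transpose of the model-view)
uniform vec3 lightDir;       // normalized, pointing toward the light
uniform vec3 diffuseColor;   // rho_d
out vec3 vColor;             // interpolated by the rasterizer

void main() {
  vec3 n = normalize(normalMat * normal);
  vColor = diffuseColor * max(dot(n, lightDir), 0.0);   // lighting evaluated per vertex
  gl_Position = mvpMat * vec4(position, 1.0);
}
```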
Phong Shading (different from the Phong model)
Compute lighting per fragment
- Normals are interpolated during rasterization (vertex to fragment)
- Normals are re-normalized in the fragment shader
- Color is computed per fragment
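A matching Phong-shading fragment shader sketch: the normal and eye direction are interpolated, re-normalized, and lighting is evaluated per fragment (here with a Blinn-Phong specular term); all names are illustrative.

```glsl
#version 330 core
in  vec3 vNormal;            // interpolated normal (no longer unit length)
in  vec3 vEyeDir;            // interpolated direction toward the eye
uniform vec3  lightDir;      // normalized, pointing toward the light
uniform vec3  diffuseColor;  // rho_d
uniform vec3  specularColor; // rho_s
uniform float shininess;     // specular exponent
out vec4 fragColor;

void main() {
  vec3 n = normalize(vNormal);                       // re-normalize after interpolation
  vec3 e = normalize(vEyeDir);
  vec3 h = normalize(lightDir + e);                  // Blinn-Phong half vector
  vec3 col = diffuseColor  * max(dot(n, lightDir), 0.0)
           + specularColor * pow(max(dot(n, h), 0.0), shininess);
  fragColor = vec4(col, 1.0);
}
```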

Sources