Author: Romain Vergne (website)
Please cite my name and add a link to my web page if you use this course

Image synthesis and OpenGL: appearance / details

Quick links to:
  1. The lighting equation
  2. Storing color and material information
  3. Storing geometry details
  4. Storing material properties
  5. Storing visibility information
  6. Storing environments
  7. Sources

Reminder: the lighting equation

Rendering equation

\[
L(\mathbf{p} \rightarrow \mathbf{e}) =
 L_e(\mathbf{p} \rightarrow \mathbf{e}) +
 \int_{\Omega_\mathbf{n}}
  \rho(\mathbf{p}, \mathbf{e}, \pmb{\ell})
  (\mathbf{n}\cdot\pmb{\ell}) \
   L(\mathbf{p} \leftarrow \pmb{\ell}) \
  d\pmb{\ell}
\]

Approximation

\[
L(\mathbf{p} \rightarrow \mathbf{e}) =
\rho_a L_a +
 \sum_{k}
  \rho(\mathbf{p}, \mathbf{e}, \pmb{\ell}_k) \
  (\mathbf{n}\cdot\pmb{\ell}_k) \
   L(\mathbf{p} \leftarrow \pmb{\ell}_k)
\]
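To make the approximation concrete, here is a minimal Python sketch of this sum for a set of point lights, assuming a simple Lambertian BRDF \( \rho = k_d / \pi \) (the function and parameter names are hypothetical):

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def shade(n, e, lights, rho_a=0.1, L_a=1.0, kd=0.8):
    """Approximated lighting equation:
    L = rho_a * L_a + sum_k rho(p, e, l_k) * (n . l_k) * L_k.
    Here rho is a constant Lambertian lobe kd / pi; a specular
    term would also use the eye direction e."""
    L = rho_a * L_a
    for l, L_k in lights:  # l: unit direction toward light k, L_k: its radiance
        L += (kd / math.pi) * max(0.0, dot(n, l)) * L_k
    return L

# One light straight above a surface facing up:
print(shade((0.0, 0.0, 1.0), (0.0, 0.0, 1.0), [((0.0, 0.0, 1.0), 3.0)]))
```

The `max(0.0, …)` clamp implements the fact that lights below the surface's hemisphere do not contribute.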


Which elements could be described by some textures?


Storing color and material information

Color mapping


   
Color (albedo) map                                       specular map

rendering on a sphere

Example from Team Fortress 2





(A) Albedo
(B) Ambient + Diffuse lighting
(C) Rim + Specular lighting
(D) Final image

Final Image \( D = A \times B + C \)
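This compositing can be sketched per channel in Python, assuming the images are RGB tuples in [0,1] (the helper name is hypothetical):

```python
def compose(albedo, lighting, rim_spec):
    """Final image D = A * B + C, computed per channel:
    albedo modulated by the lighting term, plus rim/specular highlights."""
    return tuple(a * b + c for a, b, c in zip(albedo, lighting, rim_spec))

print(compose((0.5, 0.2, 0.1), (1.0, 0.8, 0.6), (0.1, 0.1, 0.1)))
```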

More details in this paper

Storing geometry details

Normal mapping

The texture stores the normal of the surface at every point.


Normal map

Result without normal mapping

Result with normal mapping

The problem

Normals are stored in tangent space: if a normal read from the texture has coordinates \( (0,0,1) \), it means that the original surface normal is left unchanged.
We thus need a transformation matrix that converts coordinates from tangent space to world or camera space (depending on where the lighting computations are applied).
This matrix, called the TBN matrix, is built directly from the tangent (T), bitangent (B) and normal (N) vectors.



At each vertex, we thus need both the normal and the tangent to be able to convert the stored normals and perform the lighting computation.

On the CPU side: compute a tangent per vertex from the triangle positions and texture coordinates.
On the GPU side: assemble the TBN matrix and use it to transform each normal fetched from the texture:
\[ \begin{pmatrix}
T_x & B_x & N_x\\
T_y & B_y & N_y\\
T_z & B_z & N_z
\end{pmatrix} \]
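Both steps can be sketched in Python: the per-triangle tangent (CPU side) follows the classic edge/UV-delta formula, and the TBN transform (normally written in GLSL on the GPU side) rebuilds the bitangent as \( B = N \times T \). All names are hypothetical:

```python
import math

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])
def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return tuple(x / n for x in v)

def tangent_from_triangle(p0, p1, p2, uv0, uv1, uv2):
    """CPU side: tangent of a triangle from its positions and texture coords."""
    e1, e2 = sub(p1, p0), sub(p2, p0)
    du1, dv1 = uv1[0] - uv0[0], uv1[1] - uv0[1]
    du2, dv2 = uv2[0] - uv0[0], uv2[1] - uv0[1]
    r = 1.0 / (du1 * dv2 - du2 * dv1)
    return normalize(tuple(r * (dv2 * a - dv1 * b) for a, b in zip(e1, e2)))

def tbn_transform(t, n, n_tangent):
    """GPU side: bring a normal-map sample from tangent space to world space."""
    b = cross(n, t)  # bitangent completes the orthogonal frame
    return normalize(tuple(t[i]*n_tangent[0] + b[i]*n_tangent[1] + n[i]*n_tangent[2]
                           for i in range(3)))

# A quad in the xy-plane: the tangent follows the u axis (x).
t = tangent_from_triangle((0,0,0), (1,0,0), (0,1,0), (0,0), (1,0), (0,1))
print(t)                                    # -> (1.0, 0.0, 0.0)
print(tbn_transform(t, (0,0,1), (0,0,1)))   # (0,0,1) sample -> normal unchanged
```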


Bump mapping

The same kind of effect may be obtained using a heightfield as input instead of a normal map.


In this case, the normal (defined in tangent space) has to be computed on the GPU, typically from finite differences of neighboring height values:
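A central-difference version of this computation, sketched in Python (a real implementation would sample the height texture in the fragment shader; the names are hypothetical):

```python
import math

def bump_normal(height, x, y, scale=1.0):
    """Tangent-space normal from a heightfield via central differences:
    n = normalize((-dh/dx, -dh/dy, 1))."""
    h = lambda i, j: height[j % len(height)][i % len(height[0])]  # wrap at borders
    dhdx = (h(x + 1, y) - h(x - 1, y)) * 0.5 * scale
    dhdy = (h(x, y + 1) - h(x, y - 1)) * 0.5 * scale
    n = (-dhdx, -dhdy, 1.0)
    l = math.sqrt(sum(c * c for c in n))
    return tuple(c / l for c in n)

# Height increasing along x: the normal tilts toward -x.
ramp = [[0, 1, 2, 3]] * 4
print(bump_normal(ramp, 1, 1))
```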
Advantages / drawbacks of bump/normal mapping: the effect is cheap and adds fine detail without any extra geometry, but silhouettes remain flat and there is no real parallax or self-occlusion.

Parallax / relief mapping



Basic idea: offset the texture coordinates along the view direction, in proportion to the local height, so that surface features appear to shift with the viewpoint.

Without parallax mapping

With parallax mapping
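The basic (single-step) parallax offset can be sketched as follows, assuming the view direction is expressed in tangent space and `h` is the height sampled at `uv` (names and the scale value are hypothetical):

```python
def parallax_uv(uv, view_ts, h, scale=0.05):
    """Single-step parallax mapping: shift the texture coordinates along the
    tangent-space view direction in proportion to the sampled height."""
    return (uv[0] + view_ts[0] / view_ts[2] * h * scale,
            uv[1] + view_ts[1] / view_ts[2] * h * scale)

# Oblique view: the lookup shifts; head-on view: no shift.
print(parallax_uv((0.3, 0.7), (0.5, 0.0, 0.5), 0.5))
print(parallax_uv((0.3, 0.7), (0.0, 0.0, 1.0), 0.5))
```

Relief mapping refines this single offset by ray-marching the heightfield for a more accurate intersection.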

Displacement mapping

Idea: displace the vertices along their normals using a height map
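The displacement itself is one line of math, \( \mathbf{p}' = \mathbf{p} + s\,h(u,v)\,\mathbf{n} \); a sketch (names hypothetical):

```python
def displace(p, n, h, scale=1.0):
    """Displacement mapping: move vertex p along its unit normal n
    by the height h sampled from the height map."""
    return tuple(pi + scale * h * ni for pi, ni in zip(p, n))

print(displace((0, 0, 0), (0, 0, 1), 0.5))  # -> (0.0, 0.0, 0.5)
```

Unlike bump/normal mapping, this changes the actual geometry, so silhouettes and self-occlusion are correct; the cost is a finely tessellated mesh.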


Height map

Normal mapping

Displacement mapping






Storing material properties

Bidirectional Texture Functions

Recall that material properties are evaluated according to a view direction and a light direction.
The idea is simple: capture the surface under many (view, light) direction pairs and store one texture per pair.
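A BTF can thus be seen as a stack of ordinary 2D textures indexed by the (view, light) direction pair; a minimal nearest-neighbor lookup in Python (real BTFs interpolate between the sampled directions; all names and the data layout are hypothetical):

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def btf_sample(btf, uv, v, l):
    """Pick the stored texture whose (view, light) direction pair is closest
    to (v, l), then sample it at uv. btf maps direction pairs to textures,
    modeled here as functions of uv."""
    key = max(btf, key=lambda k: dot(k[0], v) + dot(k[1], l))
    return btf[key](uv)

# Two captured direction pairs, each with a constant-color "texture":
btf = {((0.0, 0.0, 1.0), (0.0, 0.0, 1.0)): lambda uv: (0.8, 0.8, 0.8),
       ((0.7, 0.0, 0.7), (0.0, 0.0, 1.0)): lambda uv: (0.3, 0.3, 0.3)}
print(btf_sample(btf, (0.5, 0.5), (0.0, 0.0, 1.0), (0.0, 0.0, 1.0)))
```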
   





Storing visibility information

Ambient occlusion


A coarse approximation of shadows

\[ A=\frac{1}{\pi} \int_{\Omega_\mathbf{n}} V(\pmb{\omega}) \, (\mathbf{n} \cdot \pmb{\omega}) \, d\pmb{\omega} \]
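This integral is usually precomputed per texel by Monte Carlo integration; with cosine-weighted samples the \( (\mathbf{n}\cdot\pmb{\omega})/\pi \) factor cancels against the sampling density, and the estimator reduces to the average visibility. A sketch, assuming \( \mathbf{n}=(0,0,1) \) and a user-supplied visibility function:

```python
import math, random

def ambient_occlusion(visible, n_samples=1000, seed=0):
    """Monte Carlo estimate of A = (1/pi) * integral of V(w) (n . w) dw,
    using cosine-weighted hemisphere samples around n = (0, 0, 1):
    A is then simply the mean of V over the samples."""
    rng = random.Random(seed)
    acc = 0.0
    for _ in range(n_samples):
        u1, u2 = rng.random(), rng.random()
        r, phi = math.sqrt(u1), 2.0 * math.pi * u2
        w = (r * math.cos(phi), r * math.sin(phi), math.sqrt(1.0 - u1))
        acc += 1.0 if visible(w) else 0.0
    return acc / n_samples

print(ambient_occlusion(lambda w: True))      # open hemisphere -> 1.0
print(ambient_occlusion(lambda w: w[0] > 0))  # half blocked -> about 0.5
```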




Ambient occlusion map

Displacement without AO

Displacement with AO



Storing environments

Cube maps





But many different representations are possible:



Cube map
Latitude-longitude map
Light probe
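These representations differ only in how a direction is mapped to texture coordinates. For the latitude-longitude map, the mapping can be sketched as follows (the orientation convention chosen here is an assumption; cube maps instead select a face from the dominant axis):

```python
import math

def latlong_uv(d):
    """Unit direction -> (u, v) texture coordinates in a latitude-longitude
    environment map: u encodes the azimuth, v the polar angle of d."""
    u = math.atan2(d[0], -d[2]) / (2.0 * math.pi) + 0.5
    v = math.acos(max(-1.0, min(1.0, d[1]))) / math.pi
    return (u, v)

print(latlong_uv((0, 1, 0)))   # straight up: top row of the map
print(latlong_uv((0, 0, -1)))  # forward: center of the map
```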


Useful for real-time rendering


Reflection vector

Refraction vector
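Both vectors have closed forms that GLSL exposes as the built-in reflect() and refract() functions; a Python transcription of those formulas:

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def reflect(i, n):
    """r = i - 2 (n . i) n, with i the incident direction toward the surface
    and n the unit normal (same convention as GLSL reflect)."""
    d = 2.0 * dot(n, i)
    return tuple(ii - d * ni for ii, ni in zip(i, n))

def refract(i, n, eta):
    """Snell's law with eta = n1 / n2 (same convention as GLSL refract);
    returns None on total internal reflection."""
    cosi = -dot(n, i)
    k = 1.0 - eta * eta * (1.0 - cosi * cosi)
    if k < 0.0:
        return None
    return tuple(eta * ii + (eta * cosi - math.sqrt(k)) * ni
                 for ii, ni in zip(i, n))

print(reflect((0, 0, -1), (0, 0, 1)))       # head-on hit bounces straight back
print(refract((0, 0, -1), (0, 0, 1), 1.0))  # eta = 1: direction unchanged
```

The resulting vector is then used directly to fetch the environment map.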

Prefiltered environment maps

Irradiance environment map

Radiance environment map

Irradiance environment map








Sources

