GPU Shader Tutorial
This tutorial is currently a work in progress. Content may be added, updated, or removed at any time.

Shader Intermediates - Color Mapping

So far we've passed direct color information of vertices to the fragment shader, which was interpolated for each fragment and then set as the pixel color.

As learnt in the fragment shader basics, color information can only be mapped to the vertices of an object, with color values of parts of the object between the vertices being interpolated.

However, as we learnt in the mapping chapter, we can use a texture to add much more detail to an object through the process of UV mapping.

We can use the process of mapping to map colors of a fragment from the color data stored in a texture/color map. This process is called color mapping.

Color mapping is also referred to as texturing, since textures hold color data and are primarily used to color the fragments of an image.

Let's look at an example of color mapping where an image is used as a texture to color the faces of a cube.

An example - A cube

Cube:
    Vertices:
        Vertex 1: { x: -1.000, y: -1.000, z: 1.000 }
        Vertex 2: { x: -1.000, y: 1.000, z: 1.000 }
        Vertex 3: { x: 1.000, y: -1.000, z: 1.000 }
        Vertex 4: { x: 1.000, y: 1.000, z: 1.000 }
        Vertex 5: { x: 1.000, y: -1.000, z: -1.000 }
        Vertex 6: { x: 1.000, y: 1.000, z: -1.000 }
        Vertex 7: { x: -1.000, y: -1.000, z: -1.000 }
        Vertex 8: { x: -1.000, y: 1.000, z: -1.000 }
    Face UV:
        Vertex 1: { u: 0.000, v: 0.000 }
        Vertex 2: { u: 0.000, v: 1.000 }
        Vertex 3: { u: 1.000, v: 0.000 }
        Vertex 4: { u: 1.000, v: 1.000 }

How it works

The following texture is used to color each face of the rendered cube:

Cube Face Texture

The vertices of each face are mapped to the corners of the texture. You can look at the cube details below the rendered image to see the UV coordinates of the vertices of a single face.

Note: For OpenGL/WebGL, the origin for UV coordinates is the lower-left corner of an image. For DirectX, the origin for UV coordinates is the upper-left corner of an image. When translating shader code between these APIs, take care with the V (vertical) values of the UV coordinates.

These UV coordinates are passed as part of the vertex data to the GPU. When these coordinates are passed to the fragment shader through the vertex shader, the GPU interpolates the UV coordinates of the fragments.

For example, a fragment in the center of the face is equidistant from all four vertices of the face. Since the vertices are mapped to the corners of the texture, the fragment receives an interpolated UV coordinate at the center of the texture.
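To make this interpolation concrete, here is a small CPU-side sketch in plain JavaScript (not shader code). It bilinearly interpolates the four corner UVs of a quad face; the real GPU interpolates per-triangle using barycentric coordinates with perspective correction, but the result for this flat face is the same:

```javascript
// Bilinearly interpolate the UV coordinates of a quad's four corner
// vertices at a point (s, t) inside the face, where s and t each
// range from 0 (one edge) to 1 (the opposite edge).
function interpolateUv(corners, s, t) {
  // Interpolate along the bottom and top edges first...
  const bottom = {
    u: corners[0].u * (1 - s) + corners[1].u * s,
    v: corners[0].v * (1 - s) + corners[1].v * s,
  };
  const top = {
    u: corners[2].u * (1 - s) + corners[3].u * s,
    v: corners[2].v * (1 - s) + corners[3].v * s,
  };
  // ...then between those two edge results.
  return {
    u: bottom.u * (1 - t) + top.u * t,
    v: bottom.v * (1 - t) + top.v * t,
  };
}

// The corners of a face, mapped to the full texture.
const corners = [
  { u: 0, v: 0 }, // bottom-left
  { u: 1, v: 0 }, // bottom-right
  { u: 0, v: 1 }, // top-left
  { u: 1, v: 1 }, // top-right
];

// A fragment at the center of the face (s = 0.5, t = 0.5) receives
// the UV coordinate at the center of the texture.
console.log(interpolateUv(corners, 0.5, 0.5)); // { u: 0.5, v: 0.5 }
```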

Using these interpolated UV coordinates, the fragment shader can read the color of the texture at that point, which becomes the final color of the fragment, since the UV coordinate represents the location of the fragment within the texture.

Vertex Shader Code:

attribute vec4 vertexPosition;
attribute vec2 vertexUv;

uniform mat4 mvpMatrix;

varying highp vec2 uv;

void main() {
  gl_Position = mvpMatrix * vertexPosition;
  uv = vertexUv;
}

Fragment Shader Code:

varying highp vec2 uv;

uniform sampler2D colorTextureSampler;

void main() {
  gl_FragColor = texture2D(colorTextureSampler, uv);
}

In the vertex shader code above, the UV coordinates of the vertex are provided through the vertexUv attribute and passed to the fragment shader through uv, allowing the GPU to interpolate the UV coordinates for each fragment.

In the fragment shader, the pixel color value of the texture at the given UV coordinates is retrieved through the 2D sampler colorTextureSampler, which is then the final color assigned to the fragment.

colorTextureSampler is defined as sampler2D because the texture is a 2D image which is being sampled for color values at specific coordinates.
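It can help to think of texture2D as a plain lookup into a pixel grid. The following is a simplified CPU-side model of that lookup, assuming a tiny RGBA pixel array and nearest-neighbor sampling; real samplers also handle wrap modes, filtering, and mipmaps, which this sketch ignores:

```javascript
// A minimal CPU-side model of texture2D with nearest-neighbor
// filtering: map a UV coordinate in [0, 1] to a texel in a
// width x height RGBA pixel array.
function sampleNearest(texture, u, v) {
  const x = Math.min(texture.width - 1, Math.floor(u * texture.width));
  const y = Math.min(texture.height - 1, Math.floor(v * texture.height));
  const i = (y * texture.width + x) * 4; // 4 channels per pixel (RGBA)
  return texture.pixels.slice(i, i + 4);
}

// A 2x2 texture. Row 0 is the bottom row, matching WebGL's
// lower-left UV origin.
const texture = {
  width: 2,
  height: 2,
  pixels: [
    255, 0, 0, 255,      0, 255, 0, 255,     // bottom row: red, green
    0, 0, 255, 255,      255, 255, 255, 255, // top row: blue, white
  ],
};

console.log(sampleNearest(texture, 0.25, 0.25)); // [255, 0, 0, 255] (red)
console.log(sampleNearest(texture, 0.75, 0.75)); // [255, 255, 255, 255] (white)
```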

Another example - A (mostly) pulsing cube

Cube:
    Vertices:
        Vertex 1: { x: -1.000, y: -1.000, z: 1.000 }
        Vertex 2: { x: -1.000, y: 1.000, z: 1.000 }
        Vertex 3: { x: 1.000, y: -1.000, z: 1.000 }
        Vertex 4: { x: 1.000, y: 1.000, z: 1.000 }
        Vertex 5: { x: 1.000, y: -1.000, z: -1.000 }
        Vertex 6: { x: 1.000, y: 1.000, z: -1.000 }
        Vertex 7: { x: -1.000, y: -1.000, z: -1.000 }
        Vertex 8: { x: -1.000, y: 1.000, z: -1.000 }
    Face UV:
        Vertex 1: { u: 0.000, v: 0.000 }
        Vertex 2: { u: 0.000, v: 1.000 }
        Vertex 3: { u: 1.000, v: 0.000 }
        Vertex 4: { u: 1.000, v: 1.000 }

How it works

Similar to how the pulsing triangle was made in the last example of the fragment shader basics, a colorShift value is subtracted from the color values retrieved from the texture (except for alpha), and the result is then set as the final color of the fragment.

One notable difference is the white edges of the cube, which do not pulse and stay constant.

Fragment Shader Code:

varying highp vec2 uv;

uniform highp float time;
uniform sampler2D colorTextureSampler;

highp float getColorShiftFactor(highp vec3 color) {
  return clamp(ceil(3.0 - (color.r + color.g + color.b)), 0.0, 1.0);
}

void main() {
  highp float colorShift = cos(time / 500.0);
  highp vec4 textureColor = texture2D(colorTextureSampler, uv);
  highp float finalColorShift = getColorShiftFactor(textureColor.rgb) * colorShift;
  gl_FragColor = vec4(clamp(textureColor.rgb - finalColorShift, 0.0, 1.0), textureColor.a);
}

In the code, a new function getColorShiftFactor accepts an RGB color and, using a simple formula, returns a color shift factor of either 0 or 1.

If 0 is returned, the final color shift value is nullified, not affecting the value of the texture color when it is set to the final fragment color value.

If 1 is returned, the final color shift value is in full effect, affecting the texture color value as it normally would when setting the final fragment color value.

The color shift factor formula works as follows:

  • Add together the red, green, and blue components of the texture color.
  • Subtract this sum from 3.0.
  • Take the ceiling of the difference, rounding it up to the next whole number (ex: 0.9 becomes 1.0).
  • Clamp the result so that it is within the range 0 - 1.

When a color is white, its RGB components are all at their highest value, 1, so their sum is 3 and the difference 3.0 - 3 is exactly 0. For any other color, the sum is less than 3, so the difference is greater than 0.

The ceiling of any difference greater than 0 is at least 1, which the clamp then limits to exactly 1.

As a result, this formula ensures that all texture colors that are white have a color shift factor of 0, and the rest have a factor of 1.
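The factor formula is easy to check outside the shader. Here is a direct JavaScript translation (GLSL's clamp is written out as a small helper, since JavaScript has no built-in equivalent):

```javascript
// Direct translation of the shader's color shift factor:
// returns 0 for pure white (r + g + b === 3) and 1 for every
// other color, since any deficit makes the ceiling at least 1.
function getColorShiftFactor(r, g, b) {
  // GLSL-style clamp: limit x to the range [lo, hi].
  const clamp = (x, lo, hi) => Math.min(hi, Math.max(lo, x));
  return clamp(Math.ceil(3.0 - (r + g + b)), 0.0, 1.0);
}

console.log(getColorShiftFactor(1, 1, 1));       // 0 - white edges never pulse
console.log(getColorShiftFactor(0.9, 0.9, 0.9)); // 1 - every other color pulses
```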

Additional Notes

If you're working with OpenGL/WebGL, there is one thing you will need to remember when working with textures.

Typically when images are read, their starting coordinates are at the top-left of the image. This means that the origin coordinate (0, 0) represents the top-left most pixel of the image.

However, in OpenGL/WebGL the starting coordinate is instead the bottom-left of the image. This means that if you try to load an image the same way in OpenGL as in DirectX, you will find that the image is flipped vertically when sampling and rendering it in your shader.

As a result, you have two possible options:

  • Flip the image vertically when loading it in OpenGL to keep things consistent in the shader.
  • Invert the Y-axis of your UV coordinates when sampling from the image texture.

Since our tutorial uses WebGL, we need to use one of these options to prevent textures from appearing inverted. We've gone with the first option to keep the shader logic more consistent. If you're using OpenGL/WebGL, you'll need to keep this in mind as well.
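The first option can be sketched as a plain JavaScript helper that reverses the rows of an RGBA pixel buffer at load time. This is only an illustration of the idea; in the browser, setting gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true) before uploading the texture achieves the same result without touching the pixels yourself:

```javascript
// Option 1 sketch: flip an RGBA pixel buffer vertically at load
// time, so that row 0 becomes the bottom row and matches WebGL's
// lower-left UV origin.
function flipPixelsVertically(pixels, width, height) {
  const rowBytes = width * 4; // 4 channels per pixel (RGBA)
  const flipped = [];
  // Copy rows in reverse order, last row first.
  for (let y = height - 1; y >= 0; y--) {
    flipped.push(...pixels.slice(y * rowBytes, (y + 1) * rowBytes));
  }
  return flipped;
}

// A 1x2 image: red on top (row 0), blue on the bottom (row 1).
const pixels = [255, 0, 0, 255, 0, 0, 255, 255];
console.log(flipPixelsVertically(pixels, 1, 2));
// [0, 0, 255, 255, 255, 0, 0, 255] - the blue row now comes first
```

The second option is a one-line change in the vertex shader instead, e.g. assigning uv as vec2(vertexUv.x, 1.0 - vertexUv.y).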

Summary

  • Through the process of UV mapping, we can define the color of each fragment of an object through the use of a texture.
  • Each vertex is assigned a UV coordinate on the texture, which is then interpolated by the GPU for each fragment.
  • The fragment shader can then read the color value of the texture at the interpolated UV coordinate for the fragment to determine and set the final color of that fragment.