
    Shaders

    Planted: 

    Tended: 

    Status: sprout


    Intended Audience: Creative coders, front-end developers with basic knowledge of three.js

    An introduction to WebGL shaders. What they are, how they work, why they are valuable and how to use them in three.js.

    Mesh

    In three.js, objects like cubes, spheres, or planes are called meshes. A mesh is made of geometry, which defines the shape, and a material, which defines how the surface looks. The shape is a collection of vertices and triangles, the fundamental primitives processed by the GPU. For example, a plane geometry is subdivided into segments, where each segment consists of two triangles. The material defines how the geometry is rendered — including color, lighting, and surface detail. This is implemented using a shader.

    /index.js

    // <planeWidth>, <planeHeight>, <widthSegmentCount>, <heightSegmentCount>
    const geometry = new THREE.PlaneGeometry(9, 9, 1, 1)
    const material = new THREE.MeshBasicMaterial({
      wireframe: true,
    })
    const mesh = new THREE.Mesh(geometry, material)

    Diagram: a triangle with its edges and vertices labeled.

    Diagram: JavaScript → Vertex Shader → Fragment Shader → Pixels.

    Pipeline

    The pipeline that transforms three.js code to pixels:

    1. three.js JavaScript code is executed on the CPU
    2. Shader code is executed on the GPU
    3. Pixels are rendered on the screen

    A shader has two parts: a vertex shader and a fragment shader. They work like this:

    1. Each vertex runs through the vertex shader to get its clip space position.
    2. Primitives (like vertices and triangles) are converted into fragments — data corresponding to a potential pixel.
    3. Each fragment runs through the fragment shader to get its color.

    Vertex shader

    three.js materials, like MeshBasicMaterial, produce a shader with preset values. However, we can define our own shaders using a special material — ShaderMaterial — for more control and creativity.
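    A minimal ShaderMaterial with a custom vertex shader looks something like this (a sketch of the typical setup; only the vertex shader is shown):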

    /index.js
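
    const material = new THREE.ShaderMaterial({
      vertexShader: `
        void main() {
          // transform the vertex from local space to clip space
          gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
        }
      `,
    })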

    The vertex shader is executed for each vertex in the geometry. Its purpose is to set gl_Position — the vertex's position in clip space.

    • three.js passes several variables to the shader, including position, modelViewMatrix, and projectionMatrix.
    • position represents the vertex coordinates in the mesh's local space.
    • The above calculation transforms the vertex from: local space → world space → camera space → clip space.
      • position contains the vertex coordinates relative to the object's own origin.
      • Multiplying by modelViewMatrix transforms local coordinates to world coordinates — coordinates relative to the scene's coordinate system.
      • Multiplying by modelViewMatrix also transforms world space to camera space — space relative to the camera's position and orientation.
      • Multiplying by projectionMatrix converts camera space to clip space — space used by the GPU to determine what is visible on screen.

    GLSL

    Shaders are written in a C-like language called GLSL (OpenGL Shading Language). A commonly used data type is the vector — similar to a JavaScript array.

    • vec2 myVec = vec2(1.0, 0.5) is like const myVec = [1.0, 0.5], vec3 myVec = vec3(1.0, 0.5, 1.0) is like const myVec = [1.0, 0.5, 1.0], ...
    • A value can be accessed using myVec.x or myVec.y, which is like myVec[0] or myVec[1]
    • A shorthand for creating a vector with the same values is vec2 myVec = vec2(0.0), which is the same as vec2 myVec = vec2(0.0, 0.0)

    UVs

    UV is a 2D coordinate system used for the surface of the geometry. It is similar to the XY coordinate system, except all values are normalized (between 0 and 1). In the vertex shader, the UV coords of the current vertex are passed in by three.js.
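    For example, the uv value for the current vertex can be read directly inside the vertex shader (a minimal sketch; the coords variable is only there to illustrate access):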

    /index.js
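
    const material = new THREE.ShaderMaterial({
      vertexShader: `
        void main() {
          // uv is a vec2 in the 0-1 range, supplied by three.js
          vec2 coords = uv;
          gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
        }
      `,
    })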

    Fragment shader

    The fragment shader is executed for each fragment. Its purpose is to set gl_FragColor — the fragment's color. It accepts a normalized RGBA value.
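    For example, a fragment shader that colors every fragment the same looks something like this (a sketch; the color values are arbitrary):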

    /index.js
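
    const material = new THREE.ShaderMaterial({
      fragmentShader: `
        void main() {
          // red, green, blue, alpha, each between 0.0 and 1.0
          gl_FragColor = vec4(0.0, 0.6, 1.0, 1.0);
        }
      `,
    })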

    three.js sends UV coords to the vertex shader, but not the fragment shader. Therefore, to access the current fragment's UV coords, they need to be sent from the vertex shader to the fragment shader, using a variable labeled vUv by convention. See shaders 102 - sending data for more details.

    /index.js

    const material = new THREE.ShaderMaterial({
      vertexShader: `
        varying vec2 vUv;

        void main() {
          ...
          vUv = uv;
        }
      `,
      fragmentShader: `
        varying vec2 vUv;

        void main() {
          ...
        }
      `,
    })
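
    Putting the two together, one way to see the UVs is to map vUv straight to color in the fragment shader (a sketch; any mapping from vUv to a color works):

    const material = new THREE.ShaderMaterial({
      vertexShader: `
        varying vec2 vUv;

        void main() {
          vUv = uv;
          gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
        }
      `,
      fragmentShader: `
        varying vec2 vUv;

        void main() {
          // vUv.x drives red, vUv.y drives green
          gl_FragColor = vec4(vUv, 0.0, 1.0);
        }
      `,
    })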

    Strengths / weaknesses

    In addition to providing low-level control, shaders are also performant. JavaScript runs on a CPU thread using sequential processing, capable of doing one task at a time, like rendering one pixel at a time. Shader code runs on the GPU using parallel processing, capable of doing many tasks at once, like rendering all pixels at once. The GPU also has hardware-accelerated angle, trigonometric, and exponential functions, which are commonly used to create shader effects.
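    For example, a fragment shader can call sin() for every fragment on screen, every frame, to create a stripe pattern. A sketch (gl_FragCoord is the built-in pixel coordinate of the current fragment; the numbers are arbitrary):

    const material = new THREE.ShaderMaterial({
      fragmentShader: `
        void main() {
          // sin() runs in parallel for every fragment
          float wave = sin(gl_FragCoord.x * 0.2) * 0.5 + 0.5;
          gl_FragColor = vec4(vec3(wave), 1.0);
        }
      `,
    })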

    Their weakness is code complexity. Shaders are written in GLSL, which is low-level and verbose. Also, the only output is color. We don't have access to tools like console.log for debugging.

