Tutorial: Cocos Shader Series - Basic Introduction

Cocos Shader Series - Basic Introduction

This is the start of a series built by one of our star Cocos developers, shenyedebaifang. We’ll be updating this as he continues his series. We are thankful and excited about the in-depth work done and hope it helps you.

Series Chapters

  1. Basic Introduction
  2. Build a Triangle
  3. Draw More Things
  4. Add Some Texture
  5. Coming soon
  6. Coming soon

This post is part of a series of tutorials introducing Cocos Shader, aiming to help students get started with shader writing and to lower the barrier to writing Cocos Shader. Before starting, I will introduce some shader basics used in most graphics engines and take some of the mystery out of shaders. The rendering API used in this explanation is WebGL. Let's first cover some rendering pipeline knowledge.

The difference between CPU and GPU

Before introducing WebGL, let's first understand the hardware it uses: the CPU and the GPU. Both are processing units, but their structures differ. Figuratively speaking, the CPU is like a single large pipeline: the tasks waiting to be processed can only pass through in turn, so the speed at which the CPU finishes a workload depends on the time needed for each individual task. Because the CPU's internal structure is extremely complex, it can handle large amounts of data and heavy logical branching, which makes it well suited to big, sequential tasks.

Processing images on a CPU, however, is a poor fit. The logic for processing each pixel is usually simple, but an image is composed of a huge number of pixels, and each pixel is its own small task. Handling them one at a time on the CPU is simply overkill. This is where the GPU comes in: it is composed of a large number of small processing units, each less powerful than a CPU core, but superior in number, so the work can be done in parallel.

Rendering pipeline

In the rendering process, the CPU and GPU must cooperate. The CPU, like a delivery truck, constantly hands the data to be processed to the GPU; the GPU, like a factory, mobilizes its many computing units like workers to process the data and assemble the final product: the image.

What is WebGL?

WebGL is a 3D drawing standard. In essence, it is JavaScript operating the OpenGL interface, so WebGL is a layer of encapsulation over OpenGL; the underlying implementation is still OpenGL. With WebGL, you draw points, lines, and triangles based on your code, and any complex scene can be built by combining points, lines, and triangles. WebGL runs on the GPU, so you need programs that can run on the GPU. Such programs are provided in pairs: each pair contains a vertex shader and a fragment shader written in GLSL (the GL Shading Language), and each linked pair is called a program (shader program).
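To make the "shader pair becomes a program" idea concrete, here is a minimal JavaScript sketch of compiling and linking such a pair. The helper names compileShader and createProgram are our own; the gl.* calls are the standard WebGL API.

```javascript
// Compile one shader stage (gl.VERTEX_SHADER or gl.FRAGMENT_SHADER).
function compileShader(gl, type, source) {
  var shader = gl.createShader(type);
  gl.shaderSource(shader, source);
  gl.compileShader(shader);
  if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
    var log = gl.getShaderInfoLog(shader);
    gl.deleteShader(shader);
    throw new Error("Shader compile failed: " + log);
  }
  return shader;
}

// Link a vertex/fragment shader pair into one program.
function createProgram(gl, vsSource, fsSource) {
  var program = gl.createProgram();
  gl.attachShader(program, compileShader(gl, gl.VERTEX_SHADER, vsSource));
  gl.attachShader(program, compileShader(gl, gl.FRAGMENT_SHADER, fsSource));
  gl.linkProgram(program);
  if (!gl.getProgramParameter(program, gl.LINK_STATUS)) {
    throw new Error("Program link failed: " + gl.getProgramInfoLog(program));
  }
  return program;
}
```

Once linked, the program is activated with gl.useProgram(program) before issuing draw calls.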

In WebGL, everything lives in 3D space, while the final output is a 2D grid of pixels on a screen or in a window. Therefore, much of the work at the bottom of a rendering engine is converting 3D coordinates into 2D pixels that fit the screen. WebGL's graphics rendering pipeline handles this 3D-to-2D conversion, which can be divided into two broad steps:

  1. Convert 3D coordinates to 2D coordinates

  2. Convert 2D coordinates into actual colored pixels

These two processes are divided into several stages, and each stage takes the output of the previous stage as input.
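As a rough illustration of step 1, the sketch below (our own JavaScript, not engine code; the clipToScreen name is made up) performs the two sub-steps WebGL applies between the vertex shader and rasterization: the perspective divide and the viewport transform.

```javascript
// Map one clip-space vertex {x, y, z, w} to 2D pixel coordinates.
function clipToScreen(vertex, width, height) {
  // Perspective divide: clip space -> normalized device coordinates (-1..1).
  var ndcX = vertex.x / vertex.w;
  var ndcY = vertex.y / vertex.w;
  // Viewport transform: NDC -> pixel coordinates, flipping Y so the
  // origin sits at the top-left as on a 2D canvas.
  return {
    x: (ndcX * 0.5 + 0.5) * width,
    y: (1.0 - (ndcY * 0.5 + 0.5)) * height
  };
}
```

For example, a vertex at the center of clip space (0, 0, 0, 1) lands in the middle of a 640x480 canvas.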

As you can see in the above figure, the graphics rendering pipeline consists of multiple stages, each handling its own part of converting vertex data into final pixels. Next, we will briefly describe the function of each stage of the pipeline.

Vertex data: Vertex data supplies the input for the vertex shader and later stages and is the main data source of the rendering pipeline. The data sent into the pipeline includes vertex attributes such as vertex coordinates, texture coordinates, vertex normals, and vertex colors. WebGL also transmits the corresponding primitive information according to the rendering instructions (common primitives include points, lines, and surfaces).

Vertex shader: The main function of the vertex shader is coordinate conversion. It takes a single vertex as input and transforms it from local coordinates to clip coordinates; that is, it converts the 3D coordinates used in the game into another 3D coordinate space.

Primitive assembly: The primitive assembly stage takes all the vertices output by the vertex shader as input and assembles them into the specified primitive shape according to the given instruction (point, line, or surface). For example, when two vertices are provided, it determines whether they are connected into a line segment and whether successive line segments are connected to each other.

Rasterization: Mapping the assembled primitives onto the pixels they cover on the screen is the process of rasterization. The rasterizer traverses the pixels and determines, in turn, whether each one falls inside the assembled shape. If it does, the pixel proceeds to the next step (shading). Rasterization also interpolates the vertex outputs across the non-vertex positions, giving each covered pixel (fragment) its own attribute values.
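As an illustration of that interpolation step, the JavaScript sketch below (helper names are ours, not engine code) blends a per-vertex attribute across a triangle using barycentric weights; this is how a pixel in the middle of a triangle gets a value even though only the three corners were specified.

```javascript
// Interpolate a per-vertex attribute (e.g. one color channel) at a point
// inside a triangle, given barycentric weights with w0 + w1 + w2 = 1.
function interpolate(a0, a1, a2, w0, w1, w2) {
  return a0 * w0 + a1 * w1 + a2 * w2;
}

// Barycentric weights of 2D point p with respect to triangle (a, b, c),
// computed from signed areas (2D cross products).
function barycentric(p, a, b, c) {
  var area = (b.x - a.x) * (c.y - a.y) - (c.x - a.x) * (b.y - a.y);
  var w1 = ((p.x - a.x) * (c.y - a.y) - (c.x - a.x) * (p.y - a.y)) / area;
  var w2 = ((b.x - a.x) * (p.y - a.y) - (p.x - a.x) * (b.y - a.y)) / area;
  return [1 - w1 - w2, w1, w2];
}
```

At the triangle's centroid all three weights are 1/3, so the interpolated value is the average of the three corner values.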

Fragment shader: The main purpose of the fragment shader is to color the fragments inside the primitive. This is also the stage where advanced effects are produced. Usually, the fragment shader has access to 3D scene data (such as lighting, shadows, light color, etc.), which can be used to calculate the final pixel color.

Testing and blending: After all the color values are determined, the fragments pass to the final stage, the alpha test and blending stage. This stage reads each fragment's depth value, uses it to determine whether the fragment is in front of or behind other objects, and decides whether the fragment should be discarded. It also checks the alpha (transparency) value and blends the fragment with what is already in the framebuffer. Therefore, even though an output color is calculated in the fragment shader, the final pixel color may be completely different when multiple triangles are rendered.
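The most common blend applied in this stage, "source over" (what gl.blendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA) configures), can be sketched in plain JavaScript; this is an illustration of the math, not engine code.

```javascript
// Blend an incoming fragment color over the color already in the
// framebuffer, weighted by the fragment's alpha:
//   out = src * a + dst * (1 - a)
function blendSourceOver(src, dst) {
  var a = src.a;
  return {
    r: src.r * a + dst.r * (1 - a),
    g: src.g * a + dst.g * (1 - a),
    b: src.b * a + dst.b * (1 - a),
    a: a + dst.a * (1 - a)
  };
}
```

For example, a half-transparent red fragment over an opaque black background yields half-intensity red, which is why the fragment shader's output alone does not determine the final pixel.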

As can be seen from the above, the graphics rendering pipeline is very complex and contains many configurations. Next, we practice step by step to deepen our understanding of this area.

GLSL language basics

GLSL is tailored for graphics computation and includes features for vector and matrix operations. A shader usually contains input and output variables, uniforms, and a main function. The entry point of every shader is its main function, in which the input variables are processed and the results are written to the output variables.

A common vertex/fragment shader is as follows:

// vertex shader
// Input attribute: type vec4, variable name a_position

attribute vec4 a_position;
attribute vec2 a_uv;
attribute vec4 a_color;

// Input and output attributes
varying vec4 v_color;
varying vec2 v_uv;

// Every shader has a main function, which is the interface to WebGL
void main() {
    // Basic assignment statement
    v_color = a_color;
    v_uv = a_uv;

    // Built-in variable gl_Position
    gl_Position = a_position;

}

// Fragment shader
// lowp precision
varying lowp vec4 v_color;
varying highp vec2 v_uv;
uniform sampler2D mainTexture;

void main(void) {
    vec4 o = texture2D(mainTexture, v_uv);
    o *= v_color;
    gl_FragColor = o;
}

Here are some common variables, qualifiers, and usages. For more, please refer to the GLSL detailed explanation (basics) listed below; it is quite thorough. Of course, students who are able to can consult the official GLSL documentation directly.

1. Variables and variable types

Variable type                Description
void                         Used for functions with no return value or an empty parameter list
bool, int, float             Scalar boolean, integer, and floating-point types
vec2, vec3, vec4             Floating-point vectors with 2-4 components
ivec2, ivec3, ivec4          Integer vectors with 2-4 components
bvec2, bvec3, bvec4          Boolean vectors with 2-4 components
mat2, mat3, mat4 (matNxM)    Floating-point matrices of size NxM
sampler2D, samplerCube       Texture handles for 2D textures and cube maps

Some common usages are as follows:

// scalar
float myFloat = 1.0;
bool myBool = bool(myFloat); // float -> bool

// vector
vec4 myVec4 = vec4(1.0); // myVec4 = {1.0, 1.0, 1.0, 1.0}
vec2 myVec2 = vec2(1.0, 0.5); // myVec2 = {1.0, 0.5}
vec2 temp = vec2(myVec2); // temp = {1.0, 0.5}
myVec4 = vec4(myVec2, temp); // two vec2s fill all four components

// The calculation of vectors and matrices is carried out component by component, therefore, the following two methods can also be used
vec3 myVec3a = myVec2.xyx; // through component access symbol myVec3a = {1.0, 0.5, 1.0}
vec3 myVec3b = vec3(myVec2[0], myVec2[1], myVec2[0]); // through array indexing (a vec2 only has indices 0 and 1), myVec3b = {1.0, 0.5, 1.0}

// matrix
mat3 myMat3 = mat3(1.0, 0.0, 0.0, // first column
                   0.0, 1.0, 0.0, // second column
                   0.0, 1.0, 1.0); // third column
mat4 myMat4 = mat4(1.0); // identity matrix: myMat4 = {1.0, 0.0, 0.0, 0.0,  0.0, 1.0, 0.0, 0.0,  0.0, 0.0, 1.0, 0.0,  0.0, 0.0, 0.0, 1.0}

// Mixing vectors, scalars, and matrices
vec3 v, u;
float f;
v = u + f;

// Equivalent to v.x = u.x + f;
// v.y = u.y + f;
// v.z = u.z + f;

mat3 m;
u = m * v; // matrix * vector

// Equivalent to u.x = m[0].x * v.x + m[1].x * v.y + m[2].x * v.z;
// u.y = m[0].y * v.x + m[1].y * v.y + m[2].y * v.z;
// u.z = m[0].z * v.x + m[1].z * v.y + m[2].z * v.z;
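These expansions can be verified outside GLSL. The JavaScript sketch below (helper names are ours) mirrors the component-wise vector + scalar addition and the matrix * vector product with plain arrays, storing the matrix as columns just as GLSL does (m[i] is column i).

```javascript
// vec + scalar, component by component (mirrors GLSL's v = u + f).
function addScalar(u, f) {
  return u.map(function (x) { return x + f; });
}

// m * v for a column-major 3x3 matrix:
//   u[i] = m[0][i]*v[0] + m[1][i]*v[1] + m[2][i]*v[2]
// i.e. each output component is row i of m dotted with v.
function mulMatVec3(m, v) {
  var u = [0, 0, 0];
  for (var i = 0; i < 3; i++) {
    u[i] = m[0][i] * v[0] + m[1][i] * v[1] + m[2][i] * v[2];
  }
  return u;
}
```

Multiplying by the identity matrix leaves a vector unchanged, and multiplying by the basis vector [1, 0, 0] picks out the matrix's first column.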

2. Qualifier

2.1. Storage qualifier

Qualifier    Meaning              Description
uniform      global variable      Assigned by the application before the shader program runs and constant for the whole draw call
attribute    per-vertex data      Vertex-by-vertex data (from buffers) passed by the application to the vertex shader
varying      interpolated data    Written by the vertex shader and interpolated across the primitive before being read by the fragment shader

Usage:

attribute vec4 a_position;
varying vec4 v_color;

2.2 Parameter qualifier

Symbol   Description
in       The default qualifier. The parameter is passed by value; the function does not modify the caller's copy (like pass-by-value in C).
out      The parameter's value is not passed into the function; the function writes it, and the caller sees the written value after the function returns.
inout    The parameter behaves like a reference: it is passed in, and if the function modifies it, the caller's value is also modified after the function returns (like pass-by-reference in C).

vec4 myFunc(inout float myFloat, // input and output parameter
            out vec4 myVec4,     // output parameter
            mat4 myMat4);        // input parameter (in by default)

2.3 Precision qualifier

Symbol    Description
highp     High precision. For integers the range is -2^16 to 2^16; for floating-point, -2^62 to 2^62.
mediump   Medium precision. For integers the range is -2^10 to 2^10; for floating-point, -2^14 to 2^14.
lowp      Low precision. For integers the range is -2^8 to 2^8; for floating-point, -2 to 2.

Usage:

// pre-declared
precision highp float;
precision mediump int;

// Specify variable declaration
varying lowp vec4 v_color;

State machine

In WebGL, most features are described by states: whether lighting is enabled, whether textures are enabled, whether blending is enabled, whether depth testing is enabled, and so on. WebGL keeps its default state unless we call the relevant interface to change it, such as gl.enable(gl.DEPTH_TEST) or gl.disable(gl.BLEND).

Context

WebGL relies on a canvas element to obtain its drawing context. The drawing context exposes the drawing API, including the various state switches mentioned above. To draw each object, you set a series of state values and then run a shader pair by calling gl.drawArrays or gl.drawElements, so that your shaders execute on the GPU. A WebGL rendering context is created as follows:

// get the canvas element and its WebGL context
var canvas = document.querySelector("canvas");
var gl = canvas.getContext("webgl");

if (!gl) {
    // WebGL is not available in this browser
}

This chapter mainly introduces some basic knowledge of WebGL, including the rendering pipeline process, rendering language, etc. The next chapter introduces the drawing process, focusing on the role of vertex shaders and fragment shaders.

Content reference

  1. WebGL Chinese document

  2. WebGL API comparison table

  3. OpenGL document

  4. Detailed explanation of GLSL

  5. Explain the graphics rendering pipeline in detail (Chinese)
