Sunday, July 10, 2022

Graphics Pipeline



Overview
A scene consists of a set of 3D objects. Transformations such as translation, scaling and rotation, driven by camera movement and by mouse and keyboard input, bring them to life.
For example, the following diagram shows a multi-colored cube with 50 degrees of pitch and 20 degrees of yaw. The cube looks horizontally elongated because the aspect ratio has not been applied.



Objects in a 3D scene are typically described using triangles, which in turn are defined by their vertices.
A vertex is a corner of a triangle where two edges meet, and thus every triangle is composed of three vertices. Modern OpenGL supports three basic types of primitives: points (GL_POINTS), triangles (GL_TRIANGLES) and line strips (GL_LINE_STRIP).

In OpenGL, the graphics pipeline is responsible for rendering 3D objects. The following gives a brief overview without overburdening you with complex details. As you become more familiar, this section can be revisited to gain a deeper understanding.

FrameBuffer
The output of the graphics pipeline ends up in the framebuffer. The framebuffer is a piece of memory within the graphics card that maps to the display. For simplicity, you can assume it is like a bitmap covering the entire viewport. Double buffering (two framebuffers) is used to avoid screen tearing while rendering the scene. For example, after the first frame of a scene is written by the pipeline into framebuffer 1, it is drawn on the screen while framebuffer 2 is filled with the second frame of the scene. The buffers are then swapped so that the screen displays the second frame while the third frame is written into framebuffer 1, and so on.
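
To make the buffer swap concrete, the following is a minimal render-loop sketch, assuming a GLFW window and OpenGL context have already been created (GLFW is only an example; the discussion above does not depend on any particular windowing library). The drawScene() helper is hypothetical and stands in for whatever issues the draw calls for the current frame.

    // Render loop sketch: draw into the back buffer, then swap it to the screen.
    while (!glfwWindowShouldClose(window))
    {
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT); // clear the back buffer
        drawScene();                                        // hypothetical: run the pipeline for this frame
        glfwSwapBuffers(window);                            // present the back buffer, recycle the old front buffer
        glfwPollEvents();                                   // process keyboard and mouse input
    }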

OpenGL  Graphics pipeline
OpenGL provides a multi-stage graphics pipeline that is partially programmable using a language called GLSL (OpenGL Shading Language), as shown below. Each of these programmable units is called a shader.


To kick off this chain, the C++ application supplies the vertex data.
For a triangle primitive, each vertex carries the following information:
  • X, Y, Z coordinate (mandatory)
  • Color (optional)
  • Normal vector (optional)
  • Texture coordinates (optional)
The vertex data is first fed to the vertex shader.
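
As a rough illustration, the C++ application might store one triangle as an interleaved array of positions and colors. The layout and values below are made up purely for illustration; OpenGL does not mandate any particular arrangement.

    // One triangle: position (x, y, z) followed by color (r, g, b) per vertex.
    float vertices[] = {
        //   x      y     z     r     g     b
        -0.5f, -0.5f, 0.0f, 1.0f, 0.0f, 0.0f,   // bottom-left,  red
         0.5f, -0.5f, 0.0f, 0.0f, 1.0f, 0.0f,   // bottom-right, green
         0.0f,  0.5f, 0.0f, 0.0f, 0.0f, 1.0f    // top,          blue
    };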

Vertex Shader
This stage is mandatory. A vertex shader is a graphics processing function used to add special effects to objects in a 3D environment by performing mathematical operations on each vertex's data. Vertex shaders don't actually change the type of the data; they simply change its values, so that a vertex emerges with a different color, different texture coordinates, or a different position in 3D space.
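
A minimal vertex shader might look like the sketch below, written as a C++ string literal so that it can later be handed to glShaderSource(). The attribute names aPos and aColor and the uniform name mvp are illustrative choices, not names required by OpenGL.

    // GLSL vertex shader: transform each vertex by a model-view-projection matrix.
    const char* vertexShaderSrc = R"(
        #version 330 core
        layout (location = 0) in vec3 aPos;    // position from the vertex buffer
        layout (location = 1) in vec3 aColor;  // per-vertex color
        uniform mat4 mvp;                      // model-view-projection matrix from the application
        out vec3 vColor;                       // passed on to later stages
        void main()
        {
            vColor = aColor;
            gl_Position = mvp * vec4(aPos, 1.0);
        }
    )";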

Tessellation Shader
This stage is optional and is not available before OpenGL 4.0 (so it cannot be used with OpenGL 3.3). After the vertex shader has processed each vertex's associated data, the tessellation stage continues processing that data, if it has been activated. Tessellation uses patches to describe an object's shape, and allows relatively simple collections of patch geometry to be tessellated into a larger number of geometric primitives, producing better-looking models. The tessellation stage can use two shaders (tessellation control and tessellation evaluation) to manipulate the patch data and generate the final shape.

Geometry Shader
This stage is optional. It allows additional processing of individual geometric primitives, including creating new ones, before rasterization. Although optional, this stage is very powerful.
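
As a small taste of this stage, the following is a minimal "pass-through" geometry shader sketch: it simply re-emits the triangle it receives. A real geometry shader could emit extra vertices or entirely new primitives at this point.

    // GLSL geometry shader: receive one triangle and emit it unchanged.
    const char* geometryShaderSrc = R"(
        #version 330 core
        layout (triangles) in;
        layout (triangle_strip, max_vertices = 3) out;
        void main()
        {
            for (int i = 0; i < 3; ++i)
            {
                gl_Position = gl_in[i].gl_Position;  // copy the incoming vertex
                EmitVertex();
            }
            EndPrimitive();                          // finish the output triangle
        }
    )";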

Rasterization
The primitive assembly stage organizes the vertices into their associated geometric primitives in preparation for clipping and rasterization. Clipping discards the portions of primitives that fall outside the viewing volume.
After clipping, the updated primitives are sent to the rasterizer for fragment generation. Think of a fragment as a candidate pixel: a pixel has a home in the framebuffer, whereas a fragment can still be rejected and never update its associated pixel location. Processing of fragments occurs in the next two stages, fragment shading and per-fragment operations.

Fragment Shader
This stage is required for all practical purposes. The fragment shader determines each fragment's final color and, potentially, its depth value. Fragment shaders are very powerful, as they often employ texture mapping to augment the colors provided by the vertex processing stages. A fragment shader may also terminate processing of a fragment if it determines the fragment shouldn't be drawn; this process is called fragment discard.
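
A minimal fragment shader might simply output the color interpolated from the vertex shader, as in the sketch below (vColor matches the illustrative name used in the vertex shader sketch above).

    // GLSL fragment shader: write the interpolated color as the fragment's final color.
    const char* fragmentShaderSrc = R"(
        #version 330 core
        in vec3 vColor;        // interpolated per-fragment color
        out vec4 FragColor;    // final color headed for the framebuffer
        void main()
        {
            FragColor = vec4(vColor, 1.0);
        }
    )";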

Pixel Operations
A fragment's visibility is determined using depth testing (also commonly known as Z-buffering) and stencil testing. If a fragment successfully makes it through all of the enabled tests, it may be written directly to the framebuffer, updating the color (and possibly depth value) of its pixel; or, if blending is enabled, the fragment's color is combined with the pixel's current color to generate a new color that is written into the framebuffer.
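
These tests and blending are configured from the C++ side, typically once during initialization. A sketch might look like the following; the blend function shown is simply the usual choice for alpha blending.

    glEnable(GL_DEPTH_TEST);                            // keep only the nearest fragment per pixel
    glEnable(GL_BLEND);                                 // combine the fragment's color with the pixel's current color
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);  // weight the mix by the fragment's alpha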

Below we will discuss some of the terminology commonly used when the vertex data of 3D objects is passed to the pipeline, especially the vertex shader, to render them on screen. Don't get bogged down in the details; they will become clear in the next post.

Vertex Buffer Object(VBO)
A vertex buffer object (VBO) is an OpenGL feature that provides a way to upload vertex data (positions, normal vectors, colors, texture coordinates) to the graphics card. A VBO can store one or more of these attributes.
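
A sketch of creating a VBO and uploading the vertex array from the earlier triangle example might look like this; GL_STATIC_DRAW hints that the data will not change every frame.

    unsigned int vbo;
    glGenBuffers(1, &vbo);                 // create the buffer object
    glBindBuffer(GL_ARRAY_BUFFER, vbo);    // make it the current vertex buffer
    glBufferData(GL_ARRAY_BUFFER, sizeof(vertices), vertices, GL_STATIC_DRAW);  // upload the data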

Element Buffer Object(EBO)
To optimize data transfer, an EBO stores the indices of vertices; OpenGL then uses these indices to determine which vertices should be drawn, so shared vertices do not have to be duplicated.
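
For example, a quad built from two triangles needs only four unique vertices when the six vertex references are stored as indices; the values below are illustrative.

    unsigned int indices[] = { 0, 1, 2,     // first triangle
                               2, 3, 0 };   // second triangle, reusing vertices 0 and 2
    unsigned int ebo;
    glGenBuffers(1, &ebo);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ebo);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(indices), indices, GL_STATIC_DRAW);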

Vertex Array Object(VAO)
A vertex array object stores a set of buffer names (VBOs and an EBO) to get vertex data from, as well as how the vertices are laid out in those vertex buffers. Each attribute index recorded in the VAO maps to a location defined in the vertex shader.
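
A sketch that ties the pieces together: attribute indices 0 and 1 correspond to the locations declared in the vertex shader sketch, and the stride of six floats matches the interleaved position-plus-color layout shown earlier.

    unsigned int vao;
    glGenVertexArrays(1, &vao);
    glBindVertexArray(vao);                        // state recorded from here on belongs to this VAO
    glBindBuffer(GL_ARRAY_BUFFER, vbo);            // vertex data source
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ebo);    // index data source
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 6 * sizeof(float), (void*)0);                    // position
    glEnableVertexAttribArray(0);
    glVertexAttribPointer(1, 3, GL_FLOAT, GL_FALSE, 6 * sizeof(float), (void*)(3 * sizeof(float)));  // color
    glEnableVertexAttribArray(1);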

The following diagram depicts the relationships between VAO and VBO.



Shaders
A shader is a user-defined program designed to run on some stage of a graphics processor. Shaders provide the code for the programmable stages of the rendering pipeline.
Shaders are written in the OpenGL Shading Language. The OpenGL rendering pipeline defines the following shader stages, each with its enumerator name:
  • Vertex Shaders (GL_VERTEX_SHADER)
  • Geometry Shaders (GL_GEOMETRY_SHADER)
  • Fragment Shaders (GL_FRAGMENT_SHADER)
A program object can combine multiple shader stages (built from shader objects) into a single, linked whole. A program pipeline object can combine programs that contain individual shader stages into a whole pipeline.
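
A sketch of building such a program object from the vertex and fragment shader sources shown earlier; error checking with glGetShaderiv/glGetProgramiv is omitted for brevity.

    unsigned int vs = glCreateShader(GL_VERTEX_SHADER);
    glShaderSource(vs, 1, &vertexShaderSrc, nullptr);
    glCompileShader(vs);

    unsigned int fs = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(fs, 1, &fragmentShaderSrc, nullptr);
    glCompileShader(fs);

    unsigned int program = glCreateProgram();
    glAttachShader(program, vs);
    glAttachShader(program, fs);
    glLinkProgram(program);

    glDeleteShader(vs);   // the shader objects are no longer needed once linked
    glDeleteShader(fs);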

Uniforms
Data such as vectors and matrices can be shared between the C++ application and shaders using a uniform.
A uniform is a global Shader variable declared with the "uniform" storage qualifier. These act as parameters that the user of a shader program can pass to that program. Their values are stored in a program object.
Uniforms are so named because they do not change from one shader invocation to the next within a particular rendering call; thus, their value is uniform among all invocations. This makes them unlike shader stage inputs and outputs, which are often different for each invocation of a shader stage.
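
On the C++ side, setting a matrix uniform might look like the following sketch. The uniform name mvp matches the vertex shader sketch above; GLM (with glm/gtc/type_ptr.hpp included) is assumed for the matrix type, and the projection, view and model matrices are assumed to have been computed elsewhere.

    glUseProgram(program);                                    // uniform values live in the program object
    int mvpLocation = glGetUniformLocation(program, "mvp");   // look up the uniform by name
    glm::mat4 mvp = projection * view * model;                // assumed to be computed elsewhere
    glUniformMatrix4fv(mvpLocation, 1, GL_FALSE, glm::value_ptr(mvp));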

The following diagram combines all of the above into a single picture.

Coordinate System
OpenGL uses a right-handed coordinate system, in which +Z points toward you.
The following diagrams show how positive rotation happens in the counter-clockwise direction.
        Pitch (X axis)                Yaw (Y axis)                Roll (Z axis)
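
As a sketch of how the 50-degree pitch and 20-degree yaw from the cube example might be applied, assuming the GLM math library (the post itself does not prescribe one); positive angles rotate counter-clockwise when looking down the axis toward the origin.

    #include <glm/glm.hpp>
    #include <glm/gtc/matrix_transform.hpp>

    glm::mat4 model(1.0f);                                                          // start with the identity matrix
    model = glm::rotate(model, glm::radians(50.0f), glm::vec3(1.0f, 0.0f, 0.0f));   // pitch about the X axis
    model = glm::rotate(model, glm::radians(20.0f), glm::vec3(0.0f, 1.0f, 0.0f));   // yaw about the Y axis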


In the next post, we will discuss the nitty-gritty of rendering a cube.
