diff --git a/Advanced/Plate.png b/Advanced/Plate.png
new file mode 100644
index 0000000..708916c
Binary files /dev/null and b/Advanced/Plate.png differ
diff --git a/Advanced/artifacts_removeaccessoffset.png b/Advanced/artifacts_removeaccessoffset.png
new file mode 100644
index 0000000..9c4b1d8
Binary files /dev/null and b/Advanced/artifacts_removeaccessoffset.png differ
diff --git a/Advanced/artifacts_sampleonce.png b/Advanced/artifacts_sampleonce.png
new file mode 100644
index 0000000..8e8e33c
Binary files /dev/null and b/Advanced/artifacts_sampleonce.png differ
diff --git a/Advanced/bandartifacts.png b/Advanced/bandartifacts.png
new file mode 100644
index 0000000..c7a891b
Binary files /dev/null and b/Advanced/bandartifacts.png differ
diff --git a/Advanced/equirectangular.png b/Advanced/equirectangular.png
new file mode 100644
index 0000000..6e03f7e
Binary files /dev/null and b/Advanced/equirectangular.png differ
diff --git a/Advanced/equirectangular_rendering.html b/Advanced/equirectangular_rendering.html
index a42f736..b9d6ec9 100644
--- a/Advanced/equirectangular_rendering.html
+++ b/Advanced/equirectangular_rendering.html
@@ -142,67 +142,73 @@
-

5.3 Equirectangular Rendering

Equirectangular rendering is a useful technique to show 360 panorama images. we often see this technique being used for visualizing interior design. or used for rendering further away background or sky in games. with the increasing popularity of the 360 cameras, this technique becomes very useful.

a typical euirectangular texture looks like the following. it should be familiar to you already, as this is how we flatten the globe. We already know how to pin point a location on this map, given a set of longitude and latitude. This is essentially how we access the equirectuangular texture map, via a spherical coordinates.

Image to Show a Flattened Globe

the idea of equirectangular rendering is also not difficult to grasp. normally when we access a texture map, we do so via the uv coordinates. when we perform the equirectangular rendering, we create an imaginary sphere enclosing our camera. imagine the framebuffer being rendered is a plane in front of our camera. through each fragment of this image plane, we shot out a ray from our camera position, passing through each fragment and intersects with the imaginary sphere. The intersection point has a coordinate in the Spherical coordinate system (theta, phi).

Image to Show Shooting Ray Intersecting Global

with the spherical coordinates, we can derive a uv texture coordinate, through which, we sample the texture map. we are free to rotate the camera, this will allow us to view all the directions of the 360 image.

    // Vertex shader
-    
-    @group(0) @binding(0)
-    var<uniform> modelView: mat4x4<f32>;
-    @group(0) @binding(1)
-    var<uniform> projection: mat4x4<f32>;
-    
-    struct VertexOutput {
-        @builtin(position) clip_position: vec4<f32>,
-        @location(0) worldPos: vec3<f32>
-    };
-    
-    @vertex
-    fn vs_main(
-        @location(0) inPos: vec3<f32>
-    ) -> VertexOutput {
-        var out: VertexOutput;
-        out.worldPos = inPos;
-        var wldLoc:vec4<f32> = modelView * vec4<f32>(inPos, 1.0);
-        out.clip_position = projection * wldLoc;
-        return out;
-    }
-    
-    // Fragment shader
-
-    const pi:f32 = 3.141592654;
-    @group(0) @binding(2)
-    var t_diffuse: texture_2d<f32>;
-    @group(0) @binding(3)
-    var s_diffuse: sampler;
-
-    @fragment
-    fn fs_main(in: VertexOutput) -> @location(0) vec4<f32> {
-        var n:vec3<f32> = normalize(in.worldPos);
-
-        var len:f32 = sqrt (n.x *n.x + n.y*n.y);
-
-        var s:f32 = acos( n.x / len);
-
-        if (n.y < 0) {
-            s = 2.0 * pi - s;
-        }
-
-        s = s / (2.0 * pi);
-
-        var tex_coord:vec2<f32> = vec2(s , ((asin(n.z) * -2.0 / pi ) + 1.0) * 0.5);
-        return textureSampleLevel(t_diffuse, s_diffuse, tex_coord, 0);
-    }

next, let's look at the shader code. The vertex shader is straightforward. We simply does the basic modelview and projection conversion and in addition, we pass the world coordinates to the fragment shader. this word coordinates will help us derive the spherical coordinates used for sampling the texture map.

in the fragment shader, we retrieve the world coordinates, and we calculate the longitude s based on the following equation: s = acos(n.x / sqrt(n.x*n.x + n.y*n.y)), taking 2*pi - s instead when n.y < 0.

and the latitude based on: -asin(n.z).

since the latitude is between -pi,pi, we will scale it to 0,1.0. notice that we are flipping the latitude or the v coordinate, because the texture coordinate system and the spherical system are flipped along v or latitude.

and following that is a standard texture sampling.

Example Image of Equirectangular Image

another very useful thing about Equirectangular rendering is to render highly reflective surfaces, such as metal or mirror or glass. these surface can reflect surrounding environment. Let's see how to render this type of object.

           
-                let nn:vec3<f32> = reflect(-viewDir, n);
-
-                var len:f32 = sqrt (nn.x *nn.x + nn.y*nn.y);
-                var s:f32 = acos( nn.x / len);
-                if (nn.y < 0) {
-                    s = 2.0 * pi - s;
-                }
-                s = s / (2.0 * pi);
-                var tex_coord:vec2<f32> = vec2(s , ((asin(nn.z) * -2.0 / pi ) + 1.0) * 0.5);
-                var diffuseColor:vec4<f32> =
-                        textureSampleLevel(t_diffuse, s_diffuse, tex_coord, 0);
-

previously, in the shadow map demo, we created a teapot with diffuse surface. This time, instead of using a fixed color for the diffuse channel, we will be sampling the diffuse color from its surrounding environment, i.e. the Equirectangular texture map. the sample direction this time, is the reflection direction calculated by both the surface normal and the viewing direction. The rest of the code is the same as the shadow map example and the previous example, we will skip the details.

here is an output of the above example:

Image to Show the Full Sample

+

5.3 Equirectangular Rendering

Equirectangular rendering is a technique widely used for displaying 360-degree panorama images. This method is particularly popular in interior design visualizations and for rendering distant backgrounds or skies in games. With the rise of 360 cameras, this technique has gained even more relevance.

Launch Playground - 5_03_equalrectangle_rendering

An equirectangular texture is essentially a flattened representation of a spherical surface, similar to how a globe is projected onto a 2D map. If you are familiar with this projection, you know that it maps the globe into a rectangular format where longitude and latitude define specific points on the map. This is the basis for accessing the equirectangular texture map through spherical coordinates.

Equirectangular Projection of the Globe
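
To make this mapping concrete, here is a small helper in WGSL, the shading language used throughout this chapter. It is a sketch of our own rather than code from the playground: it assumes a longitude in the range 0 to 2π and a latitude in the range -π/2 to π/2, and produces UV coordinates in the range 0 to 1, matching the formula used by the fragment shader later in this section.

const pi:f32 = 3.141592654; // same constant as in the fragment shader below

// Sketch: convert a (longitude, latitude) pair on the sphere into UV
// texture coordinates. Names and angle conventions are our assumptions,
// chosen to match the fragment shader shown below.
fn lat_long_to_uv(longitude: f32, latitude: f32) -> vec2<f32> {
    let u: f32 = longitude / (2.0 * pi); // the full circle spans the texture width
    let v: f32 = 0.5 - latitude / pi;    // the north pole maps to the top row (v = 0)
    return vec2<f32>(u, v);
}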

The concept behind equirectangular rendering is straightforward. Typically, texture mapping is done using UV coordinates. In equirectangular rendering, however, we envision an imaginary sphere surrounding the camera. The framebuffer being rendered acts as a plane in front of the camera. For each fragment on this image plane, a ray is cast from the camera position through the fragment, intersecting the imaginary sphere. This intersection point is defined by spherical coordinates, specifically theta and phi.

Equirectangular Texture Access
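
Before looking at the example's shaders, here is a sketch (our own illustration, not the example's code) of how such a ray direction can be reconstructed for a fragment from its normalized device coordinates, given the inverse of the view-projection matrix. The example itself obtains the direction more simply, by normalizing an interpolated world position, as we will see below.

// Sketch: recover the world-space ray direction through a fragment.
// inv_view_proj is assumed to be the inverse of projection * modelView.
fn ray_direction(ndc_xy: vec2<f32>, inv_view_proj: mat4x4<f32>) -> vec3<f32> {
    // Unproject one point on the near plane (z = 0) and one on the far plane (z = 1).
    let near: vec4<f32> = inv_view_proj * vec4<f32>(ndc_xy, 0.0, 1.0);
    let far: vec4<f32> = inv_view_proj * vec4<f32>(ndc_xy, 1.0, 1.0);
    // Perspective divide, then the ray runs from the near point toward the far point.
    return normalize(far.xyz / far.w - near.xyz / near.w);
}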

Using these spherical coordinates, we can derive UV texture coordinates to sample the equirectangular texture map. This setup allows the camera to rotate freely, providing a full 360-degree view of the panorama image.

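// Vertex shader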
@group(0) @binding(0)
+var<uniform> modelView: mat4x4<f32>;
+@group(0) @binding(1)
+var<uniform> projection: mat4x4<f32>;
+
+struct VertexOutput {
+    @builtin(position) clip_position: vec4<f32>,
+    @location(0) worldPos: vec3<f32>
+};
+
+@vertex
+fn vs_main(
+    @location(0) inPos: vec3<f32>
+) -> VertexOutput {
+    var out: VertexOutput;
+    out.worldPos = inPos;
+    var wldLoc:vec4<f32> = modelView * vec4<f32>(inPos, 1.0);
+    out.clip_position = projection * wldLoc;
+    return out;
+}
+
+// Fragment shader
+
+const pi:f32 = 3.141592654;
+@group(0) @binding(2)
+var t_diffuse: texture_2d<f32>;
+@group(0) @binding(3)
+var s_diffuse: sampler;
+
+@fragment
+fn fs_main(in: VertexOutput) -> @location(0) vec4<f32> {
+    var n:vec3<f32> = normalize(in.worldPos);
+
+    var len:f32 = sqrt (n.x *n.x + n.y*n.y);
+
+    var s:f32 = acos( n.x / len);
+    if (n.y < 0) {
+        s = 2.0 * pi - s;
+    }
+
+    s = s / (2.0 * pi);
+    var tex_coord:vec2<f32> = vec2(s , ((asin(n.z) * -2.0 / pi ) + 1.0) * 0.5);
+    return textureSampleLevel(t_diffuse, s_diffuse, tex_coord, 0);
+}
+

Let's now examine the shader code used for equirectangular rendering. The vertex shader performs standard model-view and projection transformations, and it also passes world coordinates to the fragment shader. These world coordinates are essential for deriving spherical coordinates used to sample the equirectangular texture map.

In the fragment shader, we start by retrieving the world coordinates. We then calculate the longitude and latitude. The longitude is derived from:

var n:vec3<f32> = normalize(in.worldPos);
+
+var len:f32 = sqrt (n.x *n.x + n.y*n.y);
+
+var s:f32 = acos( n.x / len);
+

Since acos only returns values between 0 and π, we use the sign of n.y to select the correct half of the circle, and then scale the longitude to the range 0 to 1:

if (n.y < 0) {
+    s = 2.0 * pi - s;
+}
+
+s = s / (2.0 * pi);
+

The latitude, given by asin(n.z), ranges from -π/2 to π/2; the expression ((asin(n.z) * -2.0 / pi) + 1.0) * 0.5 in the final texture-coordinate computation scales it to the range 0 to 1. Note that this also flips the latitude, or V coordinate, as the texture coordinate system and the spherical coordinate system have opposite orientations along V.
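
As a quick sanity check (our own arithmetic, not from the example): looking straight up gives n.z = 1, so asin(n.z) = π/2 and V = ((π/2) × (-2/π) + 1) × 0.5 = 0, the top row of the texture; looking straight down gives n.z = -1 and V = 1, the bottom row; the equator (n.z = 0) lands at V = 0.5.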

Texture sampling is performed as usual after these calculations.

Example Image of Equirectangular Image

Equirectangular rendering is also highly effective for rendering reflective surfaces like metal, mirrors, or glass. These surfaces reflect their surrounding environment, and equirectangular textures can be used to capture this reflection. Here’s how you can implement this:

let nn:vec3<f32> = reflect(-viewDir, n);
+
+var len:f32 = sqrt (nn.x *nn.x + nn.y*nn.y);
+var s:f32 = acos( nn.x / len);
+if (nn.y < 0) {
+    s = 2.0 * pi - s;
+}
+s = s / (2.0 * pi);
+var tex_coord:vec2<f32> = vec2(s , ((asin(nn.z) * -2.0 / pi ) + 1.0) * 0.5);
+var diffuseColor:vec4<f32> =
+        textureSampleLevel(t_diffuse, s_diffuse, tex_coord, 0);
+

In this code snippet, we first compute the reflection direction based on the surface normal and viewing direction. The longitude (s) and latitude are derived from this reflection vector. The latitude is scaled to fit the texture coordinate system. We then sample the diffuse color from the equirectangular texture using these coordinates.
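
Note that the snippet assumes a unit surface normal n and a view direction viewDir pointing from the surface toward the camera. A minimal sketch of how they might be obtained, assuming an interpolated normal and world position plus a cameraPos uniform (names of our own choosing, not necessarily the example's):

let n: vec3<f32> = normalize(in.normal);                     // unit surface normal
let viewDir: vec3<f32> = normalize(cameraPos - in.worldPos); // surface toward camera
// reflect() expects an incident vector pointing toward the surface,
// which is why the snippet negates viewDir before calling it.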

In contrast to our previous shadow map demo, where a fixed color was used, here we sample the diffuse color from the surrounding environment represented by the equirectangular texture. The process remains similar to the shadow map example, so we’ll skip the detailed explanation.

Here is the output of the above example:

The Result of the Rendering