<div id="article-container">
<article>
<h2 >5.3 Equirectangular Rendering</h2><p>Equirectangular rendering is a technique widely used for displaying 360-degree panorama images. This method is particularly popular in interior design visualizations and for rendering distant backgrounds or skies in games. With the rise of 360 cameras, this technique has gained even more relevance.</p><a href="https://shi-yan.github.io/WebGPUUnleashed/code/code.html#5_03_equalrectangle_rendering" target="_blank" class="comment"><svg style="margin-right:10px;vertical-align: middle;" xmlns="http://www.w3.org/2000/svg" height="32"
width="32" fill="#dadadb" viewBox="0 -960 960 960"><path d="M189-160q-60 0-102.5-43T42-307q0-9 1-18t3-18l84-336q14-54 57-87.5t98-33.5h390q55 0 98 33.5t57 87.5l84 336q2 9 3.5 18.5T919-306q0 61-43.5 103.5T771-160q-42 0-78-22t-54-60l-28-58q-5-10-15-15t-21-5H385q-11 0-21 5t-15 15l-28 58q-18 38-54 60t-78 22Zm3-80q19 0 34.5-10t23.5-27l28-57q15-31 44-48.5t63-17.5h190q34 0 63 18t45 48l28 57q8 17 23.5 27t34.5 10q28 0 48-18.5t21-46.5q0 1-2-19l-84-335q-7-27-28-44t-49-17H285q-28 0-49.5 17T208-659l-84 335q-2 6-2 18 0 28 20.5 47t49.5 19Zm348-280q17 0 28.5-11.5T580-560q0-17-11.5-28.5T540-600q-17 0-28.5 11.5T500-560q0 17 11.5 28.5T540-520Zm80-80q17 0 28.5-11.5T660-640q0-17-11.5-28.5T620-680q-17 0-28.5 11.5T580-640q0 17 11.5 28.5T620-600Zm0 160q17 0 28.5-11.5T660-480q0-17-11.5-28.5T620-520q-17 0-28.5 11.5T580-480q0 17 11.5 28.5T620-440Zm80-80q17 0 28.5-11.5T740-560q0-17-11.5-28.5T700-600q-17 0-28.5 11.5T660-560q0 17 11.5 28.5T700-520Zm-360 60q13 0 21.5-8.5T370-490v-40h40q13 0 21.5-8.5T440-560q0-13-8.5-21.5T410-590h-40v-40q0-13-8.5-21.5T340-660q-13 0-21.5 8.5T310-630v40h-40q-13 0-21.5 8.5T240-560q0 13 8.5 21.5T270-530h40v40q0 13 8.5 21.5T340-460Zm140-20Z"/></svg>Launch Playground - 5_03_equalrectangle_rendering</a><p>An equirectangular texture is essentially a flattened representation of a spherical surface, similar to how a globe is projected onto a 2D map. If you are familiar with this projection, you know that it maps the globe into a rectangular format where longitude and latitude define specific points on the map. This is the basis for accessing the equirectangular texture map through spherical coordinates.</p><p><div class="img-container"><img class="img" onclick="openImage(this)" src="thumb_Plate.png" original_src="Plate.png" alt="Equirectangular Projection of the Globe" sources='[]' /><div class="img-title">Equirectangular Projection of the Globe</div></div></p><p>The concept behind equirectangular rendering is straightforward. Typically, texture mapping is done using UV coordinates. In equirectangular rendering, however, we envision an imaginary sphere surrounding the camera. The framebuffer being rendered acts as a plane in front of the camera. For each fragment on this image plane, a ray is cast from the camera position through the fragment, intersecting the imaginary sphere. This intersection point is defined by spherical coordinates, specifically theta and phi.</p><p><div class="img-container"><img class="img" onclick="openImage(this)" src="thumb_equirectangular.png" original_src="equirectangular.png" alt="Equirectangular Texture Access" sources='[]' /><div class="img-title">Equirectangular Texture Access</div></div></p><p>Using these spherical coordinates, we can derive UV texture coordinates to sample the equirectangular texture map. This setup allows for the camera to rotate freely, providing a full 360-degree view of the panorama image.</p><div class="code-fragments"><pre><code class="language-javascript code-block" startNumber=212>@group(0) @binding(0)
var&lt;uniform&gt; modelView: mat4x4&lt;f32&gt;;
@group(0) @binding(1)
var&lt;uniform&gt; projection: mat4x4&lt;f32&gt;;

struct VertexOutput {
@builtin(position) clip_position: vec4&lt;f32&gt;,
@location(0) worldPos: vec3&lt;f32&gt;
};

@vertex
fn vs_main(
@location(0) inPos: vec3&lt;f32&gt;
) -&gt; VertexOutput {
var out: VertexOutput;
out.worldPos = inPos;
var wldLoc:vec4&lt;f32&gt; = modelView * vec4&lt;f32&gt;(inPos, 1.0);
out.clip_position = projection * wldLoc;
return out;
}

// Fragment shader

const pi:f32 = 3.141592654;
@group(0) @binding(2)
var t_diffuse: texture_2d&lt;f32&gt;;
@group(0) @binding(3)
var s_diffuse: sampler;

@fragment
fn fs_main(in: VertexOutput) -&gt; @location(0) vec4&lt;f32&gt; {
var n:vec3&lt;f32&gt; = normalize(in.worldPos);

var len:f32 = sqrt (n.x *n.x + n.y*n.y);

var s:f32 = acos( n.x / len);
if (n.y &lt; 0) {
s = 2.0 * pi - s;
}

s = s / (2.0 * pi);
var tex_coord:vec2&lt;f32&gt; = vec2(s , ((asin(n.z) * -2.0 / pi ) + 1.0) * 0.5);
return textureSampleLevel(t_diffuse, s_diffuse, tex_coord, 0);
}
</pre></code><div class="code-fragments-caption"><a target="_blank" href="https://shi-yan.github.io/WebGPUUnleashed/code/code.html?highlight=211:254#5_03_equalrectangle_rendering">5_03_equalrectangle_rendering/index.html:212-255 Shader Samples From Equirectangular Texture</a></div></div><p>Let's now examine the shader code used for equirectangular rendering. The vertex shader performs standard model-view and projection transformations, and it also passes world coordinates to the fragment shader. These world coordinates are essential for deriving spherical coordinates used to sample the equirectangular texture map.</p><p>In the fragment shader, we start by retrieving the world coordinates. We then calculate the longitude and latitude. The longitude is derived from:</p><div class="code-fragments"><pre><code class="language-javascript code-block" startNumber=243>var n:vec3&lt;f32&gt; = normalize(in.worldPos);

var len:f32 = sqrt (n.x *n.x + n.y*n.y);

var s:f32 = acos( n.x / len);
</pre></code><div class="code-fragments-caption"><a target="_blank" href="https://shi-yan.github.io/WebGPUUnleashed/code/code.html?highlight=242:246#5_03_equalrectangle_rendering">5_03_equalrectangle_rendering/index.html:243-247 Calculating Longitude</a></div></div><p>and the latitude from:</p><div class="code-fragments"><pre><code class="language-javascript code-block" startNumber=248>if (n.y &lt; 0) {
s = 2.0 * pi - s;
}

s = s / (2.0 * pi);
</pre></code><div class="code-fragments-caption"><a target="_blank" href="https://shi-yan.github.io/WebGPUUnleashed/code/code.html?highlight=247:251#5_03_equalrectangle_rendering">5_03_equalrectangle_rendering/index.html:248-252 Calculating Latitude</a></div></div><p>Since latitude ranges from -π to π, we scale it to the range of 0 to 1. Additionally, we flip the latitude or V coordinate, as the texture coordinate system and the spherical coordinate system have opposite orientations along V or latitude.</p><p>Texture sampling is performed as usual after these calculations.</p><p><div class="img-container"><img class="img" onclick="openImage(this)" src="thumb_parking_lot.jpg" original_src="parking_lot.jpg" alt="Example Image of Equirectangular Image" sources='[]' /><div class="img-title">Example Image of Equirectangular Image</div></div></p><p>Equirectangular rendering is also highly effective for rendering reflective surfaces like metal, mirrors, or glass. These surfaces reflect their surrounding environment, and equirectangular textures can be used to capture this reflection. Here’s how you can implement this:</p><div class="code-fragments"><pre><code class="language-javascript code-block" startNumber=142>let nn:vec3&lt;f32&gt; = reflect(-viewDir, n);

var len:f32 = sqrt (nn.x *nn.x + nn.y*nn.y);
var s:f32 = acos( nn.x / len);
if (nn.y &lt; 0) {
s = 2.0 * pi - s;
}
s = s / (2.0 * pi);
var tex_coord:vec2&lt;f32&gt; = vec2(s , ((asin(nn.z) * -2.0 / pi ) + 1.0) * 0.5);
var diffuseColor:vec4&lt;f32&gt; =
textureSampleLevel(t_diffuse, s_diffuse, tex_coord, 0);
</pre></code><div class="code-fragments-caption"><a target="_blank" href="https://shi-yan.github.io/WebGPUUnleashed/code/code.html?highlight=141:151#5_03_equalrectangle_rendering">5_03_equalrectangle_rendering/index.html:142-152 Shader Reflect Environment Map</a></div></div><p>In this code snippet, we first compute the reflection direction based on the surface normal and viewing direction. The longitude (s) and latitude are derived from this reflection vector. The latitude is scaled to fit the texture coordinate system. We then sample the diffuse color from the equirectangular texture using these coordinates.</p><p>In contrast to our previous shadow map demo, where a fixed color was used, here we sample the diffuse color from the surrounding environment represented by the equirectangular texture. The process remains similar to the shadow map example, so we’ll skip the detailed explanation.</p><p>Here is the output of the above example:</p><p><div class="img-container"><img class="img" onclick="openImage(this)" src="thumb_equirectangular_result.png" original_src="equirectangular_result.png" alt="The Result of the Rendering" sources='[]' /><div class="img-title">The Result of the Rendering</div></div></p>
</article>

<div class="older_newer_link_section">
Expand Down