WebGPURenderer: Add HDR Support #29573
base: dev
Conversation
Related: https://github.com/ccameron-chromium/webgpu-hdr/blob/main/EXPLAINER.md

So I guess when using:

```js
renderer = new THREE.WebGPURenderer( { antialias: true, hdr: true } );
renderer.toneMapping = THREE.ACESFilmicToneMapping; // -> produces a warning
```
I agree with the proposed solution. @CodyJasonBennett warned me about the toneMapping issue, and we could indeed probably just warn the developer about the fact that both `hdr` and `toneMapping` are set.
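A minimal sketch of what that warning could look like, assuming the option lands as a constructor parameter named `hdr` (names and exact placement are not final):

```js
// Hypothetical sketch only, not the actual implementation: warn when an HDR
// drawing buffer and a tone mapping mode are both requested.
if ( parameters.hdr === true && renderer.toneMapping !== THREE.NoToneMapping ) {

	console.warn( 'THREE.WebGPURenderer: .toneMapping is set while HDR output is enabled.' );

}
```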
Your intuition is correct. When rendering out in HDR, you send the physical/lighting units (candelas, nits). The display does the rest, including conversion into the electric signal used by the display, which is what
I've chatted with Don about this since I'm eager to see a real comparison with tonemapping in LDR and HDR (simply disabling tonemapping doesn't compare), but it's a lot of work to implement still. I'm happy to upstream tonemappers here if we can figure out an API. Just a lot of unknowns on top of historical problems and inconsistencies from display manufacturers, which complicate this. I'd be more confident with an API once we have direction here.
```js
if ( this.backend.parameters.hdr === true ) {

	return GPUTextureFormat.RGBA16Float;

}
```
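For reference, following the webgpu-hdr explainer linked above, the canvas configuration this format feeds into looks roughly like the sketch below (behavior may differ across browser versions, and the exact wiring inside the backend is not shown here):

```js
// Rough sketch of the underlying WebGPU canvas setup for HDR output.
const context = canvas.getContext( 'webgpu' );

context.configure( {
	device: device,
	format: 'rgba16float', // float16 drawing buffer instead of the default 8-bit format
	toneMapping: { mode: 'extended' }, // allow values above 1.0 to reach the display's HDR headroom
	colorSpace: 'srgb' // interpreted as extended sRGB with a float format
} );
```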
Could we expose RGBA16 drawing buffers independently of the HDR parameter? Perhaps something like `drawingBufferType` or `canvasType`, with possible values of `HalfFloatType` or `UnsignedByteType`?

There will certainly be interest in using high-precision output without HDR, and I believe it's also good to make HDR usage explicit about the drawing buffer configuration.
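A hypothetical usage sketch of that option; neither name exists in three.js today, they are only suggestions from this comment:

```js
// Hypothetical: request a half-float drawing buffer without enabling HDR output.
const hiPrecisionRenderer = new THREE.WebGPURenderer( { drawingBufferType: THREE.HalfFloatType } );

// Hypothetical: the current default, an 8-bit drawing buffer.
const defaultRenderer = new THREE.WebGPURenderer( { drawingBufferType: THREE.UnsignedByteType } );
```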
Some caution: our output to the drawing buffer must still be a formed image; we cannot send the stimulus (i.e. scene lighting / luminance) directly to the drawing buffer. The browser/device/OS does not provide image formation any more in “HDR” than in “SDR”. A lot of recent HDR demos on social media have made this mistake by omitting tone mapping.

We do still want to consider output units (likely nits), as they relate to the viewing device/display. WebGPU HDR, as currently shipped in Chrome, tells us nothing about the display, so we are guessing heavily. The amount of HDR headroom available may vary: turn your laptop brightness up and headroom reduces, and different color management may be required. This is a major gap in the current WebGPU implementation in Chrome, and something we may need to keep tabs on for changes. And as @CodyJasonBennett says well, “a lot of unknowns on top of historical problems and inconsistencies” exist outside of Chrome's control.

I have a general idea of how to adapt AgX or ACES Filmic for use with “HDR” output, and I'll look into that a bit. Desaturation is fortunately orthogonal: representation as “SDR” vs. “HDR” does not imply any large difference in saturation. If the comparison does diverge, then something is likely wrong. [1]
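For context, about the only display capability signal currently available to page content is a coarse CSS media query, which says whether the display can show HDR at all but nothing about how much headroom is available right now (a small sketch):

```js
// True if the display can show HDR content; does not report available headroom,
// which changes with e.g. laptop brightness or OS settings.
const displaySupportsHDR = window.matchMedia( '(dynamic-range: high)' ).matches;
```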
Footnotes
1. A quick test here would be to render a solid MeshBasicMaterial (example: …)
Possible API:

```js
import { ExtendedSRGBColorSpace } from 'three/addons/math/ColorSpaces.js';

const renderer = new WebGPURenderer( { outputType: THREE.HalfFloatType } );
renderer.outputColorSpace = ExtendedSRGBColorSpace;
```

My main concern: Extended sRGB (the only option WebGPU currently allows) is not really an appropriate output color space for lit rendering; we need to render an “HDR” image in reference to a well-defined display/medium. I'll file an issue on the WebGPU spec repo about this (EDIT: gpuweb/gpuweb#4919); perhaps there are plans to enable other output color spaces. I would prefer to have this:

```js
import { Rec2100HLGColorSpace } from 'three/addons/math/ColorSpaces.js';

const renderer = new WebGPURenderer( { outputType: THREE.HalfFloatType } );
renderer.outputColorSpace = Rec2100HLGColorSpace;
```

Adaptations to tone mapping are also needed, though they depend on information we do not have, and which may not be well-defined at all when using Extended sRGB. I know others are excited about the “HDR” features, though; would it be possible to start with a PR that exposes the higher-precision drawing buffer on its own, as suggested above?
Description
Enjoy three.js in HDR 🚀 (requires WebGPU support and an HDR-capable monitor).
Check out the difference:
HDR:
https://raw.githack.com/renaudrohlinger/three.js/utsubo/feat/hdr/examples/webgpu_tsl_vfx_linkedparticles.html
SDR:
https://threejs.org/examples/webgpu_tsl_vfx_linkedparticles.html
This contribution is funded by Utsubo