
[Merged by Bors] - Fix color banding by dithering image before quantization #5264

Merged · 7 commits

Conversation

@aevyrie (Member) commented Jul 9, 2022

Objective

  • Closes #5262
  • Fix color banding caused by quantization.

Solution

  • Adds dithering to the tonemapping node from #3425.
  • This is inspired by Godot's default "debanding" shader: https://gist.github.com/belzecue/
  • Unlike Godot:
    • Debanding happens after tonemapping. My understanding is that this is preferred, because we are running the debanding at the last moment before quantization (`[f32, f32, f32, f32]` -> `f32`). This ensures we aren't biasing the dithering strength by applying it in a different (linear) color space.
    • This code instead uses and references the original source: Valve at GDC 2015.

![Screenshot from 2022-11-10 13-44-46](https://user-images.githubusercontent.com/2632925/201218880-70f4cdab-a1ed-44de-a88c-8759e77197f1.png)
![Screenshot from 2022-11-10 13-41-11](https://user-images.githubusercontent.com/2632925/201218883-72393352-b162-41da-88bb-6e54a1e26853.png)

Additional Notes

Real time rendering to standard dynamic range outputs is limited to 8 bits of depth per color channel. Internally we keep everything in full 32-bit precision (`vec4<f32>`) inside passes and 16-bit between passes until the image is ready to be displayed, at which point the GPU implicitly converts our `vec4<f32>` into a single 32-bit value per pixel, with each channel (rgba) getting 8 of those 32 bits.
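To make that implicit conversion concrete, here is a minimal Rust sketch (an illustration, not Bevy's actual code) of what packing four float channels into one 32-bit RGBA8 pixel amounts to:

```rust
/// Sketch of the quantization described above: four f32 channels rounded to
/// 256 levels each and packed into a single 32-bit value, 8 bits per channel.
fn pack_rgba8(r: f32, g: f32, b: f32, a: f32) -> u32 {
    // Round each [0.0, 1.0] channel to the nearest of 256 representable levels.
    let q = |c: f32| (c.clamp(0.0, 1.0) * 255.0).round() as u32;
    (q(a) << 24) | (q(b) << 16) | (q(g) << 8) | q(r)
}
```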

The Problem

8 bits of color depth is simply not enough precision to make each step invisible - we only have 256 values per channel! Human vision can perceive steps in luma to about 14 bits of precision. When drawing a very slight gradient, the transitions between steps become visible because, with a gradient, neighboring pixels will all jump to the next "step" of precision at the same time.

The Solution

One solution is to simply output in HDR - more bits of color data means the transitions between bands become smaller. However, not everyone has hardware that supports 10+ bit color depth. Additionally, 10-bit color doesn't fully solve the issue: shallow gradients will still produce coherent bands, though the steps will be harder to perceive.

The solution in this PR adds noise to the signal before it is "quantized", or resampled from 32 to 8 bits. Done naively, it's easy to add unneeded noise to the image. To ensure dithering is correct and absolutely minimal, noise is added *within* one step of the output color depth. When converting from the 32-bit to the 8-bit signal, the value is rounded to the nearest 8-bit value (0-255). Banding occurs around the transition from one value to the next - say, from 50 to 51. Dithering will never add more than +/-0.5 steps of noise, so a pixel near this transition might round to 50 instead of 51, but will never round more than one step. This means that the output image won't have excess variance (a small sketch of this invariant follows the diagram below):

  • In a gradient from 49 to 51, there will be a step between each band at 49, 50, and 51.
  • Done correctly, the modified image of this gradient will never have adjacent pixels more than one step (out of 0-255) from each other.
  • I.e. when scanning across the gradient you should expect to see:
```
                  |-band-| |-band-| |-band-|
Baseline:         49 49 49 50 50 50 51 51 51
Dithered:         49 50 49 50 50 51 50 51 51
Dithered (wrong): 49 50 51 49 50 51 49 51 50
```
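The "at most one step" invariant is easy to check numerically. Below is a minimal host-side Rust sketch (an illustration, not the shader): quantizing with at most +/-0.5 steps of noise never lands more than one 8-bit step away from the noise-free result.

```rust
// Round a [0.0, 1.0] value to the nearest 8-bit level, with no dithering.
fn quantize(v: f32) -> u8 {
    (v * 255.0).round().clamp(0.0, 255.0) as u8
}

// Same, but with dither noise added before rounding.
// `noise` is in [-0.5, 0.5), measured in 8-bit steps.
fn quantize_dithered(v: f32, noise: f32) -> u8 {
    (v * 255.0 + noise).round().clamp(0.0, 255.0) as u8
}

fn main() {
    // A shallow gradient spanning 49..51 (out of 255).
    for i in 0..=16 {
        let v = (49.0 + 2.0 * (i as f32) / 16.0) / 255.0;
        // Any fixed pseudo-noise pattern in [-0.5, 0.5) works for this demo.
        let noise = ((i * 7) % 16) as f32 / 16.0 - 0.5;
        let (plain, dithered) = (quantize(v), quantize_dithered(v, noise));
        // Dithering never moves the result more than one step from baseline.
        assert!((plain as i16 - dithered as i16).abs() <= 1);
        println!("baseline {plain:>2}  dithered {dithered:>2}");
    }
}
```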

![Screenshot from 2022-11-10 14-12-36](https://user-images.githubusercontent.com/2632925/201219075-ab3f46be-d4e9-4869-b66b-a92e1706f49e.png)
![Screenshot from 2022-11-10 14-11-48](https://user-images.githubusercontent.com/2632925/201219079-ec5d2add-817d-487a-8fc1-84569c9cda73.png)

You can see from above how correct dithering "fuzzes" the transition between bands to reduce distinct steps in color, without adding excess noise.

HDR

The previous section (and this PR) assumes the final output is to an 8-bit texture; however, this is not always the case. When Bevy adds HDR support, the dithering code will need to take the per-channel depth into account instead of assuming it to be 0-255. Edit: I talked with Rob about this and it seems like the current solution is okay. We may need to revisit once we have actual HDR final image output.
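As a hypothetical illustration of that per-channel-depth awareness (not code from this PR): the only 8-bit-specific constant in the dither is the step size, so generalizing would mean scaling the noise by one step of an n-bit channel instead of hard-coding 1/255.

```rust
/// Hypothetical helper, not part of this PR: one quantization step for an
/// n-bit color channel. 8-bit -> 1/255, 10-bit -> 1/1023.
fn channel_step(bits: u32) -> f32 {
    1.0 / ((1u32 << bits) - 1) as f32
}

/// Scale a [-0.5, 0.5) dither value to the target output depth.
fn scale_dither(dither: f32, bits: u32) -> f32 {
    dither * channel_step(bits)
}
```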


Changelog

Added

  • All pipelines now support deband dithering. This is enabled by default in 3D, and can be toggled in the `Tonemapping` component in camera bundles. Banding is a graphical artifact created when the rendered image is crunched from high precision (`f32` per color channel) down to the final output (`u8` per channel in SDR). This results in subtle gradients becoming blocky due to the reduced color precision. Deband dithering applies a small amount of noise to the signal before it is "crunched", which breaks up the hard edges of blocks (bands) of color. Note that this does not add excess noise to the image, as the amount of noise is less than a single step of a color channel - just enough to break up the transition between color blocks in a gradient.

@aevyrie aevyrie added the A-Rendering Drawing game state to the screen label Jul 9, 2022
@alice-i-cecile (Member) commented Jul 9, 2022

Please note that this fixes #5262 while filling out the PR description :)

@alice-i-cecile alice-i-cecile added the C-Bug An unexpected or incorrect behavior label Jul 9, 2022
@alice-i-cecile (Member) commented:

Taking a look at the before / after screen shots, I'm strongly in favor of this being enabled by default. The banding was very noticeable before, and the new dithering pattern is much cleaner.

First impressions matter, and the behavior out-of-the-box should work well for most users, especially beginners.

@aevyrie changed the title from "FIx color banding with dithering in PBR shader" to "Fix color banding with dithering in PBR shader" Jul 9, 2022
@aevyrie (Member, Author) commented Jul 10, 2022

FYI I'm planning on making an example to better stress test noise and banding performance before removing draft status.

@aevyrie (Member, Author) commented Jul 11, 2022

Added a stress test for banding (note, be sure to view these at 100% scale):

Control

Screenshot from 2022-07-10 22-38-39

Dithered

Screenshot from 2022-07-10 22-40-05

@aevyrie force-pushed the color-banding-fix branch from 8d3439c to daacd0c on July 11, 2022 17:23
@aevyrie (Member, Author) commented Jul 11, 2022

@CptPotato I rebased on #3425, and added it to the tonemapping pass. This means dithering is now a fullscreen post-process and should work with images/ui/2d/etc that suffer from quantization banding. Could probably improve by moving to its own node that outputs to a Bgra8Unorm (instead of SRGB, I think this is correct?) to remove the unnecessary conversion to and from SRGB space.

@superdump could you provide some input on whether you see this as a feasible path forward?

@aevyrie changed the title from "Fix color banding with dithering in PBR shader" to "Fix color banding by dithering image before quantization" Jul 11, 2022
@tim-blackbird (Contributor) commented Jul 11, 2022

As a note to others viewing the images above: if you still see banding in the dithered image, click on it to view it in another tab. That will display the image at 100% scale, and the banding should not be visible.

@CptPotato (Contributor) commented Jul 12, 2022

> @CptPotato I rebased on #3425, and added it to the tonemapping pass. This means dithering is now a fullscreen post-process and should work with images/ui/2d/etc that suffer from quantization banding. Could probably improve by moving to its own node that outputs to a Bgra8Unorm (instead of SRGB, I think this is correct?) to remove the unnecessary conversion to and from SRGB space.

The result looks good!
If tonemapping is disabled, will dithering be disabled too?


Regarding dithering as a separate node: I think that might be a bit overkill for the actual "work" that would be done in the dither shader. As far as I can tell a separate node will require an additional intermediate render target `RT_2` to read from:

```
main_pass -> [RT_1] -> tonemap -> [RT_2] -> dither -> [RT_SWAP_CHAIN]
```

On top of that, the render target has to maintain the high bit-depth even after compressing to SDR in the tonemapping stage, or dithering will not do anything.

I don't have much wgpu knowledge, but in other apis such as dx11 the conversion to and from SRGB inside the shader is rarely required. Instead, texture views (and render target views) specify this through their format (like you've mentioned with Bgra8Unorm vs Bgra8UnormSrgb).
Edit: looking at it again, I think there's no way around the conversion to srgb if dithering is done in the tonemapping shader (which is not a big issue imo). However you could probably eliminate the conversion back to linear by changing the format of the render target view ("attachment" I think is the wgpu slang). Though, in that case a more accurate conversion function than pow(col, 1.0/2.2) would make sense.

@aevyrie (Member, Author) commented Jul 12, 2022

> The result looks good! If tonemapping is disabled, will dithering be disabled too?

Yup, that's correct. Maybe that makes sense though? The goal of dithering is to remove artifacts from quantization, which seems to fit alongside tonemapping (mapping between color spaces).

> [...] you could probably eliminate the conversion back to linear by changing the format of the render target view ("attachment" I think is the wgpu slang).

The issue with this is device compatibility from what @superdump mentioned on discord.

> Though, in that case a more accurate conversion function than `pow(col, 1.0/2.2)` would make sense.

Any resources on what that would be?

@CptPotato (Contributor) commented Jul 12, 2022

Thanks for the clarification.

> Any resources on what that would be?

A more accurate sRGB to linear (and vice versa) transform is defined piece-wise: it uses an exponent of 2.4 with a short linear toe. This should be the same transformation that's done by the driver/hardware when using an `Srgb` texture format.

  • Wikipedia describes the transformation pretty clearly. (see Clinear and CsRGB equations)
  • Here's an implementation in glsl.
  • This is basically the same used in Godot.

That being said, if you keep the conversion in both directions (linear -> srgb -> linear) in the shader the difference it makes on the dithering may be negligible. I can't say for sure, though.
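For reference, the piece-wise transform described above is the standard sRGB transfer function; a plain Rust transcription of those equations (an illustration matching the Wikipedia definition linked above) looks like this:

```rust
/// Standard sRGB encode: linear -> non-linear.
/// Exponent 2.4 with a short linear toe near black.
fn linear_to_srgb(c: f32) -> f32 {
    if c <= 0.0031308 {
        12.92 * c
    } else {
        1.055 * c.powf(1.0 / 2.4) - 0.055
    }
}

/// Standard sRGB decode: non-linear -> linear (inverse of the above).
fn srgb_to_linear(s: f32) -> f32 {
    if s <= 0.04045 {
        s / 12.92
    } else {
        ((s + 0.055) / 1.055).powf(2.4)
    }
}
```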

@aevyrie (Member, Author) commented Jul 12, 2022

Thanks for the links!

> That being said, if you keep the conversion in both directions (linear -> srgb -> linear) in the shader the difference it makes on the dithering may be negligible. I can't say for sure, though.

This is a good point. Dithering just relies on the rounding being correct, such that dithering never causes more than one full step away from the true signal value during quantization. As long as the log approximation is "pretty close" (a rigorous technical term), it wouldn't cause quantization to round another full step for the large majority of cases.

@superdump (Contributor) commented:

> > [...] you could probably eliminate the conversion back to linear by changing the format of the render target view ("attachment" I think is the wgpu slang).
>
> The issue with this is device compatibility from what @superdump mentioned on discord.

This is only relevant if rendering directly into the swapchain texture. The formats supported by the swapchain are more limited as they're specifically for outputting to a display. If rendering to an intermediate texture then the format support is much broader. The tonemapping / upscaling PR does allow for using an independent intermediate texture rather than rendering directly to the swapchain texture, so in some cases at least we could do this. Basically, probably only the final swapchain texture needs to be an srgb format, and we can use non-srgb in between. We could even do gamma correction manually and not use an srgb format at all, though I'm not sure of the exact swapchain texture format support across different backends / GPU vendors, so it may be that some/many only support srgb formats.

@superdump (Contributor) commented Aug 12, 2022

I asked Alex Vlachos about the license of the GDC 2015 code: https://twitter.com/AlexVlachos/status/1557662473775239175

> No license. Zero patents. Originally written by Iestyn Bleasdale-Shepherd for Portal 2 with some minor tweaks for VR (original code was aimed at dithering Xbox 360). We shared it with the industry as part of our VR brain dump in 2015 at GDC. Free to use any way you want.

I've clarified in follow-up tweets that 'no license' doesn't mean public domain and linked to the CC0 license pages for information about that. However, they state "Free to use any way you want" and the intent is clear from that at least.

@aevyrie aevyrie marked this pull request as ready for review November 10, 2022 06:20
@aevyrie (Member, Author) commented Nov 10, 2022

@mockersf and @cart, RE: discord conversation about getting this into 0.9, I've cleaned this up and got it working with and without hdr.

@superdump (Contributor) commented:

I've reloaded this information into my mental cache and had a think through the current state of things.

SDR:

  • main pass f32 with tonemapping -> srgb u8 texture
  • Dithering should happen in the main pass on non-linear sRGB values, then be converted back to linear just before writing out

'HDR' (it's still SDR at the end as we don't have HDR output to the window yet, but it allows us to do post-processing on linear HDR before tonemapping):

  • main pass f32 -> linear f16 texture -> tonemapping f32 -> linear f16 texture -> upscaling f32 -> srgb u8 texture
  • Dithering should happen in the upscaling pass on non-linear sRGB values, then be converted back to linear just before writing out

I think this should be correct...

@superdump (Contributor) commented Nov 10, 2022

Nope, I forgot about upscaling in the SDR case:

  • main pass f32 with tonemapping -> srgb u8 texture -> upscaling f32 -> srgb u8 texture
  • Does dithering need to be done twice to compensate for both the intermediate quantisation errors and the upscaled errors? i.e. in both the main pass and upscaling pass? It definitely needs to be done in the main pass before the initial quantisation to u8. I'm not sure about the upscaling. Maybe it's not needed there.

@cart (Member) commented Nov 10, 2022

This should be disable-able (probably on the existing Tonemapping settings, given that they are currently tied together). Disabling dither there should specialize the tonemapping pipeline to remove the dither code.

@cart cart added this to the Bevy 0.9 milestone Nov 10, 2022
@aevyrie (Member, Author) commented Nov 10, 2022

I'm not super proficient with pipeline specialization, but I think I've completed this as requested. Dithering can be toggled, but requires tonemapping to be enabled. This is reflected in the new type signature of Tonemapping. I've also tested I can disable dithering in the 3d_scene example by modifying the Camera's Tonemapping component.
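For illustration, toggling might look roughly like this. This is a sketch based on the `Tonemapping` shape described in this thread (post-rename `deband_dither` field on an `Enabled` variant); the exact 0.9-era API may differ.

```rust
use bevy::core_pipeline::tonemapping::Tonemapping;
use bevy::prelude::*;

// Sketch: flip deband dithering on every camera with tonemapping enabled.
// Assumes `Tonemapping::Enabled { deband_dither: bool }` as discussed above.
fn toggle_dither(mut cameras: Query<&mut Tonemapping, With<Camera>>) {
    for mut tonemapping in &mut cameras {
        if let Tonemapping::Enabled { deband_dither } = &mut *tonemapping {
            *deband_dither = !*deband_dither;
        }
    }
}
```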

A review comment was left on the dither function in `tonemapping.wgsl` (the function signature below is reconstructed for context around the excerpted lines):

```wgsl
// Valve GDC 2015 screen-space dither.
fn screen_space_dither(frag_coord: vec2<f32>) -> vec3<f32> {
    // Per-pixel pseudo-random value derived from the fragment coordinate.
    var dither = vec3<f32>(dot(vec2<f32>(171.0, 231.0), frag_coord)).xxx;
    dither = fract(dither.rgb / vec3<f32>(103.0, 71.0, 97.0));
    // Remap to +/- half of one 8-bit step (1/255).
    return (dither - 0.5) / 255.0;
}
```
@mockersf (Member) commented:

just checking, as it's not what I understood from the discussion: should this dithering setting be ignored when HDR is enabled?

@cart (Member) commented Nov 11, 2022

> just checking, as it's not what I understood from the discussion: should this dithering setting be ignored when HDR is enabled?

Nope. Dithering should be coupled to tonemapping (at least for now, as they are logically coupled in the hdr tonemapping pass). If HDR is enabled, dithering should happen in the tonemapping pass. If HDR is disabled, it should happen in the PBR / sprite shader.

This impl doesn't quite line up with that, as we aren't specializing the tonemapping pipeline when dither is enabled, which means it is skipped when hdr is enabled.

@cart (Member) commented Nov 11, 2022

Working on a fix now

@cart (Member) commented Nov 11, 2022

Fixed! I also made a personal style call by renaming `is_deband_dither_enabled` to `deband_dither`. I like preferring the shorter names for bools when the word is already a verb. I think they read better by nature of having less grammar to mentally parse.

@aevyrie (Member, Author) commented Nov 11, 2022

> Fixed! I also made a personal style call by renaming `is_deband_dither_enabled` to `deband_dither`. I like preferring the shorter names for bools when the word is already a verb. I think they read better by nature of having less grammar to mentally parse.

Thanks. I agree with the rename, I had the same initially but chose to follow the preexisting `is_` style. 🙂

@superdump (Contributor) left an approving review:

I'm marking this as approved, assuming it is all tested in its current state, as I only had a couple of nit comments.

Review thread on crates/bevy_core_pipeline/src/tonemapping/tonemapping.wgsl (outdated, resolved)
@mockersf (Member) left an approving review:
works as I expect it now 👍

@alice-i-cecile alice-i-cecile added the S-Ready-For-Final-Review This PR has been approved by the community. It's ready for a maintainer to consider merging it label Nov 11, 2022
@cart (Member) commented Nov 11, 2022

bors r+

bors bot pushed a commit that referenced this pull request Nov 11, 2022
Co-authored-by: Carter Anderson <mcanders1@gmail.com>
@bors bors bot changed the title Fix color banding by dithering image before quantization [Merged by Bors] - Fix color banding by dithering image before quantization Nov 11, 2022
@bors bors bot closed this Nov 11, 2022
ItsDoot pushed a commit to ItsDoot/bevy that referenced this pull request Feb 1, 2023
Labels
  • A-Rendering: Drawing game state to the screen
  • C-Bug: An unexpected or incorrect behavior
  • S-Ready-For-Final-Review: This PR has been approved by the community. It's ready for a maintainer to consider merging it
Closes: Color Banding (#5262)