Support normalized 16-bit texture formats #1934

Closed
kvark opened this issue Sep 10, 2021 · 4 comments · Fixed by #2282
Labels
area: api (Issues related to API surface) · type: enhancement (New feature or request)

Comments

@kvark
Member

kvark commented Sep 10, 2021

Is your feature request related to a problem? Please describe.
It's often convenient to process data in shaders as u16 and to store it that way. However, reading it in the shader is problematic: we currently have no 16-bit normalized formats, so such a texture has to be bound as an integer format, and integer textures can't be sampled. Without normalization, using these formats is highly restricted today.

Describe the solution you'd like
Expose a set of 16-bit normalized formats, gated by a feature.

Describe alternatives you've considered

Additional context
Requested on Matrix.
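
For illustration only (my addition, not part of the original issue): a minimal sketch of what the requested feature could look like from the Rust side, assuming an existing adapter in an async context, the flag and format names the linked PR eventually added (Features::TEXTURE_FORMAT_16BIT_NORM, TextureFormat::R16Unorm), and the DeviceDescriptor/TextureDescriptor fields of wgpu around 0.12 (field names differ in later releases).

// Sketch: request the native feature, then create a texture that shaders can
// sample as a normalized float instead of binding it as an integer format.
let (device, _queue) = adapter
    .request_device(
        &wgpu::DeviceDescriptor {
            label: None,
            features: wgpu::Features::TEXTURE_FORMAT_16BIT_NORM,
            limits: wgpu::Limits::default(),
        },
        None,
    )
    .await
    .expect("adapter should report TEXTURE_FORMAT_16BIT_NORM");

// R16Unorm stores u16 data but is read in the shader as [0.0, 1.0] floats,
// unlike R16Uint, which can only be bound as an integer texture.
let texture = device.create_texture(&wgpu::TextureDescriptor {
    label: Some("u16 data sampled as normalized float"),
    size: wgpu::Extent3d { width: 1024, height: 1024, depth_or_array_layers: 1 },
    mip_level_count: 1,
    sample_count: 1,
    dimension: wgpu::TextureDimension::D2,
    format: wgpu::TextureFormat::R16Unorm,
    usage: wgpu::TextureUsages::TEXTURE_BINDING | wgpu::TextureUsages::COPY_DST,
});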

@aloucks
Contributor

aloucks commented Nov 12, 2021

I was looking into the details of this to see if I could implement it when I get some free time, and I noticed that the existing texture format features all have a TEXTURE_COMPRESSION_ prefix rather than TEXTURE_FORMAT_, and that the 16-bit formats are not related to compression.

Would you be open to adjusting the naming of the existing texture format features? It would be nice if they all followed the same pattern, such as TEXTURE_FORMAT_{family}.

Currently the compressed formats have features like TEXTURE_COMPRESSION_ETC2 and TEXTURE_COMPRESSION_ASTC_LDR.

I'm suggesting changing these so that other (non-compression) formats can follow the same convention.

e.g.

// Enables ETC2 family of compressed textures ...
TEXTURE_FORMAT_ETC2

// Enables ASTC family of compressed textures ...
TEXTURE_FORMAT_ASTC_LDR

// Enables additional 16-bit texture formats ...
TEXTURE_FORMAT_16BIT
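
For comparison (my addition, not part of the comment above): this is roughly how the existing, compression-prefixed names appear at call sites today; the proposed rename would only change the flag identifiers, not the usage pattern.

// Checking the currently shipped feature flags on an adapter.
let features = adapter.features();
let has_etc2 = features.contains(wgpu::Features::TEXTURE_COMPRESSION_ETC2);
let has_astc_ldr = features.contains(wgpu::Features::TEXTURE_COMPRESSION_ASTC_LDR);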

@kvark
Member Author

kvark commented Nov 13, 2021

We should name the features the same way the WebGPU spec does, unless there is a really strong reason to diverge.

aloucks added a commit to aloucks/wgpu that referenced this issue Dec 11, 2021
aloucks added a commit to aloucks/wgpu that referenced this issue Dec 12, 2021
aloucks added a commit to aloucks/wgpu that referenced this issue Dec 12, 2021
aloucks added a commit to aloucks/wgpu that referenced this issue Dec 13, 2021
kvark pushed a commit that referenced this issue Dec 13, 2021
* Add feature gated 16-bit normalized texture support

Fixes #1934

* Query format properties only once

* Prevent supports_format from erroneously reporting false if the format wasn't queried

* Assert that 16bit norm formats also support  on vulkan

* Add storage to TextureFormatInfo for 16-bit norm formats now that we check for support
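
A hedged sketch (mine, not taken from the PR) of how an application might gate its use of the new formats once this lands, assuming the Features::TEXTURE_FORMAT_16BIT_NORM flag added by the PR and the Adapter::get_texture_format_features query wgpu exposes for per-format capabilities:

// Only touch the 16-bit normalized formats when the adapter advertises them,
// and inspect the per-format capabilities (e.g. storage support) it reports.
if adapter.features().contains(wgpu::Features::TEXTURE_FORMAT_16BIT_NORM) {
    let info = adapter.get_texture_format_features(wgpu::TextureFormat::Rg16Unorm);
    println!("Rg16Unorm allowed usages: {:?}", info.allowed_usages);
}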
@VincentJousse
Contributor

@kvark Will this become a WebGPU feature any time soon?

@kvark
Member Author

kvark commented Apr 6, 2022

I'm not participating in this standards body any more.
Redirecting to @jimblandy.
