Handle remaining types for property textures in custom shaders #10248

Open
ptrgags opened this issue Mar 30, 2022 · 1 comment
ptrgags commented Mar 30, 2022

In #10247, I added an initial implementation of property textures to cover the common cases of UINT8-based types. However, the EXT_structural_metadata spec allows other types, including but not limited to the following:

  • integer types spread across multiple channels, e.g. a UINT16 stored as the RG channels of a texture (see the sketch after this list). There are some gotchas here, e.g. in WebGL 1, an int is represented as a float and there is no uint type, so it is not possible to represent a UINT32 without precision loss. See also czm_unpackUint()
  • FLOAT32 encoded in little-endian as RGBA. See also czm_unpackFloat().
  • BOOLEAN types. The spec needs some revisions regarding the format, but from talking with @lilleyse yesterday, it makes the most sense to store the value as a UINT8 that is either 0 (false) or 255 (true). This way, bool(texture2D(...).r) yields the correct value.
  • ENUM types.
  • other weird corner cases that fit in 4 bytes (hopefully we don't have to support all of these?):
    • MAT2 of UINT8 (technically fits and would probably be column major?)
    • array of 2 VEC2 of UINT8
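
To make the multi-channel and boolean cases above more concrete, here is a minimal GLSL sketch of how a UINT16 split across two UINT8 channels and a 0/255 BOOLEAN could be decoded in WebGL 1. This is not the code CesiumJS generates; the sampler and varying names are placeholders, and the byte order (first channel = low byte) is an assumption.

```glsl
// Minimal sketch (placeholder names, WebGL 1 / GLSL ES 1.00 assumed).
uniform sampler2D u_propertyTexture; // hypothetical property texture
varying vec2 v_texCoord;             // hypothetical texture coordinate

void decodeExample() {
  vec4 texel = texture2D(u_propertyTexture, v_texCoord);

  // Undo the [0, 1] normalization to recover the raw byte in each channel.
  float lowByte = floor(texel.r * 255.0 + 0.5);
  float highByte = floor(texel.g * 255.0 + 0.5);

  // UINT16 = highByte * 256 + lowByte; 16 bits still fit in a highp int,
  // but (as noted above) there is no uint type, so UINT32 would be lossy.
  int uint16Value = int(highByte * 256.0 + lowByte);

  // BOOLEAN stored as UINT8 0 or 255: the normalized channel is 0.0 or 1.0,
  // so a direct bool() conversion gives the right value.
  bool boolValue = bool(texel.b);
}
```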

A lot of the details will be similar to #9572, but since textures are a more restrictive format, there are some subtle differences, hence the separate issue.

CC @lilleyse @IanLilleyT

javagl commented Jan 10, 2024

People have been trying to use property textures and feature ID textures that consist of 16-bit (grayscale) PNG files. There are some open, subtle, low-level questions at the specification level; these are tracked in CesiumGS/3d-tiles#748

I created a basic test case here. It uses a 16-bit grayscale PNG with a size of 256x256, containing values in [0, 256*256). This is used for two properties:

  • examplePropertyA is a UINT16 that uses channels: [ 0 ], so it should simply represent these values in [0, 65536)
  • examplePropertyB is a UINT32 that uses channels: [ 0, 0 ], so it should represent values in [0, 2^32)
    • (Whether that should even be valid, and how to know that the values have to be shifted by 16 bits before OR-ing them together, are the open questions; one possible reading is sketched below...)
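
Purely as an illustration of that open question, and not something the spec or CesiumJS defines today: if the second occurrence of the channel is supposed to be shifted by 16 bits before OR-ing, decoding the two properties in GLSL ES 3.00 (WebGL 2, where uint and bit operations exist) might look roughly like this. The variable names are placeholders.

```glsl
// Illustrative only; assumes the normalized 16-bit channel value survives
// the texture read intact, which is itself one of the open questions.
uniform sampler2D u_propertyTexture_0; // hypothetical sampler name
in vec2 v_texCoord_0;                  // hypothetical texture coordinate

void decodeExampleProperties() {
  float channel0 = texture(u_propertyTexture_0, v_texCoord_0).r;
  uint raw16 = uint(channel0 * 65535.0 + 0.5); // back to [0, 65535]

  uint examplePropertyA = raw16;                 // UINT16, channels: [ 0 ]
  uint examplePropertyB = (raw16 << 16) | raw16; // UINT32, channels: [ 0, 0 ], second shifted
}
```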

The following archive contains the data and a sandcastle for testing that allows selecting one property to be visualized:

propertyTextures16bit 2024-01-10.zip

Currently, selecting one of the properties causes the ("expected") error:

RuntimeError: Fragment shader failed to compile.
Compile log: ERROR: 0:3: 'examplePropertyA' : no such field in structure

This is because the properties are filtered out of the structures in the MetadataPipelineStage, based on the check in isGpuCompatible that expects the componentType to be UINT8.

Pragmatically returning true for UINT16 causes it to return "something", but the details are really subtle. One can probably not even expect 16-bit channel values to survive the transport through the shader without further precautions. At the very least, the line
metadata.examplePropertyA = int(255.0 * texture(u_propertyTexture_0, attributes.texCoord_0).r);
that is generated for the shader will have to be adjusted, based on the knowledge that this is a texture with a single 16-bit channel...
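
For example, as a sketch of one possible adjustment (assuming the value really does survive the normalized texture read), the scale factor would have to become 65535.0 for a single 16-bit channel:

```glsl
// Sketch of an adjusted generated line for a single 16-bit channel
// (an assumption, not what CesiumJS emits today).
metadata.examplePropertyA = int(65535.0 * texture(u_propertyTexture_0, attributes.texCoord_0).r);
```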
