How to Deploy ETC1S Texture Content Using Basis Universal
Note: This content was written before the UASTC texture format was added to the system, so it only focuses on deploying ETC1S content. UASTC is significantly higher quality than ETC1S, so some of the advice below doesn't apply. Whenever possible, for UASTC textures you should use ASTC or BC7.
First, become familiar with the exact compressed texture formats your device hardware and rendering API support. Just because your device supports a particular format (like PVRTC2) doesn't mean your OS or API does (iOS doesn't support PVRTC2, even though the hardware did). On Android, ETC1/2 are popular. iOS supports PVRTC1 (pretty much always) and possibly ETC1/2 (but don't bet on it), and on desktop BC1-5/BC7 are king. ASTC is very popular on Android and newer iOS devices, but not popular on desktop.
Also, become familiar with any texture size restrictions. For example, on iOS, you can only use square power of 2 texture dimensions for PVRTC1, and there's nothing Basis can do for you today that works around this limitation. (We will eventually support the ability to transcode smaller non-pow2 textures into larger power of 2 PVRTC1 textures, or to resize to square power of 2 textures on the fly.)
The primary issues that trip up mobile native/WebGL app developers are: older ETC1-only devices, which require some sort of fallback to handle alpha textures; PVRTC1's requirement for power of 2 (and, on iOS, square) texture dimensions; and PVRTC1's unique artifacts compared to all the other formats, which can also cause headaches.
ETC2 EAC RGBA and ASTC work around these issues, but these formats are still not available everywhere yet (especially WebGL on iOS, which still only supports PVRTC1 even on hardware that supports ETC1/2 or ASTC). Unfortunately PVRTC2 (which we do support when transcoding ETC1S format textures) was never supported on iOS, even on hardware that could handle it, so for all practical purposes it's basically useless.
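As a concrete starting point, you can probe the rendering context once at startup and pick a transcoder target from the capabilities you find. Below is a minimal sketch against a GL-style API, where `has_extension()` is a hypothetical helper you'd supply; the enum values come from basisu_transcoder.h, though exact availability can vary slightly between library versions:

```cpp
#include "basisu_transcoder.h"

// Hypothetical helper: true if the GL/WebGL context exposes the named extension.
bool has_extension(const char* name);

// Pick a transcoder target format once at startup, roughly in quality order.
basist::transcoder_texture_format pick_target_format(bool has_alpha, bool is_gles3)
{
    using tf = basist::transcoder_texture_format;

    if (has_extension("GL_KHR_texture_compression_astc_ldr"))
        return tf::cTFASTC_4x4_RGBA;                 // best quality on newer mobile devices
    if (has_extension("GL_EXT_texture_compression_bptc"))
        return tf::cTFBC7_RGBA;                      // best quality on newer desktop GPUs
    if (has_extension("GL_EXT_texture_compression_s3tc"))
        return has_alpha ? tf::cTFBC3_RGBA : tf::cTFBC1_RGB;
    if (is_gles3)                                    // ETC2 EAC is core in GLES 3.0
        return has_alpha ? tf::cTFETC2_RGBA : tf::cTFETC1_RGB;
    if (has_extension("GL_IMG_texture_compression_pvrtc"))
        return has_alpha ? tf::cTFPVRTC1_4_RGBA : tf::cTFPVRTC1_4_RGB;
    if (!has_alpha && has_extension("GL_OES_compressed_ETC1_RGB8_texture"))
        return tf::cTFETC1_RGB;

    return tf::cTFRGBA32;                            // last-resort uncompressed fallback
}
```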
Here are the major texturing scenarios the system supports:
- For color-only textures, you can transcode to whatever format your target device supports. Remember that PVRTC1 requires square power of 2 size textures, and there's nothing Basis can currently do to help you work around this limitation. (Basis supports non-square PVRTC1 textures, but iOS doesn't.) For devices which support both ASTC and PVRTC1, ASTC will be much higher quality. For the few Android devices supporting both PVRTC2 and ASTC, for most opaque textures you can probably use PVRTC2, which will conserve memory.
- For alpha textures, you can create .basis/.KTX2 files with alpha channels. To do this with the basisu compressor, either create 32-bit PNG files with alpha, or use two PNG files with the `-alpha_file` command line option to specify where the alpha data should come from. (For texture arrays, you can use multiple `-file` and `-alpha_file` command line options. Mipmap generation automatically supports alpha channels.)
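For example, a command line along these lines pulls the alpha channel from a second image (option names as in the basisu tool's help; double-check them against your build): `basisu -file color.png -alpha_file alpha.png -mipmap -output_file color_alpha.basis`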
Now deploy alpha content like this:
ETC1-only devices/API's: Transcode to two ETC1 textures and sample and recombine them in a shader (see the shader sketch after this list). You can either use one ETC1 texture that's twice as high/wide, or two separate ETC1 textures. Alternately, you can transcode to a single 4444 or 8888 texture.
ETC2 devices/API's: Just transcode to ETC2 EAC RGBA. ETC2 EAC's alpha quality is similar to BC3, and very high.
PVRTC1 devices/API's: Use a single PVRTC1 RGBA texture. For more complex alpha channels, transcode to two PVRTC1 4bpp textures, and sample twice. The PVRTC1 encoder is a real-time encoder, so you'll need to evaluate it on your texture/image data. If the alpha data is too complex or decorrelated both RGB and A quality will seriously suffer. (Sorry - PVRTC1 is an unforgiving format.)
Devices/API's supporting only BC1-5: Use BC3, which the transcoder supports. BC3's quality is very high.
Newer devices supporting BC7: Transcode to BC7, which supports a high-quality alpha channel. Quality will be similar to BC3 for ETC1S.
Devices/API's supporting ASTC: Just transcode to ASTC, which supports a variety of internal block encodings that will be automatically chosen by the transcoder for every block: L, LA, RGB, RGBA. If the device supports both PVRTC1/2 and ASTC, ASTC 4x4 will give you more reliable and much higher quality than PVRTC1/2, but it uses twice as much RAM (8bpp vs 4bpp).
Devices/API's supporting ATC: Transcode to ATC_RGBA_INTERPOLATED_ALPHA. This format is basically equivalent to BC3.
Devices/API's supporting PVRTC2: The real-time PVRTC2 RGBA transcoder can only handle simple opacity maps. You'll need to experiment to see if it's high enough quality. For devices which support both PVRTC2 and ASTC, ASTC 4x4 is preferable for alpha content, although it will require 2x as much memory.
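For reference, here's what the two-texture recombination for ETC1-only devices can look like. This is a minimal GLSL ES fragment shader sketch, stored as a C++ string constant; the sampler and varying names are illustrative, and it assumes the alpha slice was transcoded to a second opaque ETC1 texture:

```cpp
// GLSL ES 1.00 fragment shader for recombining RGB and alpha from two ETC1
// textures. Since the alpha data was transcoded to an opaque color format,
// it's replicated across the color channels; we read it from .g here.
static const char* s_etc1_alpha_fs = R"(
    precision mediump float;
    uniform sampler2D color_tex;   // ETC1 texture holding the RGB data
    uniform sampler2D alpha_tex;   // ETC1 texture holding the alpha data
    varying vec2 v_uv;
    void main()
    {
        vec3 rgb = texture2D(color_tex, v_uv).rgb;
        float a  = texture2D(alpha_tex, v_uv).g;
        gl_FragColor = vec4(rgb, a);
    }
)";
```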
- For high quality tangent space normal maps, here's one suggested solution that should work well:
Compress with the `-normal_map` flag, which disables a lot of stuff that has interfered with normal maps in the past. Also compress with `-comp_level 2-4`, which creates the highest quality codebooks. Create ETC1S files with larger than normal codebooks by manually specifying the `-max_endpoints` and `-max_selectors` command line options.
Start with 2 component normalized XY tangent space normal maps (where XY range from [-1,1]) and encode them into two 8-bit channels (where XY is packed into [0,255]). Now put X in color, and Y in alpha, and compress that 32-bit PNG using basisu. The command line tool and encoder class support the `-separate_rg_to_color_alpha` option, which swizzles 2 component RG normal maps to RRRG before compression, aiding this process.
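If you pack the two channels yourself instead of relying on `-separate_rg_to_color_alpha`, the per-texel conversion is just a scale and bias. A minimal sketch under those assumptions (the function names are illustrative):

```cpp
#include <algorithm>
#include <cmath>
#include <cstdint>

// Map one signed normal component from [-1,1] to [0,255].
static uint8_t pack_snorm8(float v)
{
    v = std::min(1.0f, std::max(-1.0f, v));
    return (uint8_t)std::lround((v * 0.5f + 0.5f) * 255.0f);
}

// Pack a 2-component XY tangent space normal map into RGBA pixels:
// X is replicated into RGB, Y goes into alpha (the same RRRG layout that
// -separate_rg_to_color_alpha produces). pXY holds 2 floats per texel,
// pRGBA receives 4 bytes per texel.
void pack_normals_rrrg(const float* pXY, uint8_t* pRGBA, size_t num_texels)
{
    for (size_t i = 0; i < num_texels; i++)
    {
        const uint8_t x = pack_snorm8(pXY[i * 2 + 0]);
        const uint8_t y = pack_snorm8(pXY[i * 2 + 1]);
        pRGBA[i * 4 + 0] = x;
        pRGBA[i * 4 + 1] = x;
        pRGBA[i * 4 + 2] = x;
        pRGBA[i * 4 + 3] = y;
    }
}
```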
ETC1-only devices/API's: Transcode to two ETC1 textures and sample them in a shader, or use an uncompressed format. You can either use one ETC1 texture that's twice as high/wide, or two separate ETC1 textures. The transcoder supports transcoding alpha slices to any color output format using a special flag: `basist::basisu_transcoder::cDecodeFlagsTranscodeAlphaDataToOpaqueFormats`. This should work well because each channel gets its own endpoints and selectors.
ETC2 devices/API's: Transcode to a single ETC2 EAC RGBA or an ETC2 EAC RG11 texture, and sample once in the shader. This should look great.
PVRTC1 devices/API's: Transcode to two PVRTC1 opaque textures (RGB to one, A to another, which the transcoder supports using the `cDecodeFlagsTranscodeAlphaDataToOpaqueFormats` flag; see the sketch after this list) and sample each in the shader. This should look fairly good. It's doubtful the PVRTC1 RGBA transcoder could handle two complex channels of data well.
Devices/API's supporting BC1-5, BC6H, BC7: Transcode to a single BC5 texture (which used to be called "ATI 3DC"). BC5/3DC has two high quality BC4 blocks, so it'll look great. You could also use BC7, although BC5 will have slightly less error.
Devices/API's supporting ASTC: Just transcode to ASTC. The ASTC block transcoder will automatically encode to the "LA" format.
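To illustrate the two-texture paths above, here's a sketch of transcoding the color and alpha slices of image 0, mip level 0 into two opaque PVRTC1 textures using that decode flag. It assumes the .basis file is already in memory and that `basist::basisu_transcoder_init()` was called once at startup; the exact namespace qualification of the flag and the helper function names may vary between library versions:

```cpp
#include "basisu_transcoder.h"
#include <vector>

// pData/data_size: a .basis file already read into memory.
// Returns false on any transcoder failure.
bool transcode_normal_map_pvrtc1(const void* pData, uint32_t data_size,
                                 std::vector<uint8_t>& rgb_blocks,
                                 std::vector<uint8_t>& alpha_blocks)
{
    basist::basisu_transcoder transcoder; // older versions took a global selector codebook here
    if (!transcoder.start_transcoding(pData, data_size))
        return false;

    uint32_t width = 0, height = 0, total_blocks = 0;
    if (!transcoder.get_image_level_desc(pData, data_size, 0, 0, width, height, total_blocks))
        return false;

    // Note: PVRTC1 requires power of 2 dimensions; we assume that here.
    const basist::transcoder_texture_format fmt =
        basist::transcoder_texture_format::cTFPVRTC1_4_RGB;
    const uint32_t bytes_per_block = basist::basis_get_bytes_per_block_or_pixel(fmt);

    rgb_blocks.resize(total_blocks * bytes_per_block);
    alpha_blocks.resize(total_blocks * bytes_per_block);

    // Color slice -> first opaque PVRTC1 texture.
    if (!transcoder.transcode_image_level(pData, data_size, 0, 0,
            rgb_blocks.data(), total_blocks, fmt))
        return false;

    // Alpha slice -> second opaque PVRTC1 texture, via the special decode flag.
    if (!transcoder.transcode_image_level(pData, data_size, 0, 0,
            alpha_blocks.data(), total_blocks, fmt,
            basist::cDecodeFlagsTranscodeAlphaDataToOpaqueFormats))
        return false;

    return true;
}
```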
- Color-only .basis/.KTX2 files don't have alpha slices, so here's what currently happens when you transcode them to various texture formats (we are open to feedback or adding more options here):
BC3/DXT5 or ETC2 EAC: The color data gets transcoded to output color, as you would expect. You'll get all-255 blocks in the output alpha blocks, because the transcoder doesn't have any alpha slice data to convert to the output format. (Alternately, we could convert a single channel of the color data (like G) to output alpha, and assume the user will swizzle in the shader, which could provide a tiny gain in ETC1S conversion quality. But now output alpha would require special interpretation and we would need to invoke the block transcoders twice.)
BC4/DXT5A: This format is usually interpreted as holding single channel red-only data. We invoke the ETC1S->BC4 transcoder, which takes the red channel of the color slice (which we assume is grayscale, but doesn't have to be) and converts that to BC4/DXT5A blocks. (We could allow the user to select the source channel, if that is useful.)
BC5/3DC: This format has two BC4 blocks, and is usually used for XY (red/green) tangent space normal maps. The first block (output red/or X) will have the R channel of the color slice (which we assume is actually grayscale, but doesn't have to be), and the output green channel (or Y) will have all-255 blocks. We could support converting the first two color components of the color ETC1S texture slice to BC5, but doing so doesn't seem to have any practical benefits (just use BC1 or BC7). Alternately we could support allowing the user to select a source channel other than red.
Note that you can directly control exactly how transcoding works at the block level by calling a lower level API, `basisu_transcoder::transcode_slice()`. The higher level API (`transcode_image_level()`) uses this low-level API internally. `find_slice()` and `get_file_info()` return all the slice information you would need to call this lower level API. I would study `transcode_image_level()`'s implementation before using the slice API to get familiar with it.
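For example, here's a hedged sketch of enumerating slices via `get_file_info()`; the `basisu_file_info` field names are taken from basisu_transcoder.h and may differ slightly between library versions:

```cpp
#include "basisu_transcoder.h"
#include <cstdio>

// Enumerate the slices in a .basis file - the same information you'd need
// before calling the lower level transcode_slice() API.
void dump_slices(const void* pData, uint32_t data_size)
{
    basist::basisu_transcoder transcoder;
    basist::basisu_file_info info;
    if (!transcoder.get_file_info(pData, data_size, info))
        return;

    for (uint32_t i = 0; i < (uint32_t)info.m_slice_info.size(); i++)
    {
        const basist::basisu_slice_info& s = info.m_slice_info[i];
        printf("slice %u: image %u level %u, %ux%u, %s slice\n",
               i, s.m_image_index, s.m_level_index,
               s.m_orig_width, s.m_orig_height,
               s.m_alpha_flag ? "alpha" : "color");
    }
}
```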
- To get uncompressed 16/32-bpp pixel data directly from a slice, call the transcoder with one of the uncompressed pixel formats. This will likely be faster than transcoding to ETC1 or 2 then unpacking the blocks yourself (on the CPU). Using uncompressed images downsampled by a factor of 2 is a viable fallback instead of using PVRTC1. The transcoder supports unpacking to 32-bit, 565 RGB or BGR, and 4444 images.
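Here's a sketch of pulling 32-bit RGBA pixels straight out of the first image's top mip level; note that for uncompressed formats, `transcode_image_level()`'s output buffer size parameter is in pixels rather than blocks:

```cpp
#include "basisu_transcoder.h"
#include <vector>

// Transcode image 0, level 0 straight to 32-bit RGBA pixels.
// Returns an empty vector on failure.
std::vector<uint8_t> transcode_to_rgba32(const void* pData, uint32_t data_size)
{
    basist::basisu_transcoder transcoder;
    std::vector<uint8_t> pixels;
    if (!transcoder.start_transcoding(pData, data_size))
        return pixels;

    uint32_t width = 0, height = 0, total_blocks = 0;
    if (!transcoder.get_image_level_desc(pData, data_size, 0, 0, width, height, total_blocks))
        return pixels;

    pixels.resize(width * height * 4);

    // For uncompressed output formats the buffer size is in pixels, not blocks.
    if (!transcoder.transcode_image_level(pData, data_size, 0, 0,
            pixels.data(), width * height,
            basist::transcoder_texture_format::cTFRGBA32))
        pixels.clear();

    return pixels;
}
```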