I’m trying to compress some POD as BC6H. The standalone tool doesn’t really have any documentation that I can find. I need to be able to specify that I want signed data and I need to be able to work on a texture array of data. Are these things possible and is there even a command-line version that could be used to do automation with?
Hi tim1! I’m assuming you’re referring to the Texture Tools Exporter here (which runs on NVTT 3.0, which is the underlying library)?
The BC6H format in the standalone exporter is signed, luckily! (It maps to
GL_COMPRESSED_RGB_BPTC_SIGNED_FLOAT_ARB using a linear color space.) The standalone tool doesn’t support texture arrays at the moment (although the underlying library does), but if you’re adventurous, I think it might be possible to work around this by compressing the images individually, getting the image data from each DDS file, and creating a new DDS file with the given number of layers and data. (Note that the DDS file format implements cube maps using 6 texture array layers and various flags to indicate that it stores cubemap data.)
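The recombination workaround above could be sketched roughly like this in Python. To be clear, this is my own sketch and not NVTT code: the helper names (`write_bc6h_dds`, `combine_dds_layers`) are hypothetical, and it assumes mipless, single-layer BC6H_SF16 inputs whose headers follow Microsoft's documented DDS layout (magic + 124-byte DDS_HEADER + 20-byte DDS_HEADER_DXT10, with the array size stored in the DX10 extension header):

```python
import struct

# Sketch of the workaround: strip the headers from individually compressed
# DDS files and concatenate their payloads under one texture-array header.
# Helper names are hypothetical; offsets follow the documented DDS layout.
DDS_MAGIC = b"DDS "
DXGI_FORMAT_BC6H_SF16 = 96           # signed-float BC6H
HEADER_SIZE = 4 + 124 + 20           # magic + DDS_HEADER + DDS_HEADER_DXT10

def _dds_header(width, height, array_size):
    flags = 0x81007                  # CAPS|HEIGHT|WIDTH|PIXELFORMAT|LINEARSIZE
    linear_size = ((width + 3) // 4) * ((height + 3) // 4) * 16  # 16 B/block
    head = struct.pack("<7I44x", 124, flags, height, width, linear_size, 0, 1)
    pf   = struct.pack("<2I4s5I", 32, 0x4, b"DX10", 0, 0, 0, 0, 0)  # FOURCC
    caps = struct.pack("<5I", 0x1000, 0, 0, 0, 0)    # DDSCAPS_TEXTURE
    dx10 = struct.pack("<5I", DXGI_FORMAT_BC6H_SF16, 3, 0, array_size, 0)
    return DDS_MAGIC + head + pf + caps + dx10

def write_bc6h_dds(path, width, height, payload):
    """Write a minimal mipless single-layer BC6H_SF16 DDS file."""
    with open(path, "wb") as f:
        f.write(_dds_header(width, height, 1))
        f.write(payload)

def combine_dds_layers(paths, out_path):
    """Merge mipless single-layer BC6H DDS files into one texture array."""
    payloads, dims = [], set()
    for p in paths:
        data = open(p, "rb").read()
        assert data[:4] == DDS_MAGIC and data[84:88] == b"DX10"
        h, w = struct.unpack_from("<2I", data, 12)
        dims.add((w, h))
        payloads.append(data[HEADER_SIZE:])
    (w, h), = dims                    # every layer must share one size
    with open(out_path, "wb") as f:
        f.write(_dds_header(w, h, len(paths)))
        f.writelines(payloads)
```

One caveat: real DDS files usually store a full mip chain per array layer, interleaved layer by layer, so a production version would need to copy each layer's mip data in that order rather than a single flat payload per file.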
There is a command-line interface you can use for automation - if you have the standalone version, it already works from the command line! You can run
nvtt_export --help to get the full list of options. The command to convert an image to a BC6H SF16 DDS file is
nvtt_export.exe inputFileHere.png --format bc6 --output outputFileName.dds
or using the short form of the arguments,
nvtt_export.exe inputFileHere.png -f bc6 -o outputFileName.dds.
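If you'd rather drive this from a script, a small wrapper around the command line above could look something like the following. This is just a sketch: `bc6h_command` and `convert_all` are my own names, and the `.exr` glob is only an example input format - substitute whatever your source files actually are.

```python
import subprocess
from pathlib import Path

def bc6h_command(src, dst):
    """Build one nvtt_export invocation (long-form flags shown above)."""
    return ["nvtt_export", str(src), "--format", "bc6", "--output", str(dst)]

def convert_all(folder):
    """Convert every .exr in a folder to a BC6H DDS file alongside it."""
    for src in sorted(Path(folder).glob("*.exr")):
        subprocess.run(bc6h_command(src, src.with_suffix(".dds")), check=True)
```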
If you’re converting a lot of images this way, it might be more efficient to use the batch scripting mode - see the announcement post at https://news.developer.nvidia.com/texture-tools-exporter-2020-1/ for a brief overview.
(It’s also possible to automate this from Photoshop by recording an action.)
Hope this helps, and happy to answer further questions!
Thanks for replying. I was hoping to be able to do this in a little cleaner manner, but I can make this work at least to do quality benchmarks the way you describe; ultimately a lib would certainly be preferred. Is the NVTT library available to use directly? The real hassle in this instance is that I have to deconstruct the texture array, compress the individual images, then reconstruct the texture array before I can move on to the next stage in my data build pipeline. The uncompressed version of each of these files may be 400+ MB, so that is more file traffic than is practical for our workloads in production.