Batch image processing (bump map generation)

Hello! I need to create bump maps for many textures (over a thousand). Is there a way to batch-process files? The input would be a folder of images run through a settings preset, and the output would be bump maps named after the source files plus a postfix.

Hi @omg1024! At the moment, the Texture Tools Exporter has pretty limited batch processing – the --batch option will take a list of command lines without the nvtt_export.exe prefix, and run them all in sequence. We’ve got a longer description of this here!
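
For instance, a two-line batch file (hypothetical file names; --preset and --output are the same options the script below writes) might look like:

texture_0001.png --preset preset_file --output texture_0001_bump.dds
texture_0002.png --preset preset_file --output texture_0002_bump.dds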

For large tasks, I usually wind up creating a short Python script to do large batch conversions. Here’s an example one where you can run py -3 path/to/this_script.py input_folder output_folder preset_file:

import os
import subprocess
import sys

# Get arguments
inDirName = sys.argv[1]
outDirName = sys.argv[2]
presetFileName = sys.argv[3]

# Make sure the output folder exists before writing to it
os.makedirs(outDirName, exist_ok=True)

print('Creating batch script')

# Start a new script
scriptFileName = 'temporary-batch.nvtt'
with open(scriptFileName, 'w') as outScript:
  # For each file in the directory...
  for file in os.listdir(inDirName):
    # You'll need to modify the newFileName line to add the postfix you want
    # - for the moment, I just have it replace the file extension with .dds
    # and place the file in the new folder:
    inFileName = os.path.join(inDirName, file)
    newFileName = os.path.join(outDirName, os.path.splitext(file)[0] + '.dds')
    outScript.write(f'{inFileName} --preset {presetFileName} --output {newFileName}\n')

print(f'Running nvtt_export --batch {scriptFileName}...')
subprocess.run([r'C:\Program Files\NVIDIA Corporation\NVIDIA Texture Tools\nvtt_export.exe',
                '--batch', scriptFileName])
os.remove(scriptFileName) # Clean up the temporary script
print('Done')

It’ll also probably be faster to use Python’s joblib to run multiple nvtt_export instances at once, which is how Portal with RTX compressed its new textures – there were ~1800 of those, so it’s a similarly sized problem. Here’s a rough sketch of what that could look like; I can put together a more complete example if you’d like!
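
(Untested sketch: it assumes joblib is installed via pip install joblib, reuses the same file naming and nvtt_export arguments as the batch script above, and the n_jobs value is an arbitrary starting point.)

import os
import subprocess
import sys
from joblib import Parallel, delayed

# Get arguments
inDirName = sys.argv[1]
outDirName = sys.argv[2]
presetFileName = sys.argv[3]

nvttExportPath = r'C:\Program Files\NVIDIA Corporation\NVIDIA Texture Tools\nvtt_export.exe'

# Make sure the output folder exists before writing to it
os.makedirs(outDirName, exist_ok=True)

def convertOne(file):
  # Same naming scheme as above - replace the extension with .dds and
  # write into the output folder; add your postfix here as needed.
  inFileName = os.path.join(inDirName, file)
  newFileName = os.path.join(outDirName, os.path.splitext(file)[0] + '.dds')
  subprocess.run([nvttExportPath, inFileName,
                  '--preset', presetFileName, '--output', newFileName])

# Run up to 4 conversions at a time; tune n_jobs for your machine.
Parallel(n_jobs=4)(delayed(convertOne)(file) for file in os.listdir(inDirName))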
