Hi,
Hope you are doing well and thank you for reaching out.
I have attached a quick example of how to render a USD scene and generate a PNG, which will hopefully help you tackle some of the other ideas you might have.
This uses the Rendering Service which in turn uses the Capture Extension within Kit.
The Capture Extension is also what is used by the embedded, but not scriptable, Movie Maker (under the menu Rendering → Movie Maker).
You can get the Rendering Service extension from the Extension Manager (Window → Extension Manager; search for Rendering Service). Once you have it installed and enabled, you can run the following snippet from the Script Editor (Window → Script Editor):
import asyncio
import json

from omni.services.core import main as _main
from omni.services import client

settings = """{
    "usd_file": "omniverse://localhost/NVIDIA/Samples/Astronaut/Astronaut.usd",
    "render_start_delay": 10,
    "render_stage_load_timeout": 50,
    "render_settings": {
        "camera": "/Root/LongView",
        "range_type": 0,
        "capture_every_nth_frames": 1,
        "fps": "24",
        "start_frame": 1,
        "end_frame": 1,
        "start_time": 0,
        "end_time": 0,
        "res_width": 1920,
        "res_height": 1080,
        "render_preset": 0,
        "debug_material_type": 0,
        "spp_per_iteration": 0,
        "path_trace_spp": 64,
        "ptmb_subframes_per_frame": 5,
        "ptmb_fso": 0,
        "ptmb_fsc": 0.5,
        "output_folder": "C:/render_output",
        "file_name": "Capture",
        "file_name_num_pattern": ".####",
        "file_type": ".png",
        "save_alpha": false,
        "hdr_output": false
    },
    "load_async": true
}"""

def main():
    render_settings = json.loads(settings)
    _services = client.AsyncClient("local://", _main.get_app())
    asyncio.ensure_future(_services.render.run(**render_settings))

main()
The above script will try to render the default Astronaut scene using the /Root/LongView camera.
It will render frame 1 at 64 samples per pixel (path_trace_spp) in Path Traced mode (render_preset) and write it out to a folder on the C drive. The output folder can also be a path on Nucleus or on Linux.
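If you want to render an animation range instead of a single frame, you can adjust the same settings dictionary before submitting it. A minimal sketch, assuming the same JSON keys as in the snippet above (the end frame of 48 is just an example value, not something required by the service):

```python
import json

# A trimmed-down copy of the settings used above; only the keys
# relevant to the frame range are shown here.
settings = json.loads("""{
    "render_settings": {
        "range_type": 0,
        "start_frame": 1,
        "end_frame": 1,
        "fps": "24"
    }
}""")

# Extend the range to frames 1-48 (an example two-second shot at 24 fps)
# before passing the dictionary on to the render service.
settings["render_settings"]["end_frame"] = 48
print(settings["render_settings"]["end_frame"])
```

The same approach works for any of the other keys (resolution, output folder, render preset, and so on) since the service just receives the dictionary you build.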
You can also run this from the command line once you have installed the Rendering Service.
If you save the snippet to a file called do_render.py, for example, you can then run:

/omni.create.next.kit.bat (.sh for Linux) --enable omni.services.render --exec "do_render.py"
You can also use this service from outside Create (Create would need to be running, but you can call into it from external applications).
If you browse to http://localhost:8011/docs after launching Create and enabling the Rendering Service, you will see a /render/run endpoint that you can call directly from the web browser, or from any other script that knows how to make an HTTP POST request, by passing it the settings dictionary. For example:
import json

import requests

settings = """{
    "usd_file": "omniverse://localhost/NVIDIA/Samples/Astronaut/Astronaut.usd",
    "render_start_delay": 10,
    "render_stage_load_timeout": 50,
    "render_settings": {
        "camera": "/Root/LongView",
        "range_type": 0,
        "capture_every_nth_frames": 1,
        "fps": "24",
        "start_frame": 1,
        "end_frame": 1,
        "start_time": 0,
        "end_time": 0,
        "res_width": 1920,
        "res_height": 1080,
        "render_preset": 0,
        "debug_material_type": 0,
        "spp_per_iteration": 0,
        "path_trace_spp": 64,
        "ptmb_subframes_per_frame": 5,
        "ptmb_fso": 0,
        "ptmb_fsc": 0.5,
        "output_folder": "C:/render_output",
        "file_name": "Capture",
        "file_name_num_pattern": ".####",
        "file_type": ".png",
        "save_alpha": false,
        "hdr_output": false
    },
    "load_async": true
}"""

def main():
    render_settings = json.loads(settings)
    # This will block until the render is done:
    resp = requests.post("http://localhost:8011/render/run", json=render_settings)
    print(resp.status_code)

main()
Hope that helps, and please let me or @dfagnou know if you have any further questions.
Thanks,
Jozef