Isaac Sim Version
- 6.0.0
- 5.1.0
- 5.0.0
- 4.5.0
- 4.2.0
- 4.1.0
- 4.0.0
- 2023.1.1
- 2023.1.0-hotfix.1
- Other (please specify):
Operating System
- Ubuntu 24.04
- Ubuntu 22.04
- Ubuntu 20.04
- Windows 11
- Windows 10
- Other (please specify):
GPU Information
- Model: RTX 4090
- Driver Version: 580.126.09
Topic Description
Standalone Python: Is there any supported way to explicitly schedule or spread sensor rendering across frames?
Detailed Description
We are using Isaac Sim in standalone Python for a robotics simulation / SIL-style setup.
The main question is about the actual rendering / scheduling mechanism for sensors.
What we are trying to do:
- keep a controlled loop period
- maintain consistent timestamps between loops
- spread sensor rendering load across frames when multiple sensors are active
What we expected:
- some supported way to explicitly control which sensors are rendered on a given frame, or
- some documented scheduling / load-sharing mechanism for sensor render products
What actually happened:
- `omni.kit.app.get_app_interface().update()` appears to be the main path that drives rendering internally
- but we cannot find a documented way to explicitly schedule per-sensor rendering from standalone Python
- we also cannot find documentation explaining how registered render products are internally scheduled and rendered
We are not asking about:
- publisher throttling
- dropping outputs after rendering
- post-processing rate reduction
We are specifically asking about render scheduling / load sharing of explicit sensors.
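To make the desired behavior concrete, here is a minimal pure-Python sketch of the kind of per-frame load sharing we are looking for. None of this uses Isaac Sim APIs; `ScheduledSensor` and `sensors_due` are hypothetical names. Each frame, only a subset of sensors is selected for rendering, with per-sensor phases so that sensors with the same rate never all fire on the same frame:

```python
from dataclasses import dataclass, field

@dataclass
class ScheduledSensor:
    # Hypothetical stand-in for a render product / sensor handle
    name: str
    period_frames: int          # render this sensor every N-th frame
    phase: int = 0              # offset so same-rate sensors don't collide
    rendered_on: list = field(default_factory=list)

def sensors_due(sensors, frame_index):
    """Return the sensors that should be rendered on this frame."""
    return [s for s in sensors if (frame_index - s.phase) % s.period_frames == 0]

# Two cameras at 1/4 rate, phased apart so they never render on the same frame
sensors = [
    ScheduledSensor("cam_left", period_frames=4, phase=0),
    ScheduledSensor("cam_right", period_frames=4, phase=2),
]

for frame in range(8):
    for s in sensors_due(sensors, frame):
        s.rendered_on.append(frame)   # here we would render only this sensor

print(sensors[0].rendered_on)  # [0, 4]
print(sensors[1].rendered_on)  # [2, 6]
```

This is the scheduling decision we would like to be able to express against real render products; the open question is whether any supported Isaac Sim API accepts such a per-frame subset.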
Steps to Reproduce
- Create a standalone Python Isaac Sim application
- Run a loop similar to:

```python
while simulation_app.is_running():
    if not sim.is_playing():
        sim.render()
        continue

    loop_start = time.perf_counter()
    sim.step(render=False)

    now_wall = time.perf_counter()
    if now_wall >= next_render_wall:
        sim.render()
        next_render_wall = now_wall + render_wall_period

    elapsed = time.perf_counter() - loop_start
    sleep_time = TARGET_WALL_DT - elapsed - time_debt
    if sleep_time > 0:
        time.sleep(sleep_time)
        time_debt = 0.0
    else:
        time_debt = abs(sleep_time)
        time_debt = min(time_debt, TARGET_WALL_DT)
```
- Add multiple rendered sensors using render products
- Observe loop timing as sensor count increases
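For reference, the time-debt bookkeeping from the loop above can be exercised in isolation (pure Python, no Isaac Sim): when an iteration overruns the target period, the overrun is carried into the next iteration's sleep budget, capped at one period. The helper name `update_time_debt` is ours, introduced only for this sketch:

```python
TARGET_WALL_DT = 0.02  # 50 Hz loop target, same role as in the loop above

def update_time_debt(elapsed, time_debt, target_dt=TARGET_WALL_DT):
    """One iteration of the sleep/debt logic: returns (sleep_time, new_debt)."""
    sleep_time = target_dt - elapsed - time_debt
    if sleep_time > 0:
        return sleep_time, 0.0                   # on time: sleep remainder, clear debt
    return 0.0, min(abs(sleep_time), target_dt)  # overrun: carry debt, capped

# A fast iteration sleeps out the remainder of the period and clears the debt
print(update_time_debt(elapsed=0.005, time_debt=0.0))

# A slow iteration accumulates debt instead of sleeping, capped at one period
print(update_time_debt(elapsed=0.050, time_debt=0.0))
```

The point of isolating this is that the loop period itself stays well behaved; what degrades as sensors are added is the `elapsed` term, because every `sim.render()` call appears to render all active render products at once.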
Error Messages
No direct Python error in the main case.
The issue is mainly lack of documentation / lack of visible supported control over per-sensor render scheduling.
Additional Information
What I’ve Tried
- checked `omni.kit.app` docs
- checked `isaacsim.simulation_app` docs
- checked `omni.replicator.core` docs
- checked render/update callback APIs
- checked Replicator render product / annotator / writer usage
- checked standalone camera and RTX lidar paths separately
I still could not find a clear answer to:
- whether explicit per-sensor render scheduling is supported
- how sensor render products are internally scheduled under `app.update()`
Additional Context
If explicit per-sensor render scheduling is not supported, please confirm whether the intended model is simply:
- all active sensor render products are handled by the main update/render path
- and users can only control rates after rendering, not the render scheduling itself
If there is a supported mechanism, please point to the recommended API or design pattern.
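For clarity, the overall pattern we are hoping exists is sketched below with stub objects. `StubSim` and `render_only` are hypothetical, introduced only to make the sketch runnable; we are not claiming any such call exists in Isaac Sim today:

```python
class StubSim:
    """Stand-in for the simulation context, used only to make the sketch runnable."""
    def __init__(self):
        self.rendered = []

    def step(self, render=False):
        pass  # physics step; rendering deferred

    def render_only(self, products):
        # Hypothetical: render just the given render products this frame
        self.rendered.append(tuple(products))

def run(sim, schedule, num_frames):
    # schedule maps frame_index -> render products due on that frame
    for frame in range(num_frames):
        sim.step(render=False)
        due = schedule(frame)
        if due:
            sim.render_only(due)

sim = StubSim()
# Two sensors, each at half rate, interleaved so only one renders per frame
run(sim, lambda f: ["cam_a"] if f % 2 == 0 else ["cam_b"], num_frames=4)
print(sim.rendered)  # [('cam_a',), ('cam_b',), ('cam_a',), ('cam_b',)]
```

If the supported answer is instead some existing mechanism (e.g. per-render-product update control in Replicator), a pointer to that API would fully answer this question.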