I am currently using Isaac Sim and Agent SDG to generate agents that walk in a defined space. Generation works, and everything looks fine in the preview. However, when I render in Real-Time mode (even with two RTX 3090 24 GB cards I cannot afford Path Tracing), my images come out very blurry wherever the agents are moving.
Unfortunately, I cannot use the Python API, because combining Replicator with Agent SDG makes Omniverse crash every time.
If you look at the agents, they do not look very realistic …
You can see motion blur, or some similar effect caused by the agents' movement. I would like to know which settings to use to prevent that effect.
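One thing worth ruling out first is the renderer's own motion-blur post effect. A minimal sketch, assuming the setting path `/rtx/post/motionblur/enabled` (please verify it against your Render Settings panel, as the exact path may differ between versions), run from the Script Editor:

```python
import carb.settings

# Assumption: this is the RTX post-processing motion-blur toggle.
# Verify the exact path in the Render Settings > Post Processing panel.
settings = carb.settings.get_settings()
settings.set("/rtx/post/motionblur/enabled", False)
```

This only runs inside Omniverse (the `carb` module is part of the Kit runtime), so treat it as a starting point rather than a confirmed fix.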
Try adding this to your script (after the relevant imports):

```python
import omni.replicator.core as rep

# See known issues: https://docs.omniverse.nvidia.com/prod_extensions/prod_extensions/ext_replicator.html#known-issues
rep.settings.carb_settings("/omni/replicator/RTSubframes", 4)
```
As you can see, I still have the same issue. I tried increasing the value, but it does not solve the problem.
I am not familiar with the SDG Agent Python API, so I used the Script Editor instead.
If this is not the correct way to use your code, could you please provide a simple example that integrates SDG Agent? Thanks a lot.
Hi, thank you for the clarification.
I did not know how to use the SDG Agent extension together with omni.replicator, so I added a simple Replicator script and ran it. Before starting the data recording with the "Synthetic Data Recorder" (instead of the SDG Agent UI button), I click Play to animate the characters.
Finally, I was able to:

- animate the characters
- randomize the scene
- improve image quality by using an rt_subframes value of 32

I avoided the Agent SDG UI entirely when generating the data.
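For reference, the kind of minimal Replicator script described above can be sketched roughly as follows. The camera position, resolution, frame count, and output directory are placeholders, and the snippet assumes it is run from Isaac Sim's Script Editor while the animated scene is playing:

```python
import omni.replicator.core as rep

# Render extra RTX subframes per capture so moving characters converge
# (this is the same setting as before, raised to 32)
rep.settings.carb_settings("/omni/replicator/RTSubframes", 32)

# Placeholder camera and resolution: adapt to your scene
camera = rep.create.camera(position=(0, 2, 10))
render_product = rep.create.render_product(camera, (1280, 720))

# Write RGB frames to disk with the built-in BasicWriter
writer = rep.WriterRegistry.get("BasicWriter")
writer.initialize(output_dir="/tmp/agent_sdg_out", rgb=True)
writer.attach([render_product])

# Capture a fixed number of frames; scene randomizers would go inside this trigger
with rep.trigger.on_frame(num_frames=100):
    pass

rep.orchestrator.run()
```

This is only a sketch of the pattern, not the exact script used; it needs the Omniverse runtime and cannot run standalone.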
This is the output:
@rchadha, thank you for your help. If you have any code sample that combines the agent animation with Replicator randomization, I would love to see whether there is a better way to achieve my results.