The offline_generation.py demo shows how to run a short physics simulation that drops objects onto a pallet, and then generate randomized scenes of boxes/cones scattered on another pallet. Here, the rep.trigger.on_frame() context is used to generate scenes of the scattered boxes with different poses on every frame, BUT the poses of the boxes that were dropped onto the pallet do not change.
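For context, my understanding of the original script's structure is roughly the following (a paraphrased sketch from memory, not the actual code, and assuming the script's own imports such as omni.replicator.core as rep plus its simulate_falling_objects()/forklift_prim helpers): the drop simulation runs once up front, and only the scatter/light randomizers sit inside the trigger context.

# Paraphrased sketch of my understanding of the original flow (not verbatim):
simulate_falling_objects(forklift_prim)  # physics drop runs ONCE, before any frames are captured

with rep.trigger.on_frame(num_frames=CONFIG["num_frames"]):
    rep.randomizer.scatter_boxes()       # these randomizers re-run on every frame
    rep.randomizer.place_cones()
    rep.randomizer.randomize_lights()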
I want to re-run the short simulation for every frame, so that the poses of the objects dropped onto the pallet also change from frame to frame. How can I do this?
I tried modifying offline_generation.py by moving the simulate_falling_objects() function call into the rep.trigger.on_frame context, but the function still seems to be called only once, as the poses of the dropped boxes are identical across all 10 frames.
with rep.trigger.on_frame(num_frames=CONFIG["num_frames"]):
    rep.randomizer.scatter_boxes()
    rep.randomizer.place_cones()
    simulate_falling_objects(forklift_prim)
    rep.randomizer.randomize_lights()
    ....
I'm using offline_generation.py simply to illustrate my issue. In my actual application, I want to generate a synthetic dataset of random objects dropped from different poses, but I only want to record the data AFTER each object has landed/settled on a surface (hence the need for a short simulation for every scene).
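Conceptually, what I'm after is something like the per-scene loop below instead of relying solely on the on_frame trigger. This is only a sketch of the intent, assuming simulate_falling_objects() can be re-run for each scene and that rep.orchestrator.step() advances the registered randomizers and writer by one captured frame:

import omni.replicator.core as rep

# Sketch only: re-run the drop simulation before each captured frame.
for _ in range(CONFIG["num_frames"]):
    simulate_falling_objects(forklift_prim)  # re-drop the objects so they settle in new poses
    rep.orchestrator.step()                  # capture/write one frame only after settling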
Please advise, thanks!