I am simulating a robotics application with a camera in Isaac Sim. The real camera on my robot runs at 15 FPS, which means I want to create a camera sensor in Isaac Sim that produces new frames 15 times per second.
This is the camera, along with the camera view, that I created in my scene:
Now, in order to work with this camera, I wrote a Python extension. My robot app processes the frames in real time, which means I want to grab them as soon as they are rendered, without any delay. Here is a simplified version of the extension code that demonstrates my problem:
```python
import threading
import time

import cv2
import omni.ext
import omni.isaac.sensor


class CameraExtension(omni.ext.IExt):
    def on_startup(self, ext_id):
        print("camera extension startup")
        self.__camera_reader = CameraReader()
        self.__camera_reader.start()

    def on_shutdown(self):
        print("camera extension shutdown")
        self.__camera_reader.stop()


class CameraReader(threading.Thread):
    def __init__(self):
        super().__init__()
        # The weird thing here is that even though I specify "frequency=15" and
        # therefore expect this camera to provide exactly 15 RGB images per second,
        # the parameter seems to be ignored and the camera provides more frames per
        # second (it seems to partially correlate with the number of physics steps,
        # though I'm not quite sure).
        self.__camera = omni.isaac.sensor.Camera(
            prim_path="/World/my_robot/arm_camera/camera",
            resolution=(640, 480),
            frequency=15,
        )
        self.__camera.initialize()
        self.__camera.add_motion_vectors_to_frame()
        # Stop flag so the thread can exit cleanly.
        self.__running = True

    def run(self):
        frame_number = 0
        while self.__running:
            frame_number += 1
            # As you can see, I grab a new frame manually 15 times per second, but my
            # robotics app processes the video feed in real time and requires very low
            # latency, so I want to grab each new frame as soon as it is available,
            # basically by subscribing to some kind of "new camera frame is available" event.
            frame = self.__camera.get_rgb()
            if len(frame) > 0:
                img = cv2.cvtColor(frame, cv2.COLOR_RGB2BGR)
                # For illustration I just write to a file here; in reality I send the
                # frame through ZeroMQ to my app.
                cv2.imwrite(f"some_path/images/{frame_number}.jpg", img)
            # Currently I sleep for 67 milliseconds, which amounts to approximately 15 FPS.
            # However, even if I slept for a shorter period of time, new frames would be
            # available, even though I specified frequency=15 for my Camera.
            time.sleep(0.067)

    def stop(self):
        self.__running = False
        self.__camera.pause()
```
As I pointed out, there are two problems:

- The frames are not rendered 15 times per second. Even though I passed `frequency=15` to my `Camera`, if I run the simulation and just read the frames continuously in a loop, I see many more unique pictures per second, meaning the actual FPS is not 15 (see the measurement sketch below). Is there a way to make it exactly 15, to perfectly simulate my real camera?
- How can I read the frames in real time? As soon as a new frame for the camera is rendered, I want to be able to read it immediately, with zero delay. Maybe there is some event I can subscribe to?
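For reference, this is roughly how I checked the effective frame rate. It is just a quick sketch (the helper name is only for illustration); hashing the raw bytes of the returned array is there so that only genuinely new frames from `get_rgb()` are counted:

```python
import hashlib
import time

def measure_unique_fps(camera, duration=5.0):
    """Rough check: count how many distinct RGB frames get_rgb() returns per second."""
    seen = set()
    start = time.time()
    while time.time() - start < duration:
        frame = camera.get_rgb()
        if len(frame) > 0:
            # Identical (not yet re-rendered) frames hash to the same digest,
            # so repeated reads of the same image are only counted once.
            seen.add(hashlib.md5(frame.tobytes()).hexdigest())
    return len(seen) / duration
```

With `frequency=15` I would expect this to report roughly 15, but it reports noticeably more.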
Also, for the record, I tried subscribing to physics steps and render steps, for example:
```python
self.__physx_sub = omni.physx.acquire_physx_interface().subscribe_physics_step_events(self.__on_physics_step)
self.__render_sub = omni.kit.app.get_app().get_update_event_stream().create_subscription_to_pop(self.__on_render_step)
```
But it seems these do not perfectly coincide with the actual camera frames.
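For completeness, here is a simplified sketch of how those subscriptions sit inside my `CameraReader`; the callback bodies are stripped down to just grabbing the frame (in the real code the frame goes to my app):

```python
import threading

import omni.kit.app
import omni.physx


class CameraReader(threading.Thread):
    # ... __init__ / run / stop as shown above ...

    def subscribe(self):
        # Fires once per physics step; the callback receives the step's dt.
        self.__physx_sub = omni.physx.acquire_physx_interface().subscribe_physics_step_events(
            self.__on_physics_step
        )
        # Fires once per app update (render) step; the callback receives an event object.
        self.__render_sub = omni.kit.app.get_app().get_update_event_stream().create_subscription_to_pop(
            self.__on_render_step
        )

    def __on_physics_step(self, dt):
        # Grabbing the frame here does not line up 1:1 with new camera frames.
        frame = self.__camera.get_rgb()

    def __on_render_step(self, event):
        # Same here: update/render steps do not match the camera's actual frame rate.
        frame = self.__camera.get_rgb()
```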