Creating a custom camera in the Isaac Sim App

Hi,
We are trying to configure an RGB sensor in Isaac Sim to reflect the camera we are using in real life (an Intel RealSense D455) and generate synthetic data.
However, the instructions are a bit confusing to us.
In the Isaac SDK documentation (Carter Warehouse example), you can add and configure a sensor by editing carter_graph.json and carter_config.json.

Also, in the Training Jetbot example by @HaiLocLu, he mentions:
“When we initially created the camera, we used default values for the FOV and simply angled it down at the road. This initial setup did not resemble the real camera image (Figure 12). We adjusted the FOV and orientation of the simulated camera (Figure 13) and added uniform random noise to the output during training. This was done to make the simulated camera view as much like the real camera view as possible.”

However, we are not able to figure out how to configure an RGB sensor that mimics our camera, the Intel RealSense D455, which has the following specs:

  • RGB frame resolution: Up to 1280 × 720 [is this the cols/rows parameter?]
  • RGB frame rate: 30 fps [this has to be the Frequency parameter, right?]
  • RGB sensor technology: Global Shutter [is it possible to configure this?]
  • FOV (H × V): 90 × 65° [there is an FOV parameter, but we're not sure if it is the H or V one]
  • RGB sensor resolution: 1 MP [does this not matter since we already defined the resolution above?]

We are not sure this configuration is even possible using the Isaac Sim App; the tutorials seem to point to the Isaac SDK.

Any help on how to configure this sensor is appreciated.

Hi mau,

Perhaps get_camera_params can give you some info:

def get_camera_params(self, viewport):
    """Get active camera intrinsic and extrinsic parameters.

    Returns:
        A dict of the active camera's parameters.

        pose (numpy.ndarray): camera position in world coordinates,
        fov (float): horizontal field of view in radians
        focal_length (float)
        horizontal_aperture (float)
        view_projection_matrix (numpy.ndarray(dtype=float64, shape=(4, 4)))
        resolution (dict): resolution as a dict with 'width' and 'height'.
        clipping_range (tuple(float, float)): Near and Far clipping values.
    """
    stage = omni.usd.get_context().get_stage()
    prim = stage.GetPrimAtPath(viewport.get_active_camera())
    prim_tf = omni.usd.get_world_transform_matrix(prim)
    focal_length = prim.GetAttribute("focalLength").Get()
    horiz_aperture = prim.GetAttribute("horizontalAperture").Get()
    fov = 2 * math.atan(horiz_aperture / (2 * focal_length))
    width, height = viewport.get_texture_resolution()
    aspect_ratio = width / height
    near, far = prim.GetAttribute("clippingRange").Get()
    view_proj_mat = self.generic_helper_lib.get_view_proj_mat(prim, aspect_ratio, near, far)

    return {
        "pose": np.array(prim_tf),
        "fov": fov,
        "focal_length": focal_length,
        "horizontal_aperture": horiz_aperture,
        "view_projection_matrix": view_proj_mat,
        "resolution": {"width": width, "height": height},
        "clipping_range": (near, far),
    }
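
To go the other way (set a target FOV rather than read it back), you can invert the same formula. Here is a minimal sketch, assuming a camera prim at a hypothetical path /World/Camera; it keeps the prim's current focal length and solves for the horizontalAperture that gives the D455's 90° horizontal FOV:

import math
import omni.usd

CAMERA_PATH = "/World/Camera"  # hypothetical path, use your own camera prim
TARGET_HFOV_DEG = 90.0         # D455 horizontal FOV from the spec sheet

stage = omni.usd.get_context().get_stage()
cam_prim = stage.GetPrimAtPath(CAMERA_PATH)

# invert fov = 2 * atan(aperture / (2 * focal_length)) for the aperture
focal_length = cam_prim.GetAttribute("focalLength").Get()
horiz_aperture = 2 * focal_length * math.tan(math.radians(TARGET_HFOV_DEG) / 2)
cam_prim.GetAttribute("horizontalAperture").Set(horiz_aperture)

Note that the fov returned by get_camera_params is the horizontal one, so the 90° [H] value from the D455 spec is the number to target here.
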
  • RGB frame resolution: Up to 1280 × 720 [since the camera is attached to a viewport, this is the viewport's resolution]
    So when you create a new viewport for your camera, you can specify the resolution:
# create a new viewport and make the camera prim its active camera
viewport_handle_dofbot = omni.kit.viewport.get_viewport_interface().create_instance()
viewport_window_dofbot = omni.kit.viewport.get_viewport_interface().get_viewport_window(viewport_handle_dofbot)
viewport_window_dofbot.set_active_camera(prim_env_path + "/link4/Camera")
viewport_window_dofbot.set_window_pos(720, 0)

# match resolution of physical dofbot camera
viewport_window_dofbot.set_window_size(640, 480)
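
For the D455 itself, the same (legacy) viewport calls can be reused with its numbers; the camera path below is just a hypothetical placeholder for wherever your camera prim lives:

viewport_handle_d455 = omni.kit.viewport.get_viewport_interface().create_instance()
viewport_window_d455 = omni.kit.viewport.get_viewport_interface().get_viewport_window(viewport_handle_d455)
viewport_window_d455.set_active_camera("/World/D455/Camera")  # hypothetical camera prim path
viewport_window_d455.set_window_size(1280, 720)               # match the D455 RGB stream resolution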

More info: Python Utility Snippets — Omniverse Robotics documentation

Or specify it in the UI via the top-left cog icon, under Render Resolution.

Paul did a great tutorial on Cameras here too: Cameras — Omniverse Create documentation