Get generic range sensor running

Hello everyone,

I started looking at Isaac Sim’s sensors using the sensor examples. While setting up the LIDAR works fine for me, I am running into problems with the Generic Range Sensor, even though it looks quite similar in most parts.

Creating the generic prim succeeds, but afterwards the sensor does not do anything: no line shows up, and the data I get from the sensor never changes, no matter whether I move or rotate the sensor or put an object in front of it.

For context: I am trying to simulate a simple static optical sensor that should detect when an object passes by on a conveyor. The sensor should be visualised by a laser, like the lidar sensor.

Here is my code snippet for creating the generic. Getting the location might be a bit complicated, but it worked for the lidar sensor. The function is called by a UI button at the moment.

import omni.usd
from pxr import Sdf, Usd, UsdGeom
from omni.isaac.core.utils.prims import delete_prim, get_prim_at_path
import omni.isaac.RangeSensorSchema as RangeSensorSchema

def _create_generic(self):
    # setup
    stage = omni.usd.get_context().get_stage()
    self.genericPath = "/World/Generic"

    # delete the generic instance if it was created earlier
    if get_prim_at_path(self.genericPath):
        delete_prim(self.genericPath)

    # The sensor should be created at a predefined cube location in the model:
    # I define an Xform on genericPath at that location and then override it
    # with the Generic sensor
    prim = stage.GetPrimAtPath("/World/roller_conveyor/Cube")
    xformCache = UsdGeom.XformCache(Usd.TimeCode.Default())
    transform = xformCache.GetLocalToWorldTransform(prim)
    translation = transform.ExtractTranslation()
    xformPrim = UsdGeom.Xform.Define(stage, self.genericPath)
    xformPrim.AddTranslateOp().Set(translation)

    # create the sensor
    self.generic = RangeSensorSchema.Generic.Define(stage, Sdf.Path(self.genericPath))
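A note on the missing debug line: in the Isaac examples the schema attributes are set right after Define(), since the sensor otherwise keeps its defaults. A minimal sketch, assuming the Generic schema exposes the usual range sensor attributes (enabled, drawLines, min/max range, sampling rate); the attribute names are my assumption and may differ between Isaac Sim versions, so check the RangeSensorSchema you have installed:

```python
# Sketch only: attribute names follow the range sensor schema pattern
# and are not verified against a specific Isaac Sim version.
self.generic.CreateEnabledAttr().Set(True)      # make the sensor actually scan
self.generic.CreateDrawLinesAttr().Set(True)    # visualise the beam
self.generic.CreateMinRangeAttr().Set(0.4)      # metres
self.generic.CreateMaxRangeAttr().Set(100.0)
self.generic.CreateSamplingRateAttr().Set(60)
```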



I then read the data with get_linear_depth_data() from the interface returned by _range_sensor.acquire_generic_sensor_interface().
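Once depth readings come back, the conveyor-detection logic itself is plain array math. A minimal sketch, assuming get_linear_depth_data() returns a 1-D array of ranges in metres and that rays which hit nothing report the sensor's max range (object_detected is a hypothetical helper, not part of the Isaac API):

```python
import numpy as np

def object_detected(depth, max_range=100.0, margin=0.05):
    """Return True if any ray hits something nearer than max_range.

    depth: 1-D array of linear depth readings in metres.
    margin: tolerance so readings at/near max range count as 'no hit'.
    """
    depth = np.asarray(depth, dtype=float)
    return bool(np.any(depth < max_range - margin))

# Nothing in front of the sensor: all rays report max range.
print(object_detected([100.0, 100.0, 100.0]))  # False
# A box on the conveyor 0.8 m away interrupts one ray.
print(object_detected([100.0, 0.8, 100.0]))    # True
```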

Can anyone help me get the sensor working? Any help would be appreciated. Thank you!

Best regards,

I managed to get it partially working with more code snippets from the Isaac example:

# requires: import omni.kit.app
self._editor_event_subscription = (
    omni.kit.app.get_app()
    .get_update_event_stream()
    .create_subscription_to_pop(self.new_on_editor_step)
)

And the function defining and calling the needed sensor pattern:

def new_on_editor_step(self, step):
    if not self._timeline.is_playing():
        return

    if self._generic and self._pattern_set:
        # send_next_batch returns True if the sensor is running out of
        # data and needs more
        if self._sensor.send_next_batch(self._genericPath):
            print("sending more data")
            self._sensor.set_next_batch_rays(self._genericPath, self.sensor_pattern)
    else:
        print("sensor not added or pattern not set")
def _test_repeating_data(self):
    batch_size = 10
    # np.zeros for a static sensor (no movement)
    azimuth = np.zeros(batch_size)
    zenith = np.zeros(batch_size)
    sensor_pattern = np.stack((azimuth, zenith))

    origin_offsets = np.zeros((batch_size, 3))
    return sensor_pattern, origin_offsets
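If the ray pattern (rather than the prim transform) is used for aiming, the azimuth array is the knob to turn. A sketch under an assumed spherical convention: azimuth measured in the x-y plane from +x, zenith as elevation out of that plane. Check how your Isaac Sim version actually defines the two angles before relying on this:

```python
import numpy as np

def ray_directions(azimuth, zenith):
    """Unit ray directions for azimuth/zenith angle pairs (radians).

    Assumed convention: azimuth rotates in the x-y plane from +x,
    zenith is elevation out of that plane.
    """
    return np.stack(
        (np.cos(zenith) * np.cos(azimuth),
         np.cos(zenith) * np.sin(azimuth),
         np.sin(zenith)),
        axis=-1,
    )

batch_size = 10
azimuth = np.full(batch_size, np.pi / 2)  # aim every ray along +y
zenith = np.zeros(batch_size)             # no elevation
dirs = ray_directions(azimuth, zenith)
print(np.allclose(dirs, [0.0, 1.0, 0.0]))  # True: all rays point along +y
```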

With the above setup I ran into another problem. The sensor is “looking” in the negative x-direction, but I need it to look in the positive y-direction. Neither adding an AddRotateXYZOp() nor editing the azimuth array works as expected: the values I pass seem unrelated to the actual outcome, and the sensor only sweeps within an approximate 30-degree cone regardless of my input. For example, increasing the rotate op's value moves the “laser beam” clockwise up to a value of about 90 degrees (just the value; the actual rotation doesn't correlate with it) and then back anticlockwise. You can see this in the image: I set the z-value of the Rotation property to about 120 degrees, but the actual rotation is nowhere near that.

Am I missing something? Has anyone else faced this issue, or does anyone have a suggestion on how to properly align the sensor in the desired direction? Any help would be greatly appreciated!

It must have been a bug. I recreated the Xform that I had inserted during modelling for later positioning, this time rotating it by 90° from the start, and now the rotation works as expected.
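For anyone landing here with the same symptom, the 90° figure can be sanity-checked with plain rotation matrices. Assuming the sensor's default view direction is -x (as observed above) and a right-handed z-up frame, a -90° rotation about z maps it onto +y; whether the UI shows that as +90° or -90° depends on your scene's conventions:

```python
import numpy as np

def rot_z(deg):
    """Rotation matrix about the z axis (right-handed, degrees)."""
    t = np.radians(deg)
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

view = np.array([-1.0, 0.0, 0.0])  # default view direction: -x
print(np.allclose(rot_z(-90) @ view, [0.0, 1.0, 0.0]))   # True: -90 deg gives +y
print(np.allclose(rot_z(90) @ view, [0.0, -1.0, 0.0]))   # True: +90 deg gives -y
```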

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.