Radiation value visualization


I am researching digital twins for robotics using Omniverse.

I am planning to visualize radiation levels in Omniverse with a colormap.

In the case of Gazebo, I know it is possible to implement a colormap through ROS, so I wonder whether a similar function can be implemented in Omniverse.

This is the paper I referenced: [Robotics | Free Full-Text | Simulating Ionising Radiation in Gazebo for Robotic Nuclear Inspection Challenges]

Thank you.

Hello @swimpark! I moved your post to our IsaacSim forum so that the team can take a look at your questions. I also informed the development team. Thanks for reaching out to us!

Thanks for the reference, I’ll take a look at the paper and see what the implications and possibilities are for the Omniverse platform and the Isaac Sim application in particular.

Off the top of my head, I think you could try using the debug drawing with the top-down view: Debug Drawing Helper — Omniverse Robotics documentation

If you want blending, you can create a PointInstancer that references a sphere (or cube, mesh, USD prim, etc.) and replicates that sphere many times with different transforms. That one sphere can carry the blending material. You can also have multiple point instancers, each pointing to a differently colored sphere.

You can search for PointInstancer in our code base.

# Requires omni.usd and the pxr USD bindings available inside Omniverse
import omni.usd
from pxr import Gf, UsdGeom

def _draw_instances(self):
    instancePath = "/occupancyMap/occupiedInstances"
    cubePath = "/occupancyMap/occupiedCube"
    pos_list = self._om.get_occupied_positions()
    scale = self.cell_size.model.get_value_as_float() * 0.5
    color = (0.0, 1.0, 1.0)
    stage = omni.usd.get_context().get_stage()
    # Clear out prims left over from a previous draw
    if stage.GetPrimAtPath(instancePath):
        stage.RemovePrim(instancePath)
    point_instancer = UsdGeom.PointInstancer(stage.DefinePrim(instancePath, "PointInstancer"))
    positions_attr = point_instancer.CreatePositionsAttr()
    positions_attr.Set(pos_list)
    if stage.GetPrimAtPath(cubePath):
        stage.RemovePrim(cubePath)
    # One prototype cube, scaled to half the cell size and colored
    occupiedCube = UsdGeom.Cube(stage.DefinePrim(cubePath, "Cube"))
    occupiedCube.AddScaleOp().Set(Gf.Vec3d(1, 1, 1) * scale)
    occupiedCube.CreateDisplayColorAttr().Set([color])
    # Every point instances prototype 0 (the cube)
    point_instancer.CreatePrototypesRel().SetTargets([cubePath])
    proto_indices_attr = point_instancer.CreateProtoIndicesAttr()
    print("total points drawn: ", len(pos_list))
    proto_indices_attr.Set([0] * len(pos_list))



Thanks for your kind reply.

I also wonder if I can add some sensors.

I want to observe the interaction between ionizing radiation values and my robot in an Isaac Sim environment.

For that, a customized sensor that can measure the radiation value as a function of distance from the radiation source is necessary.

If there is no way to add or customize a sensor yet, is there a way to achieve this by modifying an existing function?

Thank you !

If you know where the radiation core is, you can calculate the distance between the two points yourself.
There is also the raycast function, which shoots rays and returns distances.
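For a first pass, a simple point-source model built on that distance is often enough: apply inverse-square falloff from the source. A minimal sketch in plain Python (the function name, the activity parameter, and the optional attenuation term are illustrative assumptions, not an Isaac Sim API):

```python
import math

def radiation_at(sensor_pos, source_pos, source_activity, attenuation=0.0):
    """Hypothetical point-source model: intensity falls off with the
    inverse square of the distance, with optional exponential
    attenuation (e.g. for air or shielding)."""
    dx = sensor_pos[0] - source_pos[0]
    dy = sensor_pos[1] - source_pos[1]
    dz = sensor_pos[2] - source_pos[2]
    r = math.sqrt(dx * dx + dy * dy + dz * dz)
    if r == 0.0:
        return float("inf")  # sensor sits on the source
    return source_activity / (4.0 * math.pi * r * r) * math.exp(-attenuation * r)
```

You would call this each simulation step with the robot's current position to get a reading for the custom "sensor".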

I wonder if you could also use semantic segmentation with a top-down or perspective view for this. It depends on how you map the radiation value to a segmentation color. A depth camera can also be used to read distances.
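However the value is read back, it still has to be mapped to a color for display. A minimal sketch of a linear blue-to-red ramp (a stand-in for a real colormap; the function name and range handling are assumptions):

```python
def value_to_color(value, vmin, vmax):
    """Map a scalar reading to an RGB tuple on a blue (low) to red (high) ramp."""
    # Normalize into [0, 1], clamping out-of-range readings
    t = (value - vmin) / (vmax - vmin)
    t = max(0.0, min(1.0, t))
    # Interpolate blue -> red; the tuple can feed displayColor on a prim
    return (t, 0.0, 1.0 - t)
```

The resulting tuple could be written to a prim's displayColor, or used to pick which colored prototype a point instancer references.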


Thanks for your kind reply.

Do you have any plans to add functions such as radiation or temperature sensing?

If so, when are you planning to add them?

Hi swimpark,

It is part of our plan to add these temperature-based sensors, but not in the near future.