Hi there,
I’m currently working on a fork of OmniIsaacGymEnvs in which I’d like to train RL models with the Shadow hand. Rather than using visual observations (as OpenAI did), I’d like to use “tactile” observations, i.e. basic contact sensors mounted on the fingertips. I have more or less managed to create such sensors and attach them to the fingertip prims (although they appear high above the fingertips in the simulator, but that’s a separate issue), and now I would like to visualize them properly in the simulation.
I use the `omni.kit.commands.execute("IsaacSensorCreateContactSensor", ...)` command to create my sensors, and I specify a `color` parameter and set the `visualize` boolean parameter to `True` in order to see the sensors in the simulator. They then appear as a kind of sphere outlined by red dotted lines (see the screenshot below; please zoom in to see the red dots), which is not very convenient for visualization.
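For reference, here is roughly the call I use; the parent path, radius, and threshold values below are placeholders for my actual ones:

```python
import omni.kit.commands
from pxr import Gf

# Create a contact sensor under a fingertip rigid body.
# "parent", "radius" and the thresholds are placeholders.
success, sensor_prim = omni.kit.commands.execute(
    "IsaacSensorCreateContactSensor",
    path="/contact_sensor",
    parent="/World/shadow_hand/robot0_ffdistal",  # fingertip prim path (placeholder)
    min_threshold=0.0,
    max_threshold=1000000.0,
    radius=0.01,
    color=Gf.Vec4f(1.0, 0.0, 0.0, 1.0),  # red
    sensor_period=-1,
    visualize=True,
)
# The second return value looks like the created USD prim,
# not a ContactSensor object (see my question below).
```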
I’ve seen that there is an `apply_visual_material` method in the `ContactSensor` class, and I’d like to use it to apply a visual material to my sensors and thus improve their visibility.
However, since I interact with these sensors through the contact sensor interface, I cannot figure out how to access the sensor objects directly in order to call that method. I tried using the output of the `IsaacSensorCreateContactSensor` command, but it does not seem to be the sensor object. I also tried to adapt this Python snippet, but it didn’t work.
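To be concrete, this is the kind of approach I was hoping would work, based on my reading of the docs. I’m assuming here that the `ContactSensor` wrapper from `omni.isaac.sensor` can wrap an already-existing sensor prim, and I’m using a `PreviewSurface` material from `omni.isaac.core.materials`; all prim paths are placeholders:

```python
import numpy as np
from omni.isaac.sensor import ContactSensor
from omni.isaac.core.materials import PreviewSurface

# Wrap the already-created sensor prim so I can call
# apply_visual_material on it (prim path is a placeholder).
sensor = ContactSensor(
    prim_path="/World/shadow_hand/robot0_ffdistal/contact_sensor",
    name="fftip_contact_sensor",
)

# A simple solid-color material to make the sensor clearly visible.
material = PreviewSurface(
    prim_path="/World/Looks/sensor_material",  # placeholder path
    color=np.array([1.0, 0.0, 0.0]),
)
sensor.apply_visual_material(material)
```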
Could someone please explain:
- how to get direct access to my sensor objects, so I can do more than just read data?
- how to apply a visual material to the sensors so they are properly visible?
Thanks in advance!