Adding contact sensors to shadow hand's fingertips - OmniIsaacGymEnvs

Hi there,

I’m currently modifying the ShadowHand example provided in OmniIsaacGymEnvs to control the hand with efforts rather than position control, and I’d like to train RL models on contact/force/tactile observations rather than “visual” ones (fingertip positions, angles, velocities, etc.), as done in the OpenAI examples for instance.


To simulate the pressure/tactile sensors of the real Shadow Hand, I want to attach a contact sensor (as defined here) to all five fingertips of every hand (i.e. of all cloned envs), so that I can read fingertip contact data at each step of my RL loop while manipulating an object.


In the set_up_scene method of my task class, I tried to create these sensors using the IsaacSensorCreateContactSensor command, with self.default_zero_env_path + "/shadow_hand/robot0_xxdistal" as the prim path (xx being replaced by ff, mf, etc., i.e. the five fingers). I then used the ContactSensorInterface in get_observations to read data from them. I only observed zero measurements (i.e. no contact and thus no force), so I tried to visualize the sensors in the simulator by setting the visualize flag to True in the creation command and giving the color parameter a shade of red. It turns out the sensors sit well above the hands (in the screenshot below, the red dots outline the spherical sensors) and do not move even though the fingers do, which is not what I want.
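For reference, here is a hedged sketch of that creation step as I understand it. The robot0_xxdistal link names follow the pattern above, and the exact argument list of IsaacSensorCreateContactSensor (thresholds, radius, sensor_period, etc.) varies between Isaac Sim releases, so treat the values below as placeholders rather than a known-good configuration:

```python
# Five distal (fingertip) links of the ShadowHand USD asset.
FINGERTIP_LINKS = ["robot0_ffdistal", "robot0_mfdistal", "robot0_rfdistal",
                   "robot0_lfdistal", "robot0_thdistal"]

def fingertip_sensor_parents(env_path):
    """Build the parent prim path for each fingertip contact sensor."""
    return [f"{env_path}/shadow_hand/{link}" for link in FINGERTIP_LINKS]

def create_fingertip_sensors(env_path, radius=0.01):
    # These imports are only available inside Isaac Sim.
    import omni.kit.commands
    from pxr import Gf

    for parent in fingertip_sensor_parents(env_path):
        omni.kit.commands.execute(
            "IsaacSensorCreateContactSensor",
            path="/contact_sensor",       # child prim created under `parent`
            parent=parent,
            min_threshold=0.0,
            max_threshold=1e7,
            radius=radius,
            color=Gf.Vec4f(1.0, 0.2, 0.2, 1.0),  # red, for visualization
            sensor_period=-1.0,           # read at every physics step
            translation=Gf.Vec3d(0.0, 0.0, 0.0),
            visualize=True,
        )
```

In a task, this would be called once per cloned env with env_path set to that env's prim path (e.g. self.default_zero_env_path for env 0).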

When setting use_flatcache to False in the task’s config file (similar to here), the hands appear standing vertically and no longer move. In this case, however, the sensors’ locations do match the positions of the fingertips’ distal joints of these vertically standing hands, as is (not easily) visible in the screenshot below.

Could someone help me solve this problem? That is, creating contact sensors that are attached to the fingertips (not to the joints, although that makes no sense to me anyway) and that move with them in simulation, so that contact data can be obtained from the interaction between the tips and the manipulated object.

Thank you very much for your help.

PS1: this topic is adapted from my GitHub issue.
PS2: I’m fairly new to Isaac Sim and Gym, so I may very well be completely wrong in my approach, don’t hesitate to tell me.

It turns out that I had not read this issue in its entirety and had not run the CPU pipeline. Running this pipeline solves the issue; very sorry for that.
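For future readers, switching OmniIsaacGymEnvs to the CPU pipeline can be done from the command line. This assumes the standard rl_games training script shipped with the repo; the script name and option names may differ between releases:

```shell
# Run the ShadowHand task with the CPU pipeline so that contact-sensor
# readings are available (option names may vary by release).
PYTHON_PATH scripts/rlgames_train.py task=ShadowHand pipeline=cpu
```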

However, this means I have to loop over the cloned environments to get data from my sensors in my task. Is there a way to use sensors and get data from them in a vectorized way?
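One vectorized alternative (an assumption on my part, not necessarily the answer linked below) is to wrap all fingertip links of all envs in a single RigidPrimView with contact-force tracking enabled; get_net_contact_forces then returns net forces for every fingertip of every env in one call. These APIs exist in recent omni.isaac.core releases, but argument names may differ by version:

```python
import numpy as np

def split_per_env(net_forces, num_envs, tips_per_hand=5):
    """Reshape a flat (num_envs * tips_per_hand, 3) contact-force array
    into (num_envs, tips_per_hand, 3) for per-env observations."""
    return np.asarray(net_forces).reshape(num_envs, tips_per_hand, 3)

def make_fingertip_view(scene):
    # Only valid inside an Isaac Sim task; shown for illustration.
    from omni.isaac.core.prims import RigidPrimView
    view = RigidPrimView(
        prim_paths_expr="/World/envs/.*/shadow_hand/robot0_.*distal",
        name="fingertip_view",
        track_contact_forces=True,  # enables get_net_contact_forces()
    )
    scene.add(view)
    return view

# In get_observations(), roughly:
#   forces = self._fingertip_view.get_net_contact_forces(clone=False)
#   obs = split_per_env(forces, self.num_envs)
```

This avoids any per-env Python loop: one view query covers all cloned hands at once.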

See the answer on GitHub.
