Hi, I have implemented a nice bipedal walking RL program for a custom-built robot on Isaac Gym Preview 3, but now I want to add an IMU (ideally an MPU-6050) into the RL training using Omniverse Isaac Sim.
I’ve got my robot up and running in Isaac Sim now. I initially tried to add a contact sensor for the feet, which wasn’t useful as I couldn’t find a way to implement the sensor in my RL pipeline across my 512 actors (there seemed to be no way of making a tensorised view?), so I went with getting the information from the rigid body net contact forces instead.
How do I implement the IMU across all the actors so it can be one of the observations for RL training?
Thanks Kelly
I will look at the RigidContactView implementation - is this a new feature of v2022.2? It’s hard to tell, as there doesn’t seem to be version-specific documentation. But I solved this already, as stated in my original post.
But how do I use the IMU for RL training data in a tensorised form so I can feed it into the observations for training all my 512 robots? That was my question.
Sincerely,
Sujit
Hi @sujitvasanth - To use the IMU for RL training data in a tensorized form, you would need to extract the IMU sensor data from each robot and convert it into a tensor that can be used as input to your RL model. Here’s a general approach for how you can do this:
1. Extract IMU Data: For each robot, you would need to extract the IMU sensor data. In Isaac Sim, you can use the omni.isaac.sensor extension to simulate the IMU sensor and extract the linear acceleration and angular velocity of the robot (a minimal sketch follows this list).
2. Extract IMU Data: For each robot, you would need to extract the IMU sensor data. In Isaac Sim, you can use the omni.isaac.sensor extension to simulate the IMU sensor and extract the linear acceleration and angular velocity of the robot.
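For step 1, here is a minimal single-robot sketch, assuming the IMUSensor class from the omni.isaac.sensor extension in Isaac Sim 2022.x; the prim path and mounting offset below are hypothetical placeholders, not from the original setup:

```python
import numpy as np
from omni.isaac.sensor import IMUSensor

# Hypothetical prim path -- point this at a link on your own robot asset.
imu = IMUSensor(
    prim_path="/World/envs/env_0/Robot/base_link/imu",
    name="imu",
    frequency=60,                             # sensor update rate in Hz
    translation=np.array([0.0, 0.0, 0.05]),  # local offset of the sensor on the link
)
imu.initialize()

# After each physics step, read the latest sample:
frame = imu.get_current_frame()  # dict with "time", "lin_acc", "ang_vel", "orientation"
lin_acc = frame["lin_acc"]       # (3,) linear acceleration in the sensor frame
ang_vel = frame["ang_vel"]       # (3,) angular velocity in the sensor frame
```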
I think there may be a truncation in your response, as point 1 and point 2 are the same - please can you update? I have experimented and yes, I can extract a single IMU’s data within RL as intended… but how do I tensorize the entire 512 instances’ data into a single tensor, all on the GPU? Could you show an example/gist of how to do this across all the scenes please? I’d prefer to use PyTorch.
Presumably the data is already all on the GPU, so it just needs a view?
Hi @sujitvasanth - Apologies. Here is the updated response:
1. Add IMU Sensor to Robots: First, you need to add an IMU sensor to each of your robots. In Isaac Sim, you can simulate the IMU sensor by extracting the linear acceleration and angular velocity of the body from the physics engine and applying the local transformation of the sensor to simulate the sensor output via the omni.isaac.sensor extension.
2. Retrieve Sensor Readings: To retrieve readings from the added sensors, you can create an ArticulationView and use the get_force_sensor_forces() API, which will return a buffer of dimension (num_articulations, num_sensors_per_articulation, 6).
3. Tensorize the Data: Once you have the sensor readings, you can convert them into a tensor form suitable for feeding into your RL model. You can use libraries like NumPy or PyTorch for this; a PyTorch example is sketched after this list.
4. Feed into Observations for Training: Now that you have your sensor readings in tensor form, you can feed them into your RL model as observations for training. The exact way to do this will depend on the specifics of your RL model and training setup.
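Putting steps 2-4 together, here is a hedged PyTorch sketch, assuming an omni.isaac.core ArticulationView over the 512 cloned environments and the GPU pipeline (torch backend); the prim path expression and the choice of extra observation terms are illustrative, not from your setup:

```python
import torch
from omni.isaac.core.articulations import ArticulationView

# Hypothetical path expression matching all 512 env clones.
robots = ArticulationView(prim_paths_expr="/World/envs/.*/Robot", name="robots_view")
# Register the view before the first reset, e.g. world.scene.add(robots).

# Step 2: one batched read for every robot at once.
forces = robots.get_force_sensor_forces()       # (512, num_sensors, 6)

# Step 3: with the GPU pipeline this already lives on the device as a torch
# tensor, so "tensorizing" is just a reshape -- no per-robot Python loop.
sensor_obs = forces.view(forces.shape[0], -1)   # (512, num_sensors * 6)

# Step 4: concatenate with the other observation terms into the buffer your
# RL library consumes (joint states here are only an example).
obs_buf = torch.cat(
    [robots.get_joint_positions(), robots.get_joint_velocities(), sensor_obs],
    dim=-1,
)
```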
So I have already attached one of these IMU sensors to my robot USD asset in Omniverse, but wanted to know how to tensorise them so they can be used in training data… please be specific on how to do this in PyTorch?
Hi there, the IMU from the omni.isaac.sensor extension does not have tensorized support. For tensorized APIs, we provide the RigidContactView class for retrieving contact forces.
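For reference, a minimal sketch of that tensorized contact path, assuming the omni.isaac.core RigidPrimView wrapper (which creates a RigidContactView internally when contact tracking is enabled); the foot prim path expression is hypothetical:

```python
from omni.isaac.core.prims import RigidPrimView

# Hypothetical expression matching both feet across all 512 environments.
feet = RigidPrimView(
    prim_paths_expr="/World/envs/.*/Robot/foot_.*",
    name="feet_view",
    track_contact_forces=True,  # enables the underlying RigidContactView
)
# After adding the view to the scene and resetting the world:
contact_forces = feet.get_net_contact_forces()  # (num_envs * num_feet, 3), on the GPU
contact_obs = contact_forces.view(512, -1)      # (num_envs, num_feet * 3)
```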
Your simulated IMU seems more advanced than just taking the forces acting on a rigid body - that’s important for the later sim-to-real deployment of the resulting neural network.