I haven’t seen any examples using Isaac Gym or Sim to simulate flying robots/drones, so I was wondering whether this is currently feasible. I don’t need any aerodynamics, just 4 controllable thrust actuators.
I think you could start prototyping by applying self._dc.apply_body_force to your drone body (which could be just a flat box).
See how surface_gripper.py uses apply_body_force for an example.
I would definitely agree that you should look more at the Isaac Sim side for this, as @HaiLocLu is suggesting. Isaac Gym is more oriented around training RL models with physics interactions. For the most part, you want your drone to avoid physical interactions.
Actually I do want the drone to also interact physically with the environment. Is Gym more suited for this somehow?
Gym is suited to situations where you want to do reinforcement learning that involves physical interaction across thousands of agents at a time. For example, if you want to train a task like humanoid locomotion or object manipulation, where a big part of the feedback to your agent involves physical contact information, Gym is tuned to handle this very quickly. Gym provides APIs that deliver this kind of feedback in tensors that stay on the GPU, so learning algorithms can access them directly.
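To illustrate the batched pattern this enables (plain NumPy here for a runnable sketch; in Isaac Gym the equivalent buffers would be GPU torch tensors, and the field names below are invented for the example), rewards and reset flags for thousands of environments are computed in one vectorized pass rather than a per-agent Python loop:

```python
import numpy as np

NUM_ENVS = 4096  # thousands of parallel environments in one simulation

# Hypothetical per-env state buffers, shaped (num_envs, ...), mirroring
# how batched simulation state is exposed as large tensors.
rng = np.random.default_rng(0)
heights = rng.uniform(0.0, 2.0, size=NUM_ENVS)          # e.g. torso height
contact_forces = rng.uniform(0.0, 50.0, size=NUM_ENVS)  # e.g. contact magnitude

# One vectorized expression covers every agent at once; with GPU-resident
# tensors there is also no host round-trip before the learner sees the data.
rewards = heights - 0.01 * contact_forces
resets = heights < 0.2  # boolean mask of envs that fell over and need reset
```

The point is that contact-heavy feedback for all 4096 agents is produced by a handful of array operations, which is what makes the massively parallel RL workflow fast.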
Unless you are trying to do something like training a drone to land on a complex surface, I’m not sure that you would need this for your use case.
I would think that the type of sensors you would care about more for a drone application would be cameras or lidars. While Gym has some support for basic 2D rendering, it would not be able to provide the level of visual fidelity you would get with Isaac Sim. Isaac Sim also supports lidar.
I see, thanks for the tips Gav!