Nice tools - a couple of questions for navigation

Hi, the Isaac Sim tools look pretty awesome. I have a couple of questions.

  • Is there a way to run this simulator with a real-time factor like Gazebo? If I need a certain image quality, I would be happy to run all processes at half or quarter speed just so I can guarantee the same FPS while physics is slowed down to match. This does not need to be automatic like in Gazebo; I would be happy to set it to a fixed value.
  • Stereo camera: I read in another post that the next release, which should be available this month, will come with stereo camera support. This is a major requirement for my research, so it would be great to know whether this is still the plan.
  • Level of detail: for larger environments, game engines reduce object quality depending on how close you are. Is this an available or planned feature?
  • IMU support: for navigation, an IMU is an important sensor. Not as important as a camera, but it would also be handy to have.

Hi Phil,

Stereo camera support is included in our next release, coming out June 10th.
We do currently support an IMU in Isaac Sim, but only as ground-truth pose. If you are looking for a device-level IMU with noise and other real-sensor characteristics, that will be available towards the end of summer.

Thank you for the update, I am looking forward to the stereo camera support!

@phil I am assuming that you are using ROS because you mention Gazebo :)
We have some new samples which might be useful:
https://docs.omniverse.nvidia.com/app_isaacsim/app_isaacsim/sample_ros_stereo.html
https://docs.omniverse.nvidia.com/app_isaacsim/app_isaacsim/sample_ros_nav.html

By default the simulator runs in a real-time mode, but if performance drops too far, things will get less stable. We will look into how best to document this. When running in non-UI mode (pure Python) you can control the step size (for rendering and physics) exactly: https://docs.omniverse.nvidia.com/app_isaacsim/app_isaacsim/sample_python_basic.html#time-stepping. We don’t have a similar sample with ROS yet.
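For example, here is a rough sketch of fixed-step stepping in headless mode, based on the time-stepping sample linked above (the exact OmniKitHelper config keys and API may differ between releases):

```python
# Sketch only, following the time-stepping sample above; config keys
# and the OmniKitHelper API may differ between releases.
from omni.isaac.python_app import OmniKitHelper

kit = OmniKitHelper(config={"headless": True})

# Each update advances rendering and physics by a fixed dt, so the
# simulation stays deterministic even when a step takes longer than
# 1/60 s of wall-clock time (an implicit real-time factor).
for _ in range(600):  # 10 simulated seconds at 60 Hz
    kit.update(1.0 / 60.0)

kit.shutdown()
```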


Yes, I am running ROS; all our other systems run ROS, which makes ROS support a primary requirement.
Thank you for the new samples. The new stereo camera setup is nice; I tested it today.
Your answer covers some of my questions from this post too: ROS camera framerate
Is it possible to have different frame rates on different topics in the pure Python mode?

That isn’t currently exposed in the API, but I will make a note of it in our feature tracker (likely for the next release).

We experimented with this idea for the Isaac SDK bridge, where you would leave all of the USD components in a disabled state and then selectively “tick” a component each frame to make it publish. This way you could have one component publish camera images every frame, another publish lidar every other frame, etc.

There is no reason not to have this capability on the ROS side as well. Will look into adding it.
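The pattern would look roughly like the sketch below, where tick_component() and the component handles are placeholders rather than a real Isaac Sim API:

```python
# Sketch of per-topic publish rates via selective ticking; the handles
# and tick_component() are placeholders, not a real Isaac Sim API.
def tick_component(component):
    """Placeholder: trigger one publish on a disabled bridge component."""
    print(f"ticked {component}")

camera, lidar = "ros_camera", "ros_lidar"  # placeholder component handles

for frame in range(600):       # one iteration per simulation frame
    # ... advance physics and rendering by one fixed step here ...
    tick_component(camera)     # camera publishes every frame
    if frame % 2 == 0:
        tick_component(lidar)  # lidar publishes every other frame
```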

Cool, looking forward to a Python-ROS sample then.
While I have your attention, the current stereo sample also needs a fix for the camera_info topic. Stereo cameras have their baseline defined in camera_info as well, in the Tx value (P[3]) of the right camera. The details on how to calculate this can be found here: sensor_msgs/CameraInfo Documentation
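For reference, the relation from the CameraInfo docs is Tx = -fx * baseline for the right camera (the numbers below are made up):

```python
# Per sensor_msgs/CameraInfo: for the right camera of a horizontal
# stereo pair, P[3] = Tx = -fx * B, where B is the baseline in meters.
fx = 554.25      # hypothetical focal length in pixels
baseline = 0.12  # hypothetical baseline in meters

Tx = -fx * baseline  # -66.51; goes into P[3] of the right camera

# Row-major 3x4 projection matrix P with assumed cx=320.0, cy=240.0:
P = [fx,  0.0, 320.0, Tx,
     0.0, fx,  240.0, 0.0,
     0.0, 0.0, 1.0,   0.0]
```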

Thanks! Will work on getting this fixed in the next bugfix release.
It will likely be exposed as an extra parameter on the ROS camera component where you can manually specify the Tx, Ty values.


@phil 2021.1.1 was released this week, and docs went live last night:

See https://docs.omniverse.nvidia.com/app_isaacsim/app_isaacsim/sample_python_ros.html
for how to run ROS from native Python. We added samples for running the stereo camera, plus a simpler clock.py sample for manual ticking and rospy usage.
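The manual-ticking pattern boils down to publishing sim time on /clock once per fixed step; here is a minimal rospy illustration (not the clock.py sample itself, just the generic pattern):

```python
# Minimal illustration of the manual-ticking pattern; this is not the
# clock.py sample, just a generic rospy /clock publisher.
import rospy
from rosgraph_msgs.msg import Clock

rospy.init_node("sim_clock")
pub = rospy.Publisher("/clock", Clock, queue_size=10)

sim_time = 0.0
dt = 1.0 / 60.0  # fixed simulation step
rate = rospy.Rate(60)
while not rospy.is_shutdown():
    # ... advance the simulation one fixed step here ...
    sim_time += dt
    pub.publish(Clock(clock=rospy.Time.from_sec(sim_time)))
    rate.sleep()
```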

Also added Tx, Ty (called stereoOffset) to the ROS camera component, but it’s not used in any samples yet.
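Until a sample lands, setting it might look roughly like this via USD; the prim path and value type here are guesses, and only the stereoOffset name comes from the release above:

```python
# Hypothetical sketch: prim path and value type are assumptions; only
# the attribute name "stereoOffset" comes from the post above.
from pxr import Usd, Gf

stage = Usd.Stage.Open("stereo_scene.usd")                 # placeholder stage
right_cam = stage.GetPrimAtPath("/World/ROS_CameraRight")  # placeholder path

fx, baseline = 554.25, 0.12  # hypothetical intrinsics
right_cam.GetAttribute("stereoOffset").Set(Gf.Vec2f(-fx * baseline, 0.0))
```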


Awesome, thank you for the quick turnaround!