I was going through the tutorials in the SDK docs about creating camera feeds and OpenCV. Then I reached the part where you are supposed to connect to a “sim” camera coming from the Carter robot being simulated in Unity (I believe this workflow is deprecated in favor of Omniverse?).
I succeeded in getting the image sequence over a bridge to ROS RViz, but I would like more specifics on how to use the SDK and how to set up Omniverse to stream to it.
Hi @clayton.allen.ks, communication from the SDK to the Unity and Omniverse versions is similar, as both use the TCP Publisher and Subscriber; you shouldn’t need any major modifications for most of the samples.
Here you can find some of the OV-related examples, which also include camera topics: Isaac Sim Built on Omniverse — ISAAC 2021.1 documentation
I have read this page a few times now, and not all of the provided examples work as advertised. I was looking for a more succinct write-up on how to code/edit files to make this possible. I feel the documentation points everyone to running samples rather than explaining how to develop.
On the Omniverse side you need to:
On the SDK side you need to:
- Create an application graph with a TCPSubscriber node (to receive messages from sim) and a TCPPublisher node (to send messages). The simplest example app is under sdk/packages/navsim/apps/navsim.app.json, although in our example apps we often wrap this in a simulation subgraph.
- If you run navsim.app.json you can already connect to a running bridge app on the OV side. If you want to do something with the feeds, or view them on the SDK side, you need to connect these channels to other components in the SDK app, though.
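As a minimal sketch of such an application graph (node names, ports, and the module list here are illustrative assumptions, not the exact contents of navsim.app.json; match them to your Omniverse bridge settings):

```json
{
  "name": "navsim_tcp_example",
  "modules": ["engine_tcp_udp"],
  "graph": {
    "nodes": [
      {
        "name": "simulation",
        "components": [
          { "name": "message_ledger", "type": "isaac::alice::MessageLedger" },
          { "name": "input",  "type": "isaac::alice::TcpSubscriber" },
          { "name": "output", "type": "isaac::alice::TcpPublisher" }
        ]
      }
    ],
    "edges": []
  },
  "config": {
    "simulation": {
      "input":  { "host": "localhost", "port": 55000 },
      "output": { "port": 55001 }
    }
  }
}
```

The subscriber receives messages published by the sim-side bridge, and the publisher sends SDK-side messages back; the host/port pair must mirror what the bridge app on the OV side is configured to use.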
- You should check the navsim apps as examples. For instance, in sdk/apps/navsim/navsim_navigate.app.json we have an example of navigating a robot where we get the camera feeds from simulation. You can see there is a simulation subgraph and the camera_viewer nodes. If you don’t need anything else, you could remove most of the other nodes and edges.
The channel names in the simulation.interface subgraph are given by what you have defined in the REB_Camera properties. In this case, for the RGB camera, the default output channel for the color component is output/color:
- You could also just add the camera_viewer nodes and respective edges to the navsim.app.json mentioned in the first point. We tend to create subgraphs for better modularity and to avoid very long graph app files.
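To make the wiring concrete, here is a sketch of a camera_viewer node and the edge that feeds it. The ImageViewer input channel name ("image") and the exact edge source are assumptions; verify them against your SDK version and your REB_Camera channel names:

```json
{
  "modules": ["viewers"],
  "graph": {
    "nodes": [
      {
        "name": "camera_viewer",
        "components": [
          { "name": "message_ledger", "type": "isaac::alice::MessageLedger" },
          { "name": "viewer", "type": "isaac::viewers::ImageViewer" }
        ]
      }
    ],
    "edges": [
      {
        "source": "simulation.interface/output/color",
        "target": "camera_viewer/viewer/image"
      }
    ]
  }
}
```

The edge source is the subgraph channel exposed by the simulation interface (here output/color, matching the REB_Camera setting mentioned above), and the target is the viewer's receiving channel.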
- Once you have your app set up, you can run it like any other SDK app (bazel run …) and see the images in the viewer in Isaac Sight (localhost:3000).
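For that last step, a typical invocation might look like the following; the bazel target path is an assumption for illustration and should point at wherever your app's BUILD file lives:

```
# Build and run the app (target path is illustrative)
bazel run //apps/navsim:navsim_navigate

# Then open Isaac Sight in a browser to view the camera feed:
#   http://localhost:3000
```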
@TeresaC can you please provide the code from the sdk application to run the described example?
I’m trying to run it, but it doesn’t work…
Here is more info:
BUILD (731 Bytes)
navsim.app.json (1.6 KB)