Does Omniverse have a better solution for this particular problem than Unity3D?

We’re trying to build something we know we can do in Unity, but we’re not sure about Omniverse, and our contact requests through the NVIDIA Contact Form are going unanswered.

We want to stream JSON data through some I/O interface (maybe ROS2, maybe Kafka, maybe Omniverse XXX), map the JSON data to objects (home objects) in a VR-type simulated environment, and then run various models or analyses within the VR environment on the VR representations of the real-world data (possibly known as 3D scene reconstruction). If a particular event type happens in the VR environment, we want to send a POST request to a remote web server, again via our I/O layer.

We then want to deploy the simulation capability and the models to a live, local production environment.

We’re thinking possibly ROS2/Kafka → Omniverse Isaac Sim?

Any ideas and insight would be welcome. We need to start building it out shortly.

Hi @rich25. Yes, Omniverse can work well for this.

You can listen for changes to a scene with omni.usd.get_watcher(), or create custom events and listen for them, or check for changes on every update loop.
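
For example, here’s a rough sketch of the watcher and update-loop approaches (the prim path and subscription name are placeholders):

import omni.kit.app
import omni.usd
from pxr import Sdf

def _on_translate_changed(path):
    # Fires whenever the watched attribute changes on the stage.
    print(f"Changed: {path}")

# Watch a single attribute for changes.
watcher = omni.usd.get_watcher()
watch_sub = watcher.subscribe_to_change_info_path(
    Sdf.Path("/World/MyObject.xformOp:translate"),
    _on_translate_changed,
)

def _on_update(event):
    # Runs every frame; poll or inspect the scene here.
    pass

update_sub = (
    omni.kit.app.get_app()
    .get_update_event_stream()
    .create_subscription_to_pop(_on_update, name="my.extension.update")
)

# Keep both subscription objects referenced; they unsubscribe when garbage collected.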

For I/O, you could use the Python requests library to perform REST calls to your web server. Alternatively, your Omniverse simulation could include a microservice that can query the scene. We support HTTP and Kafka transports for our microservices: Introduction — Omniverse Microservices documentation
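
As a minimal sketch of the requests approach (the URL and payload are placeholders):

import requests

# Hypothetical endpoint and event payload; shape these to your server's API.
response = requests.post(
    "https://example.com/api/events",
    json={"event_type": "object_moved", "object_id": "sofa_01"},
    timeout=5,
)
response.raise_for_status()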

Hopefully this will start you in the right direction. Let me know if you have any more questions.

Hey Mati-

Thanks for getting back to me.

There doesn’t seem to be anywhere that explains the how of building these things. I’m trying to code an MVP, and there’s no step 1, just step 5 and on, know what I mean?

Sure, I can use a Python script, but where do I code it? How does it run when I start the application?

Then there are questions about developer environments: how do I save these files and the application to something like a GitHub repo for our team to share?

Lastly, we’ll need to use Redis streaming, as Kafka is too heavy. So where do I set up the code that runs the stream in the app? Is it in Omniverse Kit? Can I run simulations from the streaming data in Omniverse Kit? Do I have to download Isaac or something? None of this is clear.

These are all questions that just get buried under so much information about features that I can’t practically see how to implement this with our team at the moment.

Any clarification would be helpful.

Thank you!

No problem. It is a lot to take in. I just wanted to highlight that I do believe the tech is there for the problem you’re looking to solve.

If you haven’t already, I think you should start with these:

I watched all the videos, and I’m sorry, but it still really isn’t clear how Omniverse Kit can be used in a functional pipeline.

Let’s use the example of Redis, which may help illustrate the limitations I’m seeing.

Our app will be deployed headless, with the 3D simulated environment from Kit as just one small part of the simulation pipeline. Kit (I think) needs to ingest tons of data, in this case via Redis streaming. So I have to set up a Redis client in the app.

OK, no problem: normally you install Redis and use a simple script with the stream key. Make sure Redis is running in a separate Docker container, then:

pip install redis
python consumer.py

Boom, you’re streaming data into a consumer that can be accessed from your Python code.
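
For reference, consumer.py is nothing fancy; roughly this (the stream key is made up):

import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)
last_id = "$"  # only read entries added after we connect

while True:
    # Block up to one second waiting for new entries on the stream.
    entries = r.xread({"positions": last_id}, block=1000, count=10) or []
    for stream_key, messages in entries:
        for message_id, fields in messages:
            last_id = message_id
            print(stream_key, message_id, fields)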

Where do I do the pip install part? I can refer to the pipapi extension, but there isn’t an actual example of a working implementation: omni.kit.pipapi — kit-sdk 103.1 documentation

How do I know? I add the

[python.pipapi]
requirements = [
    ...,
    "redis==4.3.4",
    ...
]

to the extension.toml file in my new extension, and add this to the top of my extension.py:

import omni.kit.pipapi

omni.kit.pipapi.install(
    package="redis",
    version="4.3.4",
)

import redis

Guess what happens? Total freeze of the system. An actual working example would be helpful, as I’ve already lost a full day of work trying to deal with the simple basics here. I had to do a deep search to find out about pipapi to start with, which you would think would be covered in every video.

But that’s what I see.

Am I just all wrong? Should we be looking at something entirely different to take data from the real world, map it to digital twins that can run in a physics-based environment, run processing on the digital twins and their environment, and interact with our servers?

Hi @rich25. There are many ways you could incorporate real-world data. If your pipeline is already using Redis, I think that’s a fine approach.

We recently released a video about pipapi to help with the discoverability of this feature: PIP Package Installer for Python - YouTube. The install command you shared works for me, but it does block the UI thread the first time you run it, while it downloads and installs the package. This is why we recommend including your dependencies as part of your extension, so that your users don’t experience that.
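
One way to do that (a sketch, not the only option) is to pre-install the package into a folder inside the extension and put that folder on sys.path at startup, so nothing is downloaded at runtime:

pip install --target=exts/my.company.ext/pip_prebundle redis==4.3.4

and then at the top of extension.py:

import sys
from pathlib import Path

# "pip_prebundle" is just a folder name we chose; adjust the relative
# path to wherever the folder sits in your extension's layout.
sys.path.insert(0, str(Path(__file__).parent / "pip_prebundle"))

import redis  # resolves against the prebundled copy, no download needed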

Thanks for getting back to me @mati-nvidia. Sorry for the late reply; I had to get some of the ingested data structures working, and that took a while.

I’m hoping someone can explain this simply to me.

My goal is to never have to launch the application as it will run headless. The Isaac simulation environment is just part of a larger application pipeline.

Success at this point means being able to take an (x, y, z) coordinate that streams in from Redis and, after it passes through a pre-processing function, add it to the Isaac Sim scene, which I can then update continuously.

It seems to me the Redis feed should initialize itself in Omniverse. I can’t open an extension and click anything; it has to load programmatically via a service and be available in the Isaac app. That is, if I receive the data in an extension, it’s not clear how to access it from the rest of the Isaac app via Python.

My guess is that the full Python app has to parse the Redis data and add and remove things in the 3D Isaac Sim world based on it.
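
To make that guess concrete, this is roughly what I imagine the extension doing every frame (the prim path and stream key are invented):

import omni.kit.app
import omni.usd
import redis
from pxr import Gf, UsdGeom

r = redis.Redis(host="localhost", port=6379, decode_responses=True)
last_id = "0"  # start from the beginning of the stream; advance as we consume

def _on_update(event):
    global last_id
    # Non-blocking read so the render loop never stalls.
    entries = r.xread({"positions": last_id}, count=10) or []
    if not entries:
        return
    stage = omni.usd.get_context().get_stage()
    prim = stage.GetPrimAtPath("/World/Tracked")
    if not prim.IsValid():
        return
    for _stream, messages in entries:
        for message_id, fields in messages:
            last_id = message_id
            point = Gf.Vec3d(float(fields["x"]), float(fields["y"]), float(fields["z"]))
            UsdGeom.XformCommonAPI(prim).SetTranslate(point)

update_sub = (
    omni.kit.app.get_app()
    .get_update_event_stream()
    .create_subscription_to_pop(_on_update, name="redis.position.poll")
)

Is that even close to the intended pattern?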

So this leaves me with a couple of questions that I’m having trouble wrapping my head around.

Could you please walk through some examples of how you would do this (where does the main app live, how do I connect it to Isaac Sim, how do I manage the Python layer and the Isaac layer)? I don’t seem to be able to ask the right questions at the moment.

Yup. It sounds like you want to make a microservice extension. Have a look at these two videos:

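In the meantime, here’s a bare-bones sketch of registering an endpoint with omni.services (the route and handler are made up):

from omni.services.core import main

async def ingest_position(x: float, y: float, z: float):
    # A real handler would hand the sample off to code that updates the stage.
    return {"status": "ok", "received": [x, y, z]}

# Expose the handler on Kit's built-in web server.
main.register_endpoint("post", "/ingest/position", ingest_position)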