How to use Inference API Extensions

I want to run a PyTorch inference workload in an Omniverse extension, but I cannot find the Inference API Extensions mentioned in the documentation.

I would also like to know how to use the code found in the documentation:

What libraries and dependencies do I need to import?


Hi @mmmmark! To get these inference extensions, do you see them in Code/Create when you search for “inference” in the Extension Manager? You just need to install and enable the extension.

We have some more documentation here: Inference API Extensions — Omniverse Extensions documentation, but I agree that some imports would help. For the most part, it looks like the imports should look like this:

from omni.inference.torch import TorchBackend

I’ll ask the team to post the API docs for these extensions. For now, you can see here how I inspected the API for this extension: 6 Crucial Tips For New NVIDIA Omniverse Developers - YouTube
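In the meantime, a generic way to inspect a module's public API from the Script Editor is Python's built-in `inspect` module. Here is a hedged sketch of that approach, demonstrated on the stdlib `json` module since the inference extension's exact contents aren't documented yet:

```python
import inspect
import json  # stand-in module; inside Omniverse you could inspect omni.inference.torch instead

def list_public_api(module):
    """Return the public classes and functions defined in a module."""
    return sorted(
        name
        for name, obj in inspect.getmembers(module)
        if not name.startswith("_")
        and (inspect.isclass(obj) or inspect.isfunction(obj))
    )

print(list_public_api(json))
```

Running the same helper on `omni.inference.torch` inside Omniverse should list whatever classes (such as `TorchBackend`) and functions the extension exposes.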

Thanks @mati-nvidia. I have found these extensions (omni.inference.torch and omni.pip.torch) in Create’s Extension Manager, but not in Code (v2022.3.1).
Does that mean I have to do this work in Create? I want to use the VS Code link to debug, but it doesn’t work in Create (even though I have linked the extension path to Create).

@mati-nvidia I have installed and enabled those extensions in Create, but I still can’t import them as shown in your reply.

I have seen the installation section in the Inference API Extensions documentation, like this:

so should I add this line to /config/extension.toml in my custom extension, like this?

but I got an error:

Maybe once I solve this problem I can use the Inference API in my OV extension?

Or should I use pip packages in OV to do the PyTorch inference work myself?
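One way to answer that at runtime is a guarded import check in the Script Editor. This sketch assumes nothing about the extension beyond the `TorchBackend` import mentioned earlier in the thread; it simply reports whether that import currently works:

```python
def inference_backend_available() -> bool:
    """Return True if the omni.inference.torch extension's module can be
    imported (i.e. the extension is installed and enabled in the current
    Kit app), False otherwise."""
    try:
        from omni.inference.torch import TorchBackend  # noqa: F401
        return True
    except ImportError:
        return False

print(inference_backend_available())
```

If this prints `False` even with the extension enabled, the dependency or path configuration is the likely culprit rather than the code itself.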

A few things:

  1. I had a typo in my original response. You want to import TorchBackend for the torch extension. You can test it in the Script Editor to confirm first.
  2. You need to update your .vscode/settings.json to add the path to the omni.inference.torch extension for the import warning to go away in VSCode and to get autocomplete.
  3. When you include the dependency in your extension, remove the tag key/value. That should solve the error.
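Putting items 2 and 3 together, the two config fragments might look like the following. This is a sketch only: the exact install path is an assumption, so substitute the one on your machine.

```toml
# config/extension.toml of your custom extension: depend on the inference
# extension with a plain entry, i.e. without a `tag = ...` key/value.
[dependencies]
"omni.inference.torch" = {}
```

and in `.vscode/settings.json`:

```json
{
    // Hypothetical path; point this at the folder where the Extension
    // Manager actually installed omni.inference.torch (the directory
    // containing its top-level `omni` package).
    "python.analysis.extraPaths": [
        "C:/path/to/extscache/omni.inference.torch"
    ]
}
```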

Thank you @mati-nvidia, I have solved these problems with your help. But I would also like to know if there are any more documents or examples about custom AI inference in Omniverse.

Excellent! I don’t think so, but I’ll ask. You could look at the implementations for the AI ToyBox Extensions and possibly learn from those.

Hi @mmmmark. Our devs recommended these resources:

The GANverse3D repo has examples that are probably the clearest: GANverse3D Extension — Omniverse Extensions documentation
Animal Explorer does as well: AI Animal Explorer Extension — Omniverse Extensions documentation
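For completeness, if you go the plain pip route instead (e.g. torch provided by omni.pip.torch), a minimal inference call is just eval mode plus `torch.no_grad()`. A sketch, assuming torch is importable in your Kit environment:

```python
# Hedged sketch: minimal PyTorch inference, assuming torch is available
# (for example via the omni.pip.torch extension).
try:
    import torch
except ImportError:  # outside Omniverse / torch not installed
    torch = None

def run_inference(model, inputs):
    """Run a model in eval mode with gradients disabled.

    Returns the model output, or None when torch is unavailable."""
    if torch is None:
        return None
    model.eval()           # disable dropout/batch-norm training behavior
    with torch.no_grad():  # skip autograd bookkeeping for inference
        return model(inputs)
```

Usage would be something like `run_inference(my_model, torch.ones(1, 3, 224, 224))`, where `my_model` is any `torch.nn.Module` you have loaded.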

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.