Hi @mmmmark! To get at these inference extensions, do you see this in Code/Create when you search “inference” in the Extension Manager? You just need to install and enable the extension.
Thanks @mati-nvidia, I have found these extensions (omni.inference.torch and omni.pip.torch) in Create's Extension Manager, but not in Code (v2022.3.1).
So does that mean I have to do this work in Create? I want to use the VS Code link to debug, but it doesn't work in Create (even though I have linked the extension path to Create).
I had a typo in my original response. You want to import TorchBackend for the torch extension. You can test it in the Script Editor to confirm first.
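A minimal sketch of the kind of smoke test you could paste into the Script Editor. Note the module path `omni.inference.torch` is an assumption based on the extension id in this thread; the thread only confirms the class name `TorchBackend`, so adjust the import to match the extension's actual Python module:

```python
def get_torch_backend():
    """Try to construct a TorchBackend; returns None if the
    omni.inference.torch extension is not available/enabled."""
    try:
        # Hypothetical import path, inferred from the extension id.
        from omni.inference.torch import TorchBackend
        return TorchBackend()
    except ImportError:
        # Running outside Omniverse Kit, or the extension is disabled.
        return None

backend = get_torch_backend()
print(backend)
```

If this prints `None` inside Create, double-check that the extension is enabled in the Extension Manager before debugging further.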
You need to update your .vscode/settings.json to add the path to the omni.inference.torch extension for the import warning to go away in VSCode and to get autocomplete.
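For reference, a `.vscode/settings.json` sketch using the Pylance `python.analysis.extraPaths` setting; the path below is a placeholder, so point it at wherever the omni.inference.torch extension is actually installed on your machine:

```json
{
    "python.analysis.extraPaths": [
        "<path-to-omniverse-install>/extscache/omni.inference.torch-<version>"
    ]
}
```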
When you include the dependency in your extension, remove the tag key/value. That should solve the error.
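In other words, the `[dependencies]` entry in your extension's `extension.toml` should look roughly like this (extension id taken from this thread), with no `tag` field in the value:

```toml
[dependencies]
"omni.inference.torch" = {}
```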
Thank you @mati-nvidia, I have solved these problems with your help. I would also like to know whether there are any more documents or examples about custom AI inference in Omniverse.