I can run any model like <model> <input source> rtp://<ip>:<port>
BUT in the case of actionnet and backgrounder, it seems they cannot be found as commands.
Is there any special requirement for these? What should I do, and how can I check whether I have all the necessary files?
If I don’t have them, how can I get them?
Hi @deeman, these types of models were added to jetson-inference recently, so you may need to update your repo and re-build/re-install the project to use them. If you are using the container, you may need to pull the latest image (although I’ve only updated the container images for L4T R32.7.1, L4T R35.1, and L4T R35.2).
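For reference, updating and rebuilding usually looks like the following. This is a sketch of the standard jetson-inference build-from-source steps; the repo path and container tag are assumptions you should adjust for your setup:

```shell
# Update the jetson-inference source tree (path assumed) and its submodules
cd ~/jetson-inference
git pull
git submodule update --init

# Re-run CMake and reinstall so the newly added tools get built
mkdir -p build
cd build
cmake ../
make -j$(nproc)
sudo make install
sudo ldconfig
```

If you run from the container instead, re-pulling the image for your L4T version picks up the updated tools, e.g. `docker pull dustynv/jetson-inference:r35.2.1` (tag assumed; match it to the L4T release on your device), or simply re-run `docker/run.sh` from an updated checkout of the repo.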