I want to deploy the follow_me sample* (as found in Isaac/apps/samples/follow_me) on the NVIDIA Carter robot.
I build the codelet using Bazel and deploy it to the Jetson Xavier. When I run the application, it already shows the output of the ZED camera and correctly detects the AprilTags.
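For reference, this is roughly how I deploy from my host machine (a sketch based on the Isaac SDK deploy script; the exact `-d` device target, package label, and remote user are assumptions that depend on the SDK and JetPack version in use):

```shell
# Run from the Isaac SDK root on the host machine.
# Builds the sample package and copies it to the Jetson over SSH.
# <jetson_ip>, the -d target (e.g. jetpack43), and the remote user
# are placeholders for my particular setup.
./engine/build/deploy.sh -p //apps/samples/follow_me:follow_me-pkg \
    -d jetpack43 -h <jetson_ip> --remote_user nvidia
```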
Now I want it to follow the AprilTag, i.e., move toward the tag. I have two problems:
- I don’t have an obstacle map
- The controller only works when connected to the remote computer (not to the Jetson)
My questions are:
- Is an obstacle map mandatory? If so, how can I create one (using the Carter)?
- Does the controller need to be connected to the remote PC, or to the host PC (i.e., the Carter)?
- Controlling the Carter via the virtual gamepad with the keyboard of my host notebook works fine so far, but I see no option to also trigger the deadman-switch button from the host keyboard.
Is there any possibility to control the deadman switch via the host notebook (e.g., by keyboard)?
Thanks for your help,
* This is the documentation for the Kaya follow_me sample, but there is also a general follow_me sample.