Deploying the follow_me Codelet on Carter using a Virtual Gamepad

Dear Community,

I want to deploy the follow_me sample* (as found in Isaac/apps/samples/follow_me) on the NVIDIA Carter robot.

I built the codelet using Bazel and deployed it to the Jetson Xavier. When I run the application, it already shows the output of the ZED camera and also correctly detects the AprilTags.
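For reference, this is roughly the build-and-deploy workflow I use. The exact package target, `-d` platform value, and remote user depend on the Isaac SDK release, so treat this as a sketch rather than the exact commands:

```shell
# Build the sample on the host machine (target name may differ per SDK release)
bazel build //apps/samples/follow_me

# Deploy the packaged app to the Jetson using Isaac's deploy script.
# <robot_ip> is the Jetson's address; adjust --remote_user and -d as needed.
./engine/build/deploy.sh --remote_user nvidia \
    -p //apps/samples/follow_me:follow_me-pkg \
    -d jetpack43 \
    -h <robot_ip>
```

After deployment, the app is started on the Jetson from the deployed folder under the remote user's home directory.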

Now I want it to follow the AprilTag, i.e. move towards the tag. I have only two problems:

  1. I don’t have an obstacle map
  2. The controller only works when connected to my remote (desktop) computer, not when connected to the Jetson

My questions are:

  1. Is it obligatory to have an obstacle map? If so, how can I create one (using the Carter)?
  2. Does the controller need to be connected to the remote/host PC, or can it be connected directly to the Carter (i.e., the Jetson)?
  3. Controlling the Carter via the virtual gamepad with my host notebook's keyboard works fine so far, but I see no option to also trigger the deadman-switch button from the keyboard.
    Is there any possibility to control the deadman switch via the host notebook (e.g., by keyboard)?

Thanks for your help,


\* This is the documentation for the Kaya follow_me sample, but there is also a general follow_me sample.

I had the same question: how does one activate/configure the deadman switch with the host keyboard? Should it be added to the gamepad widget, or is the deadman switch only applicable to a PX4-like joystick?