Error when running manipulation-sample-applications

I am following this tutorial.

When I run `bazel run apps/samples/manipulation:shuffle_box`, I hit the following error.

I set up Isaac following https://docs.nvidia.com/isaac/isaac/doc/setup.html on a clean OS install.
The Warehouse Navigation with Carter tutorial works fine.
I googled around and found a clue that rebuilding the TensorRT engine might help, but how do I rebuild TensorRT with the default Isaac setup?

2020-09-19 09:12:23.632 DEBUG packages/ml/TensorRTInference.cpp@444: Attempting to create inference runtime for TensorRT engine: external/sortbot_pose_estimation_models/resnet18_detector_kltSmall.plan
2020-09-19 09:12:23.871 ERROR packages/ml/TensorRTInference.cpp@168: TRT ERROR: INVALID_CONFIG: The engine plan file is generated on an incompatible device, expecting compute 7.5 got compute 6.1, please rebuild.
2020-09-19 09:12:23.871 ERROR packages/ml/TensorRTInference.cpp@168: TRT ERROR: engine.cpp (1324) - Serialization Error in deserialize: 0 (Core engine deserialization failure)
2020-09-19 09:12:23.871 ERROR packages/ml/TensorRTInference.cpp@168: TRT ERROR: INVALID_STATE: std::exception
2020-09-19 09:12:23.871 ERROR packages/ml/TensorRTInference.cpp@168: TRT ERROR: INVALID_CONFIG: Deserialize the cuda engine failed.

| Isaac application terminated unexpectedly |

Hello there,
first of all, please put a ` character or a ``` fence at the beginning and end of your code or logs, like this:

```
2020-09-19 09:12:23.632 DEBUG packages/ml/TensorRTInference.cpp@444: Attempting to create inference runtime for TensorRT engine: external/sortbot_pose_estimation_models/resnet18_detector_kltSmall.plan
2020-09-19 09:12:23.871 ERROR packages/ml/TensorRTInference.cpp@168: TRT ERROR: INVALID_CONFIG: The engine plan file is generated on an incompatible device, expecting compute 7.5 got compute 6.1, please rebuild.
2020-09-19 09:12:23.871 ERROR packages/ml/TensorRTInference.cpp@168: TRT ERROR: engine.cpp (1324) - Serialization Error in deserialize: 0 (Core engine deserialization failure)
2020-09-19 09:12:23.871 ERROR packages/ml/TensorRTInference.cpp@168: TRT ERROR: INVALID_STATE: std::exception
2020-09-19 09:12:23.871 ERROR packages/ml/TensorRTInference.cpp@168: TRT ERROR: INVALID_CONFIG: Deserialize the cuda engine failed.
| Isaac application terminated unexpectedly |
```

Maybe “is generated on an incompatible device, expecting compute 7.5 got compute 6.1” means that your device is not powerful enough to run the app. Looking at the TensorRTInference.cpp file around line 168 could help you better understand why the app throws this exception.
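If you want to double-check which compute capability the GPU on your machine actually reports, here is a quick sketch (assuming pycuda happens to be installed; the CUDA deviceQuery sample prints the same information):

```bash
# Print the compute capability of GPU 0, e.g. (7, 5) for an RTX 2060.
python3 -c "import pycuda.driver as drv; drv.init(); print(drv.Device(0).compute_capability())"
```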

Regards,
Planktos

Thank you for the reply. 😃
I am using an RTX 2060.
Is there any documentation on how to rebuild the plan file for TensorRT?

I solved the issue:
run `locate resnet18_detector_kltSmall.plan`
and delete that “resnet18_detector_kltSmall.plan” file; Bazel with Isaac will then regenerate a correct version (I guess),

and it runs successfully.
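In case it helps someone else, here is a minimal sketch of the steps, assuming `locate` is available; the path in the `rm` line and the ONNX file name in the optional trtexec step are placeholders/assumptions, not paths from the Isaac docs:

```bash
# Find the stale engine plan that was built for compute capability 6.1.
locate resnet18_detector_kltSmall.plan

# Delete it (replace the path with whatever locate printed above).
rm /path/printed/by/locate/resnet18_detector_kltSmall.plan

# Re-run the sample; Isaac/Bazel should rebuild a plan matching the local GPU
# (compute capability 7.5 on an RTX 2060).
bazel run apps/samples/manipulation:shuffle_box

# Optional alternative sketch: if you have an ONNX export of the detector
# (the file name here is hypothetical), trtexec can build a plan for the
# local GPU directly.
/usr/src/tensorrt/bin/trtexec --onnx=resnet18_detector.onnx \
    --saveEngine=resnet18_detector_kltSmall.plan
```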