I found the PeopleNet AMR model here and wanted to try it out. I tried converting it to a TensorRT engine file with:
/usr/src/tensorrt/bin/trtexec --onnx=/models/pn_robotics_v03_ph0_noreg.onnx --saveEngine=/models/amr_peoplenet.engine
However, I get this error:
[04/23/2025-17:12:14] [I]
double free or corruption (out)
Aborted (core dumped)
Attached is the full log: peoplenet_engine_gen.log (4.2 KB)
Runtime Env:
- Host: Orin AGX
- JetPack: 6.2 (L4T 36.4.3)
- Container: dustynv/jetson-inference:r36.3.0
- CUDA (container): 12.2
- CUDA (host): 12.6
- TensorRT (container): 8.6.2.3-1+cuda12.2 (as shown by dpkg -l | grep TensorRT: ii tensorrt 8.6.2.3-1+cuda12.2 arm64 Meta package for TensorRT)
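A side note on the versions above: a TensorRT engine is tied to the TensorRT build that created it, so a container whose TensorRT differs from the host's is worth ruling out as a cause. A minimal Python sketch of that check; the container version is taken from the dpkg output above, while the host version is a placeholder assumption (you would read it from the trtexec banner on the host):

```python
# Sketch: flag a container/host TensorRT major-version mismatch before building.
def major(version: str) -> int:
    """Return the major component of a dotted version string, e.g. '8.6.2' -> 8."""
    return int(version.split(".")[0])

container_trt = "8.6.2"   # from `dpkg -l | grep TensorRT` inside the container
host_trt = "10.3.0"       # assumed: read this from trtexec's banner on the host

if major(container_trt) != major(host_trt):
    print("TensorRT major versions differ; build the engine with the same "
          "TensorRT that will run it.")
```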
Thank you in advance!
Hi,
We tested the model with the TensorRT 10.3 that ships with JetPack 6.2, and it runs correctly.
$ /usr/src/tensorrt/bin/trtexec --onnx=pn_robotics_v03_ph0_noreg.onnx --saveEngine=amr_peoplenet.engine --warmUp=1000
&&&& RUNNING TensorRT.trtexec [TensorRT v100300] # /usr/src/tensorrt/bin/trtexec --onnx=pn_robotics_v03_ph0_noreg.onnx --saveEngine=amr_peoplenet.engine --warmUp=1000
...
&&&& PASSED TensorRT.trtexec [TensorRT v100300] # /usr/src/tensorrt/bin/trtexec --onnx=pn_robotics_v03_ph0_noreg.onnx --saveEngine=amr_peoplenet.engine --warmUp=1000
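As an aside, the v100300 token in the banner encodes the version: TensorRT 10.x builds use two digits each for major, minor, and patch, so v100300 is 10.3.0. A small illustrative decoder (the helper name is ours, not part of trtexec):

```python
import re

def decode_trt_version(banner: str) -> str:
    """Decode the six-digit version token in a TensorRT 10.x trtexec banner.

    E.g. '[TensorRT v100300]' -> '10.3.0'. Older 8.x banners use a different
    (shorter) token, which this sketch deliberately does not handle.
    """
    m = re.search(r"\[TensorRT v(\d{6})\]", banner)
    if m is None:
        raise ValueError("no six-digit TensorRT version token found")
    d = m.group(1)
    # Two digits each for major, minor, patch: '10' '03' '00'.
    return f"{int(d[0:2])}.{int(d[2:4])}.{int(d[4:6])}"

print(decode_trt_version("&&&& PASSED TensorRT.trtexec [TensorRT v100300]"))
```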
Thanks.
Is TensorRT 10.3 on the host or inside the container?
Hi
We tested it on Jetson natively (outside the container).
Thanks.
This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.