jetson-inference --model=peoplenet, no engine file

I followed the linked guide on Using TAO Detection Models. The hardware is a Xavier NX, JetPack 5.1.2.

Error logs:

What I have under the networks folder:

Looks like something is wrong with the peoplenet model. I can run the SSD-Mobilenet-v2 and SSD-Inception-v2 models successfully.

Hi @zt_minto, can you please provide the command-line that you ran to start, along with the full console log (not a screenshot)? Thanks!


The command is: ./ --model=peoplenet pedestrians.mp4 pedestrians_peoplenet.mp4
The full log is attached. pedestrians.mp4 is in the same folder as
log.txt (7.5 KB)

Hi @zt_minto, I just tried this on JetPack 5.1.2 with the same video and model, and was able to run it without issue - twice, actually (once for the first run, when it generates the TRT engine, and again when it just loads the already-serialized TRT engine).

I would recommend deleting your jetson-inference/data/networks/peoplenet_deployable_quantized_v2.6.1 folder and trying again, perhaps there is something corrupted with your model.
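For reference, a minimal sketch of that cleanup, assuming the default checkout location under your home directory (adjust the path if your copy of jetson-inference lives elsewhere):

```shell
# Assumption: jetson-inference is checked out at ~/jetson-inference,
# the default location used in the project docs.
MODEL_DIR="$HOME/jetson-inference/data/networks/peoplenet_deployable_quantized_v2.6.1"

# Remove the possibly-corrupted model folder, including any serialized
# TensorRT .engine cache, so the next run re-downloads and rebuilds it.
rm -rf "$MODEL_DIR"
echo "removed $MODEL_DIR (it will be re-fetched on the next run)"
```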

Hi Dusty_nv,
Thanks for your help. The first time I ran --model=peoplenet, it downloaded peoplenet_deployable_quantized_v2.6.1 automatically, but not all of the files downloaded successfully. The ones that succeeded were resnet34_peoplenet_int8.etlt, resnet34_peoplenet_int8.txt, and labels.txt; the one that failed was colors.txt. Then I downloaded it from
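One way to sanity-check the download is a quick file-existence scan. This is only a sketch: the folder path assumes the default jetson-inference layout, and the file list is just the four files named in this thread.

```python
# Sketch: verify the PeopleNet download completed.
# Assumptions: default jetson-inference data layout; the expected file
# names are the four files discussed in this thread.
import os

EXPECTED = [
    "resnet34_peoplenet_int8.etlt",
    "resnet34_peoplenet_int8.txt",
    "labels.txt",
    "colors.txt",
]

def missing_files(model_dir):
    """Return the expected model files that are absent from model_dir."""
    return [f for f in EXPECTED if not os.path.isfile(os.path.join(model_dir, f))]

if __name__ == "__main__":
    model_dir = os.path.expanduser(
        "~/jetson-inference/data/networks/peoplenet_deployable_quantized_v2.6.1")
    print("missing:", missing_files(model_dir))
```

If this prints any names, the download was incomplete and the folder should be deleted and re-fetched.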

Could you paste the working peoplenet_deployable_quantized_v2.6.1 folder here? Thanks!

Yes, here it is:

Hi Dusty_nv,
I just extracted the file:
tar -zxvf peoplenet_deployable_quantized_v2.6.1.tar.gz

Then I ran
./ --model=peoplenet pedestrians.mp4 pedestrians_peoplenet.mp4
The error still exists, as below:
[TRT] completed loading NVIDIA plugins.
[TRT] detected model format - engine (extension '.engine')
[TRT] loading network plan from engine cache...
[TRT] failed to load engine cache from
[TRT] failed to load
[TRT] detectNet -- failed to initialize.
Traceback (most recent call last):
File "./", line 53, in
net = detectNet(, sys.argv, args.threshold)
Exception: jetson.inference -- detectNet failed to load network

I can still run the default model successfully:
./ pedestrians.mp4 pedestrians_peoplenet.mp4

Hmm, that is curious, because I can run the exact same thing here without issue. What happens if you run it using detectnet (the C++ version) instead of

Hi dusty_nv,
the C++ detectnet has a similar error, as attached.
note.txt (7.4 KB)

The log has a TensorRT error: [TRT] Could not register plugin creator - ::FlattenConcat_TRT version 1
Could that be the reason?

Adding the correct logs from the default model for comparison.
note2.txt (23.0 KB)

After entering the command, the mp4 video detection shows up after several seconds.

Hi @zt_minto, I'm sorry, unfortunately I'm unable to determine the cause of your issue, even after running the same commands on the same version of JetPack (which, strangely, both run fine here). I'm wondering if the version of the jetson-inference code you are running is older; are you able to update it to the latest and recompile? Or perhaps use the jetson-inference docker container to check it. Sorry again about that!
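A rough sketch of that container check, assuming the standard checkout location (docker/run.sh is the launcher script the jetson-inference repo ships; the environment-variable override is just for convenience):

```shell
# Assumption: the repo lives at ~/jetson-inference (override with
# JETSON_INFERENCE_REPO). The guard keeps this harmless on machines
# without a checkout.
REPO="${JETSON_INFERENCE_REPO:-$HOME/jetson-inference}"
if [ -d "$REPO/docker" ]; then
    cd "$REPO"
    git pull          # update to the latest jetson-inference source first
    docker/run.sh     # launch the prebuilt container, then re-run detectnet inside it
else
    echo "no jetson-inference checkout at $REPO"
fi
```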

Hi dusty_nv,
Thanks for your experiments, and sorry for the late response.
I tested with a new Orin NX module, but the result is the same.
I also tried it with Docker and hit the same error. Attaching the log.
log_docker.txt (13.9 KB)