Running PeopleNet on x86 with Jetson Inference Container

Hey, I have a project where I want to run automatic labeling of people in images. I have it running on my Jetson device, using the pre-trained PeopleNet model. Now I also want to run it on my regular Linux PC with a GTX 1080 Ti. I did build the Docker container, but it cannot load PeopleNet:
jetson.inference -- detectNet loading build-in network 'peoplenet'
jetson.inference -- detectNet invalid built-in network was requested ('peoplenet')
Traceback (most recent call last):
  File "Automatic_Labeling.py", line 85, in <module>
    results = detect_people_in_image(image)
  File "Automatic_Labeling.py", line 50, in detect_people_in_image
    net = detectNet("peoplenet", threshold=0.01)
Exception: jetson.inference -- detectNet invalid built-in network was requested
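The error means this particular jetson-inference build does not recognize "peoplenet" as a built-in network name. A possible workaround is to point detectNet at the downloaded model files explicitly. The sketch below assumes the newer keyword-argument form of the Python detectNet constructor; the file paths and the DetectNet_v2 tensor names (`input_1`, `output_cov/Sigmoid`, `output_bbox/BiasAdd`) are assumptions to verify against your actual model:

```python
# Sketch: fall back to an explicit ONNX model when the built-in
# "peoplenet" alias is unknown to this jetson-inference build.
# File paths below are assumptions -- adjust them to where you
# placed the downloaded PeopleNet model and labels.

def peoplenet_model_args(model_dir="/models/peoplenet"):
    """Build the keyword arguments detectNet needs to load a
    detection model from explicit files instead of a built-in name."""
    return {
        "model": f"{model_dir}/resnet34_peoplenet.onnx",
        "labels": f"{model_dir}/labels.txt",
        "input_blob": "input_1",               # assumption: typical TAO DetectNet_v2 input name
        "output_cvg": "output_cov/Sigmoid",    # assumption: coverage head name
        "output_bbox": "output_bbox/BiasAdd",  # assumption: bbox head name
    }

def load_people_detector(threshold=0.5):
    from jetson.inference import detectNet
    try:
        # Newer jetson-inference builds know "peoplenet" as a built-in name.
        return detectNet("peoplenet", threshold=threshold)
    except Exception:
        # Older builds: point detectNet at the model files directly.
        return detectNet(threshold=threshold, **peoplenet_model_args())
```

If loading by explicit paths also fails, the container's jetson-inference version may predate TAO model support entirely, which would explain the suggestion to try the x86 branch.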

I also tried loading PeopleNet (deployable_quantized_onnx_v2.6.2) myself and converting it to a .engine file with trtexec, but then I only got empty bounding boxes and no detections.
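For reference, a trtexec conversion along these lines is what was attempted; the file names are assumptions. Note that an engine built this way still needs the correct preprocessing and DetectNet_v2 output decoding (coverage grid plus bbox regression, then clustering), so empty boxes after a successful build often point at the post-processing rather than the engine itself:

```shell
# Assumed file names -- substitute the actual files from the NGC download.
# Build the TensorRT engine on the machine it will run on (the 1080 Ti box):
trtexec --onnx=resnet34_peoplenet.onnx \
        --saveEngine=peoplenet_fp16.engine \
        --fp16

# Inspect the input/output tensor names and shapes the engine expects;
# mismatched tensor names or skipped preprocessing are common causes
# of "empty bounding boxes":
trtexec --loadEngine=peoplenet_fp16.engine --verbose
```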
Can you support me with this? You can also tell me if there is a better way to approach this project.

Hi,

Could you check out the x86 branch and try it again?

Thanks.

Hi,
so as explained, I am using my own container. I built it with Dusty's amd64 base image (dustynv/jetson-inference:22.06).
I need this because I have to install some company-specific software. If I understand it correctly, it should download PeopleNet by itself when I call it, or am I misunderstanding something? I also don't understand why I should clone the Git repository.

Sorry for the late answer, I was on vacation.

Hi

The models are downloaded by the CMakePreBuild.sh script.
If they don't download automatically, would you mind running it manually?
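Running the script manually might look like the following; the in-container source location is an assumption, so adjust the path to wherever the jetson-inference tree lives in your image:

```shell
# Assumption: the jetson-inference sources are checked out in the
# container, e.g. under /jetson-inference.
cd /jetson-inference

# Run the pre-build script by hand; it fetches the pretrained models
# into data/networks/.
./CMakePreBuild.sh

# Verify the PeopleNet files arrived (the exact directory name may differ):
ls data/networks/
```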

Thanks.
