Pose container demo issue

I am trying to run the Pose demo (https://ngc.nvidia.com/catalog/containers/nvidia:jetson-pose), but I get the error log below.
Can you kindly let me know what I am missing?
I am on a Jetson NX with JetPack 4.4 (L4T 32.4.3).
Thank you!

sudo docker run --runtime nvidia -it --rm --network host -e DISPLAY=$DISPLAY -v /tmp/.X11-unix/:/tmp/.X11-unix nvcr.io/nvidia/jetson-pose:r32.4.2 python3 run_pose_pipeline.py /videos/pose_video.mp4 --loop

(python3:1): GStreamer-CRITICAL **: 12:18:00.483: gst_caps_is_empty: assertion 'GST_IS_CAPS (caps)' failed
(python3:1): GStreamer-CRITICAL **: 12:18:00.483: gst_caps_truncate: assertion 'GST_IS_CAPS (caps)' failed
(python3:1): GStreamer-CRITICAL **: 12:18:00.483: gst_caps_fixate: assertion 'GST_IS_CAPS (caps)' failed
(python3:1): GStreamer-CRITICAL **: 12:18:00.483: gst_caps_get_structure: assertion 'GST_IS_CAPS (caps)' failed
(python3:1): GStreamer-CRITICAL **: 12:18:00.483: gst_structure_get_string: assertion 'structure != NULL' failed
(python3:1): GStreamer-CRITICAL **: 12:18:00.483: gst_mini_object_unref: assertion 'mini_object != NULL' failed
NvMMLiteOpen : Block : BlockType = 261
NVMEDIA: Reading vendor.tegra.display-size : status: 6
NvMMLiteBlockCreate : Block : BlockType = 261
Allocating new output: 960x544 (x 11), ThumbnailMode = 0
OPENMAX: HandleNewStreamFormat: 3605: Send OMX_EventPortSettingsChanged: nFrameWidth = 960, nFrameHeight = 540
[ WARN:0] global /home/nvidia/host/build_opencv/nv_opencv/modules/videoio/src/cap_gstreamer.cpp (896) open OpenCV | GStreamer warning: unable to query duration of stream
[ WARN:0] global /home/nvidia/host/build_opencv/nv_opencv/modules/videoio/src/cap_gstreamer.cpp (933) open OpenCV | GStreamer warning: Cannot query video position: status=1, value=5, duration=-1

Using winsys: x11
[TensorRT] ERROR: coreReadArchive.cpp (38) - Serialization Error in verifyHeader: 0 (Version tag does not match)
[TensorRT] ERROR: INVALID_STATE: std::exception
[TensorRT] ERROR: INVALID_CONFIG: Deserialize the cuda engine failed.
Traceback (most recent call last):
  File "run_pose_pipeline.py", line 37, in <module>
    engine = PoseEngine(ENGINE_PATH)
  File "/pose/pose.py", line 86, in __init__
  File "/usr/local/lib/python3.6/dist-packages/torch/nn/modules/module.py", line 830, in load_state_dict
  File "/usr/local/lib/python3.6/dist-packages/torch/nn/modules/module.py", line 825, in load
    state_dict, prefix, local_metadata, True, missing_keys, unexpected_keys, error_msgs)
  File "/usr/local/lib/python3.6/dist-packages/torch2trt/torch2trt.py", line 309, in _load_from_state_dict
    self.context = self.engine.create_execution_context()
AttributeError: 'NoneType' object has no attribute 'create_execution_context'

Hi @yandssiegel, this container has not yet been updated for L4T R32.4.3 - note the tag of the jetson-pose container is r32.4.2. Please run it on the JetPack 4.4 Developer Preview for now (L4T R32.4.2).
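Since the container tag encodes the L4T release it was built against, a quick sanity check before pulling is to compare the host's L4T version (reported in /etc/nv_tegra_release on Jetson) with the tag. A minimal sketch, using a sample release line so it runs anywhere; on a real device you would read the file itself:

```shell
# Sketch: detect a mismatch between the host L4T release and a container tag.
# On a Jetson you would read /etc/nv_tegra_release; a sample line is used here.
sample="# R32 (release), REVISION: 4.3, GCID: 16062356, BOARD: t186ref"

# Extract "32.4.3" from the release line.
host=$(echo "$sample" | sed -n 's/^# R\([0-9]*\) (release), REVISION: \([0-9.]*\),.*/\1.\2/p')
echo "host L4T: r$host"

tag="r32.4.2"   # tag of the jetson-pose container from this thread
if [ "r$host" != "$tag" ]; then
  echo "mismatch: container built for $tag, host runs r$host"
fi
```

With the versions from this thread it reports a mismatch, which is exactly the situation that produced the TensorRT "Version tag does not match" error above.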

Many thanks for your answer!
Sorry, I did not notice the tag…
I will wait for the update.

Hello… I'm from the cloud-computing world and ran into this issue. I accept the answer, but can someone explain why the host version causes the container to fail? This typically does not happen in cloud-computing environments. I would appreciate an explanation.


Hi @iqbaliia, it's because currently the NVIDIA drivers and CUDA toolkit are mounted dynamically into the container at runtime from the host device. This is done to reduce container size and memory overhead, since embedded Jetson devices typically have fewer resources than you might see in a cloud environment.

However, in the future we'll be supporting containers where CUDA, etc. is installed directly into the container (technically you could do this today by making your own base image), thereby insulating them from the underlying JetPack version on the host device.
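To make the "mounted dynamically" part concrete: on JetPack, the nvidia container runtime decides what to bind from the host by reading CSV files under /etc/nvidia-container-runtime/host-files-for-container.d/ (cuda.csv, l4t.csv, and so on), where each line names a host file, directory, or device to map into the container. A rough sketch of parsing one such entry; the sample line is illustrative, not copied from a real CSV:

```shell
# Each CSV entry has the form "type, path", where type is dev, dir, lib, or sym.
# The sample entry below is made up for illustration.
entry="lib, /usr/lib/aarch64-linux-gnu/tegra/libcuda.so"

kind=$(echo "$entry" | cut -d',' -f1 | tr -d ' ')
path=$(echo "$entry" | cut -d',' -f2 | tr -d ' ')
echo "runtime would bind-mount $kind $path from the host into the container"
```

Because those paths resolve on the host, the libraries the container sees always match the host's JetPack, which is why a container built against a different L4T release can fail at runtime.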


Given the above, which containers from the NGC catalog can I use on the Jetson NX? For example, can I use this one?

Also, will all containers be optimised for the GPU?

Thank you

@yandssiegel, if a container doesn't explicitly list support for Jetson/L4T, then it is probably built for PC/server and not for Jetson.

The easiest way to find the containers that support Jetson is to search for "L4T":


For those that do support the NX, check the tags to confirm that they were built for the same L4T version that you are running.

Ok understood, thank you