DeepStream samples fail in fresh Docker container on CentOS 7.9 host system: Device is in streaming mode

Dear Community,

I’ve been struggling with this problem for a while now and I hope you can help me:

I have two setups that are identical except for the host OS; the samples work on setup 1, but not on setup 2.

FIRST SETUP (WORKING)
AWS g4dn.xlarge machine with a CentOS 8 Stream AMI. I install the NVIDIA drivers, Docker, and nvidia-docker.

SECOND SETUP (BREAKING)
AWS g4dn.xlarge machine with a CentOS 7.9 AMI. I install the NVIDIA drivers, Docker, and nvidia-docker.

Now I run on both machines:
$ docker run -it --gpus all nvcr.io/nvidia/deepstream:6.1-samples bash
I open samples/configs/deepstream-app/source30_1080p_dec_infer-resnet_tiled_display_int8.txt
and change the sink so that the output goes to a file (out.mp4), like this:

[sink0]
enable=0
...

[sink1]
enable=1
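
For context, a typical file-output [sink1] group in that sample looks roughly like this (the values beyond enable and output-file are illustrative defaults from memory, not anything I changed):

[sink1]
enable=1
# 1=FakeSink, 2=EglSink, 3=File, 4=RTSPStreaming
type=3
# 1=mp4, 2=mkv
container=1
# 1=h264, 2=h265
codec=1
sync=0
bitrate=2000000
output-file=out.mp4
source-id=0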

Now I’m running deepstream-app -c samples/configs/deepstream-app/source30_1080p_dec_infer-resnet_tiled_display_int8.txt

This works perfectly on setup 1 with CentOS 8 Stream, but not on setup 2 with CentOS 7.9.

CentOS 7.9 scrollback:

0:00:02.207629978 632 0x55aeedf26160 INFO nvinfer gstnvinfer.cpp:646:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1900> [UID = 1]: deserialized trt engine from :/opt/nvidia/deepstream/deepstream-6.1/samples/configs/deepstream-app/…/…/models/Primary_Detector/resnet10.caffemodel_b30_gpu0_int8.engine
INFO: …/nvdsinfer/nvdsinfer_model_builder.cpp:610 [Implicit Engine Info]: layers num: 3
0 INPUT kFLOAT input_1 3x368x640
1 OUTPUT kFLOAT conv2d_bbox 16x23x40
2 OUTPUT kFLOAT conv2d_cov/Sigmoid 4x23x40

0:00:02.230590450 632 0x55aeedf26160 INFO nvinfer gstnvinfer.cpp:646:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:2003> [UID = 1]: Use deserialized engine model: /opt/nvidia/deepstream/deepstream-6.1/samples/configs/deepstream-app/…/…/models/Primary_Detector/resnet10.caffemodel_b30_gpu0_int8.engine
0:00:02.234195125 632 0x55aeedf26160 INFO nvinfer gstnvinfer_impl.cpp:328:notifyLoadModelStatus:<primary_gie> [UID 1]: Load new model:/opt/nvidia/deepstream/deepstream-6.1/samples/configs/deepstream-app/config_infer_primary.txt sucessfully

Runtime commands:
    h: Print this help
    q: Quit
    p: Pause
    r: Resume

NOTE: To expand a source in the 2D tiled display and view object details, left-click on the source.
To go back to the tiled display, right-click anywhere on the window.

** INFO: <bus_callback:194>: Pipeline ready

ERROR from sink_sub_bin_encoder1: Could not get/set settings from/on resource.
Debug info: gstv4l2object.c(3507): gst_v4l2_object_set_format_full (): /GstPipeline:pipeline/GstBin:processing_bin_0/GstBin:sink_bin/GstBin:sink_sub_bin1/nvv4l2h264enc:sink_sub_bin_encoder1:
Device is in streaming mode
[... the same ERROR from sink_sub_bin_encoder1 / “Device is in streaming mode” block repeats many more times ...]
Quitting
ERROR from sink_sub_bin_encoder1: Could not get/set settings from/on resource.
Debug info: gstv4l2object.c(3507): gst_v4l2_object_set_format_full (): /GstPipeline:pipeline/GstBin:processing_bin_0/GstBin:sink_bin/GstBin:sink_sub_bin1/nvv4l2h264enc:sink_sub_bin_encoder1:
Device is in streaming mode
[... the same error block repeats a few more times ...]
App run failed

nvidia-smi on setup 1 (CentOS 8 Stream), just the top line:

Host
NVIDIA-SMI 515.48.07 Driver Version: 515.48.07 CUDA Version: 11.7

Container
NVIDIA-SMI 515.48.07 Driver Version: 515.48.07 CUDA Version: 11.7

nvidia-smi on setup 2 (CentOS 7.9), just the top line:
Host
NVIDIA-SMI 520.61.05 Driver Version: 520.61.05 CUDA Version: 11.8
Container
NVIDIA-SMI 520.61.05 Driver Version: 520.61.05 CUDA Version: 11.8

The host should meet the platform and OS compatibility requirements listed here: Quickstart Guide — DeepStream 6.1.1 Release documentation (nvidia.com)

So, to clarify: does “OS” in this compatibility matrix mean the host OS? So I can’t run DeepStream inside an Ubuntu 20.04 container on a CentOS 7.9 host?

Update:

RHEL 8 works as the host OS even though it’s not listed as supported, at least in my tests. Also, I just found out that I can fix the “Device is in streaming mode” error on CentOS 7.9 by switching from the hardware to the software encoder with this change in the example config:

[sink1]
...
# encoder type (0: hardware, 1: software)
enc-type=1

What keeps bugging me is that this only affects the encoder. So apparently the decoder works perfectly in hardware, but the encoder doesn’t? Am I correct in this assumption?
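
One way to isolate this further (my own sketch, nothing official) would be to run the hardware encoder in a standalone GStreamer pipeline outside of deepstream-app:

# Hypothetical standalone test of the NVIDIA hardware H.264 encoder (dGPU path).
# If this also fails with "Device is in streaming mode", the problem sits in
# nvv4l2h264enc / the driver stack rather than in deepstream-app itself.
gst-launch-1.0 -e videotestsrc num-buffers=300 ! \
  nvvideoconvert ! 'video/x-raw(memory:NVMM),format=I420' ! \
  nvv4l2h264enc bitrate=2000000 ! h264parse ! qtmux ! \
  filesink location=hw_enc_test.mp4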

Cheers,
Dennis

Can you tell us your GPU type?

It’s an AWS g4dn.xlarge machine, so it’s an NVIDIA T4:

+-----------------------------------------------------------------------------+
| NVIDIA-SMI 520.61.05    Driver Version: 520.61.05    CUDA Version: 11.8     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|===============================+======================+======================|
|   0  Tesla T4            Off  | 00000000:00:1E.0 Off |                    0 |
| N/A   40C    P0    25W /  70W |      2MiB / 15360MiB |      5%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+

+-----------------------------------------------------------------------------+
| Processes:                                                                  |
|  GPU   GI   CI        PID   Type   Process name                  GPU Memory |
|        ID   ID                                                   Usage      |
|=============================================================================|
|  No running processes found                                                 |
+-----------------------------------------------------------------------------+

Can you check whether the video codec libraries exist in the docker container? If they are there, you can try the following commands to enable the codec.

ln -s /usr/lib/x86_64-linux-gnu/libnvcuvid.so.1 /usr/lib/x86_64-linux-gnu/libnvcuvid.so
ln -s /usr/lib/x86_64-linux-gnu/libnvidia-encode.so.1 /usr/lib/x86_64-linux-gnu/libnvidia-encode.so
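
For example, one simple way to check whether the driver codec libraries are mapped into the container (just a suggestion, not the only way):

ls -l /usr/lib/x86_64-linux-gnu/libnvcuvid.so* \
      /usr/lib/x86_64-linux-gnu/libnvidia-encode.so*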

How do I check whether the video codec exists? Do you mean like this?

root@82c1b0b46f40:/opt/nvidia/deepstream/deepstream-6.1# gst-inspect-1.0 | grep 264
nvvideo4linux2:  nvv4l2h264enc: V4L2 H.264 Encoder
x264:  x264enc: x264enc
videoparsersbad:  h264parse: H.264 parser
uvch264:  uvch264mjpgdemux: UVC H264 MJPG Demuxer
uvch264:  uvch264src: UVC H264 Source
typefindfunctions: video/x-h264: h264, x264, 264
rtp:  rtph264depay: RTP H264 depayloader
rtp:  rtph264pay: RTP H264 payloader
libav:  avenc_h264_omx: libav OpenMAX IL H.264 video encoder encoder
libav:  avdec_h264: libav H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10 decoder
libav:  avmux_ipod: libav iPod H.264 MP4 (MPEG-4 Part 14) muxer
root@82c1b0b46f40:/opt/nvidia/deepstream/deepstream-6.1#
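
As an additional check on my side (my own idea, not something suggested above), I can also inspect the encoder element directly to confirm its properties load without errors:

gst-inspect-1.0 nvv4l2h264enc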

Both files are already linked, see:

root@82c1b0b46f40:/usr/lib/x86_64-linux-gnu# file /usr/lib/x86_64-linux-gnu/libnvcuvid.so
/usr/lib/x86_64-linux-gnu/libnvcuvid.so: symbolic link to /usr/lib/x86_64-linux-gnu/libnvcuvid.so.1
root@82c1b0b46f40:/usr/lib/x86_64-linux-gnu# file /usr/lib/x86_64-linux-gnu/libnvcuvid.so.1
/usr/lib/x86_64-linux-gnu/libnvcuvid.so.1: symbolic link to libnvcuvid.so.520.61.05
root@82c1b0b46f40:/usr/lib/x86_64-linux-gnu# file libnvcuvid.so.520.61.05
libnvcuvid.so.520.61.05: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, BuildID[sha1]=c050b68160f4df1a1bb6d1132c79cdbd25173bde, stripped

and

root@82c1b0b46f40:/usr/lib/x86_64-linux-gnu# file /usr/lib/x86_64-linux-gnu/libnvidia-encode.so.1
/usr/lib/x86_64-linux-gnu/libnvidia-encode.so.1: symbolic link to libnvidia-encode.so.520.61.05
root@82c1b0b46f40:/usr/lib/x86_64-linux-gnu# file libnvidia-encode.so.520.61.05
libnvidia-encode.so.520.61.05: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, BuildID[sha1]=bc1d39c20e018caae745a9b0ed27d62cec6c25b0, stripped

This does not change the issue.

Cheers

Just to keep this thread complete: the NVIDIA Container Toolkit supports a CentOS 7 host system and can run an Ubuntu 20.04 image inside the container. See: Installation Guide — NVIDIA Cloud Native Technologies documentation

Dear @Fiona.Chen,
any news on this?

Cheers,
Dennis

Which DeepStream version are you using? If it is 6.1.1, the CUDA version must be 11.7 and the driver version should be 515.65.01.

https://docs.nvidia.com/metropolis/deepstream/dev-guide/text/DS_Quickstart.html#platform-and-os-compatibility

Which CUDA version are you talking about: the one inside the container or the CUDA version on the host system?

Cheers

The host CUDA version.

Dear @Fiona.Chen,
thanks for your support.
In the failing test setup, the host CUDA driver version is 11.8 and the CUDA runtime version is 10.0. I can try to downgrade to 11.7, but to be quite honest, I don’t think a single minor version difference breaks the nvidia-docker runtime, which brings up 11.7 inside the container anyway.
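
For completeness, this is how I compare the driver version on the host and inside the container (just a quick check; the image tag is the same one I used above):

# On the CentOS 7.9 host
nvidia-smi --query-gpu=driver_version --format=csv,noheader

# Inside the DeepStream container
docker run --rm --gpus all nvcr.io/nvidia/deepstream:6.1-samples \
  nvidia-smi --query-gpu=driver_version --format=csv,noheader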

This is extremely easy to reproduce: just get a g4dn.xlarge AWS machine with a CentOS 7.9 AMI, install the respective NVIDIA drivers, Docker and nvidia-docker, and run the sample yourself. Maybe a developer should have a look into that.

I also just tried to run it with a 10.x driver and runtime version on the host; the container then automatically falls back to a built-in compatibility mode so that the CUDA version inside the container is 11.7, and this leads to the same error in the hardware encoder. Hardware decoding nevertheless works in DeepStream 6.1 on a 10.x host system thanks to this compatibility mode. What has this compatibility mode been established for, if not such a use case? What I’m trying to say is: I think this is not a host-system issue, but a bug.

There has been no update from you for a while, so we are assuming this is no longer an issue and are closing this topic. If you need further support, please open a new one.
Thanks

I’ve mentioned the compatibility requirements. Some of the libraries in the DeepStream docker image are mapped from the host rather than installed inside the image, so it is important to make sure the host is compatible too.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.