I’m trying to run smart recording with a software encoder on a Jetson Xavier NX. The code is modified from deepstream-testsr, but it reads a source file as input instead of an RTSP stream. According to the error below, a queue element couldn’t be linked.
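Roughly, the file-input path I'm using corresponds to something like the following gst-launch-1.0 sketch (the element names and properties here are my own approximation of the change, not copied from the actual code):

```shell
# Approximate gst-launch-1.0 equivalent of the modified input path:
# filesrc + h264parse + nvv4l2decoder replaces the RTSP source of
# deepstream-testsr; the rest of the pipeline is left as in the sample.
gst-launch-1.0 \
  filesrc location=/opt/nvidia/deepstream/deepstream/samples/streams/sample_720p.h264 \
  ! h264parse ! nvv4l2decoder \
  ! m.sink_0 nvstreammux name=m batch-size=1 width=1280 height=720 \
  ! nvinfer config-file-path=dstestsr_pgie_config.txt \
  ! nvvideoconvert ! nvdsosd ! fakesink
```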
aaeon@aaeon-desktop:/opt/nvidia/deepstream/deepstream/sources/apps/sample_apps/deepstream-testsr$ sudo ./deepstream-testsr-app /opt/nvidia/deepstream/deepstream/samples/streams/sample_720p.h264
Now playing: /opt/nvidia/deepstream/deepstream/samples/streams/sample_720p.h264
Opening in BLOCKING MODE
0:00:04.886733366 19092 0x558d235660 INFO nvinfer gstnvinfer.cpp:638:gst_nvinfer_logger:<primary-nvinference-engine> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1900> [UID = 1]: deserialized trt engine from :/opt/nvidia/deepstream/deepstream-6.0/samples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine
INFO: [Implicit Engine Info]: layers num: 3
0 INPUT kFLOAT input_1 3x368x640
1 OUTPUT kFLOAT conv2d_bbox 16x23x40
2 OUTPUT kFLOAT conv2d_cov/Sigmoid 4x23x40
0:00:04.886976409 19092 0x558d235660 INFO nvinfer gstnvinfer.cpp:638:gst_nvinfer_logger:<primary-nvinference-engine> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:2004> [UID = 1]: Use deserialized engine model: /opt/nvidia/deepstream/deepstream-6.0/samples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine
0:00:04.896856800 19092 0x558d235660 INFO nvinfer gstnvinfer_impl.cpp:313:notifyLoadModelStatus:<primary-nvinference-engine> [UID 1]: Load new model:dstestsr_pgie_config.txt sucessfully
Running...
NvMMLiteOpen : Block : BlockType = 261
NVMEDIA: Reading vendor.tegra.display-size : status: 6
NvMMLiteBlockCreate : Block : BlockType = 261
Recording started..
ERROR from element queue-post-osd: Internal data stream error.
Error details: gstqueue.c(988): gst_queue_handle_sink_event (): /GstPipeline:dstest-sr-pipeline/GstQueue:queue-post-osd:
streaming stopped, reason not-linked (-1)
^C
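To see which pad downstream of queue-post-osd is left unlinked, one option (a standard GStreamer debugging facility, not something specific to this app) is to dump the pipeline graph:

```shell
# GST_DEBUG_DUMP_DOT_DIR makes GStreamer write a .dot graph of the pipeline
# on every state change; unlinked pads show up as dangling edges.
# The variable is passed on the sudo command line so sudo does not drop it.
sudo GST_DEBUG_DUMP_DOT_DIR=/tmp ./deepstream-testsr-app \
    /opt/nvidia/deepstream/deepstream/samples/streams/sample_720p.h264
# Render the graph written when the pipeline reached PLAYING:
dot -Tpng /tmp/*PLAYING*.dot -o /tmp/pipeline.png
```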
I tried running on x86_64 in the environment below and it worked fine.
Architecture: x86_64
GPU: NVIDIA GeForce GTX 1650 Ti with Max-Q Design
NVIDIA GPU Driver: Driver Version: 495.29.05
DeepStream Version: 6.0 (running on docker image nvcr.io/nvidia/deepstream:6.0-devel)
TensorRT Version: v8001
I tried to reproduce this issue on a Jetson Nano running in Docker (I couldn’t use Docker on my Jetson Xavier NX due to a storage issue), and I don’t see any errors there. I suspect it is an environment issue. What do you think? In that case, I might need to reflash the device.
See the answers to your questions below.
I don’t see any additional logs after setting GST_DEBUG=4:
aaeon@aaeon-desktop:/opt/nvidia/deepstream/deepstream/sources/apps/sample_apps/deepstream-testsr$ export GST_DEBUG=4
aaeon@aaeon-desktop:/opt/nvidia/deepstream/deepstream/sources/apps/sample_apps/deepstream-testsr$ sudo ./deepstream-testsr-app /opt/nvidia/deepstream/deepstream/samples/streams/sample_720p.h264
Now playing: /opt/nvidia/deepstream/deepstream/samples/streams/sample_720p.h264
Opening in BLOCKING MODE
0:00:04.833428707 5451 0x559ffd7660 INFO nvinfer gstnvinfer.cpp:638:gst_nvinfer_logger:<primary-nvinference-engine> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1900> [UID = 1]: deserialized trt engine from :/opt/nvidia/deepstream/deepstream-6.0/samples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine
INFO: [Implicit Engine Info]: layers num: 3
0 INPUT kFLOAT input_1 3x368x640
1 OUTPUT kFLOAT conv2d_bbox 16x23x40
2 OUTPUT kFLOAT conv2d_cov/Sigmoid 4x23x40
0:00:04.833680869 5451 0x559ffd7660 INFO nvinfer gstnvinfer.cpp:638:gst_nvinfer_logger:<primary-nvinference-engine> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:2004> [UID = 1]: Use deserialized engine model: /opt/nvidia/deepstream/deepstream-6.0/samples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine
0:00:04.842195231 5451 0x559ffd7660 INFO nvinfer gstnvinfer_impl.cpp:313:notifyLoadModelStatus:<primary-nvinference-engine> [UID 1]: Load new model:dstestsr_pgie_config.txt sucessfully
Running...
NvMMLiteOpen : Block : BlockType = 261
NVMEDIA: Reading vendor.tegra.display-size : status: 6
NvMMLiteBlockCreate : Block : BlockType = 261
Recording started..
ERROR from element queue-post-osd: Internal data stream error.
Error details: gstqueue.c(988): gst_queue_handle_sink_event (): /GstPipeline:dstest-sr-pipeline/GstQueue:queue-post-osd:
streaming stopped, reason not-linked (-1)
^C
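One possible reason the export produced no extra output (my guess, not confirmed): sudo resets the environment by default (env_reset in sudoers), so a GST_DEBUG exported in the user shell never reaches the app. Passing it on the sudo command line avoids that:

```shell
# Variables exported before `sudo ./app` are dropped by sudo's env_reset.
# Pass GST_DEBUG directly and capture stderr, where GStreamer logs go:
sudo GST_DEBUG=4 ./deepstream-testsr-app \
    /opt/nvidia/deepstream/deepstream/samples/streams/sample_720p.h264 \
    2> gst_debug.log
```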
I got the following error.
aaeon@aaeon-desktop:/opt/nvidia/deepstream/deepstream/sources/apps/sample_apps/deepstream-testsr$ sudo ./deepstream-testsr-app rtsp://xxx
Now playing: rtsp://xxx
Using winsys: x11
Opening in BLOCKING MODE
Opening in BLOCKING MODE
0:00:04.998157412 24328 0x558916a470 INFO nvinfer gstnvinfer.cpp:638:gst_nvinfer_logger:<primary-nvinference-engine> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1900> [UID = 1]: deserialized trt engine from :/opt/nvidia/deepstream/deepstream-6.0/samples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine
INFO: [Implicit Engine Info]: layers num: 3
0 INPUT kFLOAT input_1 3x368x640
1 OUTPUT kFLOAT conv2d_bbox 16x23x40
2 OUTPUT kFLOAT conv2d_cov/Sigmoid 4x23x40
0:00:04.998375398 24328 0x558916a470 INFO nvinfer gstnvinfer.cpp:638:gst_nvinfer_logger:<primary-nvinference-engine> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:2004> [UID = 1]: Use deserialized engine model: /opt/nvidia/deepstream/deepstream-6.0/samples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine
0:00:05.008059209 24328 0x558916a470 INFO nvinfer gstnvinfer_impl.cpp:313:notifyLoadModelStatus:<primary-nvinference-engine> [UID 1]: Load new model:dstestsr_pgie_config.txt sucessfully
Running...
In cb_newpad
NvMMLiteOpen : Block : BlockType = 261
NVMEDIA: Reading vendor.tegra.display-size : status: 6
NvMMLiteBlockCreate : Block : BlockType = 261
NvMMLiteOpen : Block : BlockType = 4
===== NVMEDIA: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 4
H264: Profile = 66, Level = 0
0:00:05.439852994 24328 0x5589789400 WARN nvinfer gstnvinfer.cpp:2288:gst_nvinfer_output_loop:<primary-nvinference-engine> error: Internal data stream error.
0:00:05.439918051 24328 0x5589789400 WARN nvinfer gstnvinfer.cpp:2288:gst_nvinfer_output_loop:<primary-nvinference-engine> error: streaming stopped, reason error (-5)
ERROR from element primary-nvinference-engine: Internal data stream error.
Error details: /dvs/git/dirty/git-master_linux/deepstream/sdk/src/gst-plugins/gst-nvinfer/gstnvinfer.cpp(2288): gst_nvinfer_output_loop (): /GstPipeline:dstest-sr-pipeline/GstNvInfer:primary-nvinference-engine:
streaming stopped, reason error (-5)
Returned, stopping playback
NVMEDIA_ENC: bBlitMode is set to TRUE
Deleting pipeline
The log doesn’t contain enough information. It seems to be an environment issue, because there should be a lot of output after “export GST_DEBUG=4”. You can redirect it to a file, like this:
./deepstream-testsr-app rtsp://xx
0:00:00.000126055 769573 0xaaaad7350c00 INFO GST_INIT gst.c:586:init_pre: Initializing GStreamer Core Library version 1.16.3
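For example, a full capture and a quick search for the failing element could look like this (the log file name is arbitrary):

```shell
# Capture all output (GStreamer debug logs go to stderr) into a file:
GST_DEBUG=4 ./deepstream-testsr-app rtsp://xx > run.log 2>&1
# Then search the capture for the element that reported not-linked:
grep -n "queue-post-osd\|not-linked" run.log
```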