Deepstream_test1_rtsp_in_rtsp_out.py example not working

• Hardware Platform (GPU) GPU A30
• DeepStream Version 6.4
• TensorRT Version 10.0.0.6
• NVIDIA GPU Driver Version (valid for GPU only) 535.104.12
• Requirement details: run the deepstream_test1_rtsp_in_rtsp_out.py sample application

There is no encoder compatibility for the A30; it has no hardware encoder.

I was wondering if there is any alternative to nvv4l2h264enc to encode the video for the rtppay element.

Can you tell us what kind of use case with video encoding you need to run on A30?

There are lots of GStreamer software encoders available; see Plugins (gstreamer.freedesktop.org). You can modify the code to replace nvv4l2h264enc with any software encoder you like.

Thanks for the answer. My use case is an application that takes RTSP input, runs inference and a tracker (sending the metadata to Kafka for post-analytics), and finally outputs RTSP.
In the example, if I replace the encoder with avdec_h264 from Plugins (gstreamer.freedesktop.org), I get this error:

0:00:06.207401467 611592 0x560c5af58a30 WARN                 nvinfer gstnvinfer.cpp:2406:gst_nvinfer_output_loop:<primary-inference> error: Internal data stream error.
0:00:06.207416276 611592 0x560c5af58a30 WARN                 nvinfer gstnvinfer.cpp:2406:gst_nvinfer_output_loop:<primary-inference> error: streaming stopped, reason not-linked (-1)
Error: gst-stream-error-quark: Internal data stream error. (1): gstnvinfer.cpp(2406): gst_nvinfer_output_loop (): /GstPipeline:pipeline0/GstNvInfer:primary-inference:
streaming stopped, reason not-linked (-1)
Frame Number= 1
Frame Number= 2
Frame Number= 3
Frame Number= 4
Frame Number= 5
Frame Number= 6
 

I’m upgrading to 7.0 anyway; I’d like to know if there is an example that covers my use case.

avdec_h264 is a video decoder, not an encoder. Why did you use this one?

Take “x264enc” as an example; a typical DeepStream inferencing and encoding pipeline could be:

gst-launch-1.0 uridecodebin uri=file:///opt/nvidia/deepstream/deepstream/samples/streams/sample_1080p_h264.mp4 ! mux.sink_0 nvstreammux name=mux batch-size=1 width=1920 height=1080 ! nvinfer config-file-path=/opt/nvidia/deepstream/deepstream/samples/configs/deepstream-app/config_infer_primary.txt ! nvmultistreamtiler columns=1 rows=1 width=1920 height=1080 ! nvdsosd ! nvvideoconvert ! x264enc ! h264parse ! q.video_0 qtmux name=q ! filesink location=test.mp4
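
If it helps, the same pipeline can also be tried from Python with Gst.parse_launch before modifying the sample code; a minimal sketch using the stock DeepStream paths from the command above:

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

# Same pipeline string as the gst-launch-1.0 command above.
pipeline = Gst.parse_launch(
    "uridecodebin uri=file:///opt/nvidia/deepstream/deepstream/samples/streams/sample_1080p_h264.mp4 ! mux.sink_0 "
    "nvstreammux name=mux batch-size=1 width=1920 height=1080 ! "
    "nvinfer config-file-path=/opt/nvidia/deepstream/deepstream/samples/configs/deepstream-app/config_infer_primary.txt ! "
    "nvmultistreamtiler columns=1 rows=1 width=1920 height=1080 ! nvdsosd ! "
    "nvvideoconvert ! x264enc ! h264parse ! q.video_0 qtmux name=q ! filesink location=test.mp4"
)

pipeline.set_state(Gst.State.PLAYING)
bus = pipeline.get_bus()
# Wait until the pipeline finishes or errors out, then shut it down.
bus.timed_pop_filtered(Gst.CLOCK_TIME_NONE, Gst.MessageType.ERROR | Gst.MessageType.EOS)
pipeline.set_state(Gst.State.NULL)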

Sorry for the mistake, I tried a lot of encoders and I don’t know how I ended up posting that decoder.
Anyway, I have the same error. DeepStream version: 7.0

# Make the encoder
if codec == "H264":
    encoder = Gst.ElementFactory.make("x264enc", "encoder")
    print("Creating H264 Encoder")
    h264parser = Gst.ElementFactory.make("h264parse", "h264-parser")
    if not h264parser:
        sys.stderr.write(" Unable to create h264 parser")

nvvidconv_postosd.link(caps)
caps.link(encoder)
encoder.link(h264parser)
h264parser.link(rtppay)
rtppay.link(sink)
gstname= video/x-raw
features= <Gst.CapsFeatures object at 0x7fda6d3e6f80 (GstCapsFeatures at 0x7fd9ac40fa30)>
Frame Number= 0
0:00:52.106217341 3314362 0x5649508813f0 WARN                 nvinfer gstnvinfer.cpp:2420:gst_nvinfer_output_loop:<primary-inference> error: Internal data stream error.
0:00:52.106229284 3314362 0x5649508813f0 WARN                 nvinfer gstnvinfer.cpp:2420:gst_nvinfer_output_loop:<primary-inference> error: streaming stopped, reason not-linked (-1)
Error: gst-stream-error-quark: Internal data stream error. (1): gstnvinfer.cpp(2420): gst_nvinfer_output_loop (): /GstPipeline:pipeline0/GstNvInfer:primary-inference:
streaming stopped, reason not-linked (-1)
Frame Number= 1
Frame Number= 2
Frame Number= 3
Frame Number= 4
Frame Number= 5

Have you changed the “caps” to the proper one?

Now that you mention it, I’ve tried changing it to RGBA; I don’t understand what the function of this element is either.

GST_CAPS_FEATURES_NVMM = "memory:NVMM"

# Create a caps filter
caps = Gst.ElementFactory.make("capsfilter", "filter")
caps.set_property(
    "caps", Gst.Caps.from_string("video/x-raw(memory:NVMM), format=RGBA")
)
gstname= video/x-raw
features= <Gst.CapsFeatures object at 0x7fdf0a8f0b80 (GstCapsFeatures at 0x7fde5840fe30)>
Frame Number= 0
0:00:49.122128993 3383739 0x56090ff8bd30 WARN                 nvinfer gstnvinfer.cpp:2420:gst_nvinfer_output_loop:<primary-inference> error: Internal data stream error.
0:00:49.122149433 3383739 0x56090ff8bd30 WARN                 nvinfer gstnvinfer.cpp:2420:gst_nvinfer_output_loop:<primary-inference> error: streaming stopped, reason not-linked (-1)
Frame Number= 1
Error: gst-stream-error-quark: Internal data stream error. (1): gstnvinfer.cpp(2420): gst_nvinfer_output_loop (): /GstPipeline:pipeline0/GstNvInfer:primary-inference:
streaming stopped, reason not-linked (-1)
Frame Number= 2
Frame Number= 3
Frame Number= 4
Frame Number= 5

Please make sure you are familiar with GStreamer before you start with DeepStream.

x264enc is a software encoder, while “video/x-raw(memory:NVMM), format=RGBA” refers to the Nvidia hardware buffer. Please use “gst-inspect-1.0 x264enc” to check which capabilities the encoder can accept.
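
For reference, the same check can be done from Python; a minimal sketch that prints the sink-pad (input) capabilities of x264enc, assuming the standard PyGObject GStreamer bindings:

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

# Look up the x264enc element factory and print what its sink (input) pad accepts.
factory = Gst.ElementFactory.find("x264enc")
for template in factory.get_static_pad_templates():
    if template.direction == Gst.PadDirection.SINK:
        # Expect plain system-memory "video/x-raw" caps here, not "memory:NVMM".
        print(template.get_caps().to_string())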

Thanks, I will take a look at the documentation to know which one I should use.

Anyway, changing the format does not fix the problem; now instead of not-linked I get not-negotiated:

# Create a caps filter
caps = Gst.ElementFactory.make("capsfilter", "filter")
caps.set_property(
    "caps", Gst.Caps.from_string("video/x-raw(memory:NVMM), format=NV12")
)
features= <Gst.CapsFeatures object at 0x7fa02bf10ac0 (GstCapsFeatures at 0x7f9f6840dcb0)>
Frame Number= 0
0:00:49.660886534 3435168 0x561dc4de7ed0 WARN                 nvinfer gstnvinfer.cpp:2420:gst_nvinfer_output_loop:<primary-inference> error: Internal data stream error.
0:00:49.660898256 3435168 0x561dc4de7ed0 WARN                 nvinfer gstnvinfer.cpp:2420:gst_nvinfer_output_loop:<primary-inference> error: streaming stopped, reason not-negotiated (-4)
Error: gst-stream-error-quark: Internal data stream error. (1): gstnvinfer.cpp(2420): gst_nvinfer_output_loop (): /GstPipeline:pipeline0/GstNvInfer:primary-inference:
streaming stopped, reason not-negotiated (-4)
Frame Number= 1
Frame Number= 2
Frame Number= 3
Frame Number= 4
Frame Number= 5
Frame Number= 6

By the way, the example by default has I420, which is one of the formats that x264enc accepts.

The problem is not the format, it is the caps. “video/x-raw(memory:NVMM), format=NV12” describes an Nvidia hardware buffer; you need a software (system-memory) buffer.

It’s finally working; I deleted the “(memory:NVMM)” part from the caps.
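
For anyone landing here later, the working caps filter then looks roughly like this (a sketch based on the snippet above with “(memory:NVMM)” removed; I420 would work as well, as noted earlier):

# Create a caps filter with plain system-memory caps (no "memory:NVMM"),
# so the downstream x264enc software encoder can negotiate with it.
caps = Gst.ElementFactory.make("capsfilter", "filter")
caps.set_property(
    "caps", Gst.Caps.from_string("video/x-raw, format=NV12")
)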