nvv4l2h264enc plugin not working on Jetson Orin Nano

Please provide complete information as applicable to your setup.

**• Hardware Platform (Jetson / GPU):** Jetson Orin Nano
**• DeepStream Version:** 6.3

Good day. I have developed a DeepStream application on my dGPU desktop and started to move it to a Jetson Orin Nano, but I ran into trouble when the frames had to be encoded. The sections where the data does not have to be encoded work well, but as soon as streams need to be encoded, the Jetson Orin Nano is not able to process the frames. Here is a snippet of the code that shows how the encoder is being used:

    nvvidconv_postosd.link(queue)
    if is_aarch64():
        caps1 = Gst.Caps.from_string("video/x-raw(memory:NVMM), format=I420")
    else:
        caps1 = Gst.Caps.from_string("video/x-raw(memory:NVMM), format=RGBA")
    filter1 = Gst.ElementFactory.make("capsfilter", f"capsfilter_{streamID}")
    filter1.set_property("caps", caps1)
    pipeline.add(filter1)
    queue.link(filter1)
    queue = make_queue()
    pipeline.add(queue)    
    filter1.link(queue)
    encoder = Gst.ElementFactory.make("nvv4l2h264enc", f'encoder_{streamID}')
    encoder.set_property('bitrate', 2000000)
    # if is_aarch64():
    #    encoder.set_property("preset-level", 1)
    #    encoder.set_property("insert-sps-pps", 1)
    pipeline.add(encoder)
    queue.link(encoder)
    queue = make_queue()
    pipeline.add(queue)    
    encoder.link(queue)

    rtppay = Gst.ElementFactory.make("rtph264pay", f"rtppay_{streamID}")
    pipeline.add(rtppay)
    queue.link(rtppay)
    queue = make_queue()
    pipeline.add(queue)    
    rtppay.link(queue)   

Are there any additional configurations needed to use the “nvv4l2h264enc” plugin on the Jetson Orin?

Hi @shaun.johnson

The Jetson Orin Nano does not include hardware units for video encoding, so you can’t use nvv4l2h264enc. We wrote a blog about software-based encoding alternatives: https://www.ridgerun.com/post/jetson-orin-nano-how-to-achieve-real-time-performance-for-video-encoding
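
As a quick sanity check before changing the DeepStream application, a minimal CPU-only pipeline like the sketch below can confirm that software H.264 encoding works on the board (this assumes the x264enc element from gst-plugins-ugly is installed; the resolution and file name are only examples):

    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst

    Gst.init(None)
    # Encode a synthetic 720p test stream with the CPU-based x264enc and save it as MP4.
    pipeline = Gst.parse_launch(
        "videotestsrc num-buffers=300 ! video/x-raw,width=1280,height=720 "
        "! x264enc speed-preset=ultrafast tune=zerolatency "
        "! h264parse ! qtmux ! filesink location=test_sw_encode.mp4"
    )
    pipeline.set_state(Gst.State.PLAYING)
    # Block until the stream finishes (EOS) or an error occurs, then shut down.
    bus = pipeline.get_bus()
    bus.timed_pop_filtered(Gst.CLOCK_TIME_NONE, Gst.MessageType.EOS | Gst.MessageType.ERROR)
    pipeline.set_state(Gst.State.NULL)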


There is a page about the software encoder for your reference: Software Encode in Orin Nano — Jetson Linux Developer Guide documentation (nvidia.com)

Thank you for the links! I have reviewed them, but I am still unsure how to integrate the x264enc element into my DeepStream application… This is what I have now:

    nvvidconv_postosd = Gst.ElementFactory.make("nvvideoconvert", f"convertor_postosd_{streamID}")
    pipeline.add(nvvidconv_postosd)
    queue.link(nvvidconv_postosd)
    queue = make_queue()
    pipeline.add(queue) 
    nvvidconv_postosd.link(queue)
    if is_aarch64():
        caps1 = Gst.Caps.from_string("video/x-raw(memory:NVMM), format=I420")
    else:
        caps1 = Gst.Caps.from_string("video/x-raw(memory:NVMM), format=RGBA")
    filter1 = Gst.ElementFactory.make("capsfilter", f"capsfilter_{streamID}")
    filter1.set_property("caps", caps1)
    pipeline.add(filter1)
    queue.link(filter1)
    queue = make_queue()
    pipeline.add(queue)    
    filter1.link(queue)
    if is_aarch64():
        encoder = Gst.ElementFactory.make("x264enc", f'encoder_{streamID}')
        encoder.set_property('bitrate', 2000000)
        print(encoder)
    else:
        encoder = Gst.ElementFactory.make("nvv4l2h264enc", f'encoder_{streamID}')
        encoder.set_property('bitrate', 2000000)

    pipeline.add(encoder)
    queue.link(encoder)
    queue = make_queue()
    pipeline.add(queue)    
    encoder.link(queue)

    rtppay = Gst.ElementFactory.make("rtph264pay", f"rtppay_{streamID}")
    pipeline.add(rtppay)
    queue.link(rtppay)
    queue = make_queue()
    pipeline.add(queue)    
    rtppay.link(queue)   
    
    sink = Gst.ElementFactory.make("udpsink", f"udpsink_{streamID}")
    sink.set_property('host', '224.224.255.255') #Subnet mask
    sink.set_property('port', 5400)
    sink.set_property('async', False)
    sink.set_property('sync', 1)
    sink.set_property("qos", 0)

I have used the “gst-inspect-1.0 x264enc” command to analyze the inputs and outputs of the x264enc element, but I do not know the DeepStream elements and plugins well enough to work out how to incorporate the “x264enc” element into my pipeline… Eventually I want to send the streams to another device over an IP network. Is there an example that I can follow, or a plugin that will work well for this use case?

You may refer to the sample here: GStreamer Pipeline Samples #GStreamer · GitHub
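
The main change compared to your nvv4l2h264enc branch is that x264enc consumes buffers in system memory, so the caps after nvvideoconvert should be plain video/x-raw (without memory:NVMM), and an h264parse is usually placed between the encoder and rtph264pay. A rough sketch of how that section of the aarch64 branch could look (streamID, pipeline and nvvidconv_postosd come from your snippet; queues are omitted for brevity and the property values are only examples):

    # x264enc works on system memory, so request plain video/x-raw after nvvideoconvert.
    caps_sw = Gst.Caps.from_string("video/x-raw, format=I420")
    filter_sw = Gst.ElementFactory.make("capsfilter", f"capsfilter_{streamID}")
    filter_sw.set_property("caps", caps_sw)

    encoder = Gst.ElementFactory.make("x264enc", f"encoder_{streamID}")
    # Note: x264enc's 'bitrate' is in kbit/s, while nvv4l2h264enc's is in bit/s.
    encoder.set_property("bitrate", 2000)            # roughly 2 Mbit/s
    encoder.set_property("speed-preset", "ultrafast")
    encoder.set_property("tune", "zerolatency")

    parser = Gst.ElementFactory.make("h264parse", f"parser_{streamID}")
    rtppay = Gst.ElementFactory.make("rtph264pay", f"rtppay_{streamID}")

    for element in (filter_sw, encoder, parser, rtppay):
        pipeline.add(element)
    nvvidconv_postosd.link(filter_sw)
    filter_sw.link(encoder)
    encoder.link(parser)
    parser.link(rtppay)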

Thank you for the link. I have tried to adjust the pipeline (I am using the Python bindings) so that it looks similar to the examples in the link. The pipeline is now processing the stream, but I get this warning message:
“gstnvtracker: Unable to acquire a user meta buffer. Try increasing user-meta-pool-size”
And when I try to access the stream through VLC, I get these messages:

sys:1: Warning: g_object_get_is_valid_property: object class ‘GstUDPSrc’ has no property named ‘pt’
0:00:10.950194150 8985 0x29c71ea0 WARN udpsrc gstudpsrc.c:1445:gst_udpsrc_open: warning: Could not create a buffer of requested 524288 bytes (Operation not permitted). Need net.admin privilege?
0:00:10.950264739 8985 0x29c71ea0 WARN udpsrc gstudpsrc.c:1455:gst_udpsrc_open: have udp buffer of 212992 bytes while 524288 were requested
0:00:10.950710734 8985 0x29c71ea0 WARN rtspmedia rtsp-media.c:3014:default_handle_message: 0xffff280789e0: got warning Could not get/set settings from/on resource. (gstudpsrc.c(1445): gst_udpsrc_open (): /GstPipeline:media-pipeline/GstBin:bin0/GstUDPSrc:pay0:
Could not create a buffer of requested 524288 bytes (Operation not permitted). Need net.admin privilege?)
0:00:10.981546207 8985 0x1c060e40 WARN rtspstream rtsp-stream.c:4124:gst_rtsp_stream_get_rtpinfo: Could not get payloader stats
0:00:10.981649722 8985 0x1c060e40 FIXME rtspmedia rtsp-media.c:4201:gst_rtsp_media_suspend: suspend for dynamic pipelines needs fixing
0:00:10.982509810 8985 0x1c060e40 FIXME rtspmedia rtsp-media.c:4201:gst_rtsp_media_suspend: suspend for dynamic pipelines needs fixing
0:00:10.982538001 8985 0x1c060e40 WARN rtspmedia rtsp-media.c:4227:gst_rtsp_media_suspend: media 0xffff280789e0 was not prepared
0:00:10.985579685 8985 0x1c060e40 FIXME rtspclient rtsp-client.c:1818:handle_play_request:GstRTSPClient@0xe6f2cf0 Add support for seek style (null)
0:00:10.985743037 8985 0x1c060e40 FIXME rtspmedia rtsp-media.c:2711:gst_rtsp_media_seek_full:GstRTSPMedia@0xffff280789e0 Handle going back to 0 for none live not seekable streams.
0:00:10.985815418 8985 0xfffe8c058a40 WARN basesink gstbasesink.c:1209:gst_base_sink_query_latency: warning: Pipeline construction is invalid, please add queues.
0:00:10.985849336 8985 0xfffe8c058a40 WARN basesink gstbasesink.c:1209:gst_base_sink_query_latency: warning: Not enough buffering available for the processing deadline of 0:00:00.020000000, add enough queues to buffer 0:00:00.020000000 additional data. Shortening processing latency to 0:00:00.000000000.
0:00:10.985930388 8985 0x29c71ea0 WARN rtspmedia rtsp-media.c:3014:default_handle_message: 0xffff280789e0: got warning Pipeline construction is invalid, please add queues. (gstbasesink.c(1209): gst_base_sink_query_latency (): /GstPipeline:media-pipeline/GstMultiUDPSink:multiudpsink0:
Not enough buffering available for the processing deadline of 0:00:00.020000000, add enough queues to buffer 0:00:00.020000000 additional data. Shortening processing latency to 0:00:00.000000000.)
count = 55
0:00:11.012993523 8985 0x1c060e40 WARN basesink gstbasesink.c:1209:gst_base_sink_query_latency: warning: Pipeline construction is invalid, please add queues.
0:00:11.014000613 8985 0x1c060e40 WARN basesink gstbasesink.c:1209:gst_base_sink_query_latency: warning: Not enough buffering available for the processing deadline of 0:00:00.020000000, add enough queues to buffer 0:00:00.020000000 additional data. Shortening processing latency to 0:00:00.000000000.
0:00:11.014284375 8985 0x29c71ea0 WARN rtspmedia rtsp-media.c:3014:default_handle_message: 0xffff280789e0: got warning Pipeline construction is invalid, please add queues. (gstbasesink.c(1209): gst_base_sink_query_latency (): /GstPipeline:media-pipeline/GstMultiUDPSink:multiudpsink0:
Not enough buffering available for the processing deadline of 0:00:00.020000000, add enough queues to buffer 0:00:00.020000000 additional data. Shortening processing latency to 0:00:00.000000000.)
0:00:11.040716115 8985 0x1c060e40 ERROR rtspclient rtsp-client.c:1439:handle_teardown_request: client 0xe6f2cf0: no aggregate path /stream-0/stream=0

Here is the section of code that creates the server:

    encoder.link(queue)

    rtppay = Gst.ElementFactory.make("rtph264pay", f"rtppay_{streamID}")
    pipeline.add(rtppay)
    queue.link(rtppay)
    queue = make_queue()
    pipeline.add(queue)    
    rtppay.link(queue)   
    
    sink = Gst.ElementFactory.make("udpsink", f"udpsink_{streamID}")
    sink.set_property('host', '224.224.255.255') #Subnet mask
    sink.set_property('port', 5400)
    sink.set_property('async', False)
    sink.set_property('sync', 1)
    sink.set_property("qos", 0)
    pipeline.add(sink)
    queue.link(sink)
    codec='H264'
    updsink_port_num=5400
    rtsp_port_num=8554    
    server = GstRtspServer.RTSPServer.new()
    server.props.service = "%d" % rtsp_port_num
    server.attach(None)
    factory = GstRtspServer.RTSPMediaFactory.new()
    factory.set_launch( "( udpsrc name=pay0 port=%d buffer-size=524288 caps=\"application/x-rtp, media=video, clock-rate=90000, encoding-name=(string)%s, payload=96 \" )" % (updsink_port_num, codec))        
    factory.set_shared(True)
    server.get_mount_points().add_factory(f"/stream-{streamID}", factory)
    print(f"Stream {streamID} is reachable with this URL: rtsp://IP:port/stream-{streamID}")
    print('Then streaming service should be set up within the next 5 seconds')

Can you share the whole picture of your pipeline?
Does this pipeline work on your desktop but not on your Jetson Orin Nano?
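
If it helps, one way to capture a picture of the running pipeline is GStreamer's dot-file dump, roughly as sketched below (this assumes Gst is already imported in your application and that pipeline is your Gst.Pipeline object; /tmp is just an example output directory):

    import os
    os.environ["GST_DEBUG_DUMP_DOT_DIR"] = "/tmp"   # must be set before Gst.init() is called

    # ... once the pipeline has reached PLAYING ...
    # Writes /tmp/pipeline.dot; render it with: dot -Tpng /tmp/pipeline.dot -o pipeline.png
    Gst.debug_bin_to_dot_file(pipeline, Gst.DebugGraphDetails.ALL, "pipeline")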

Can you try deepstream-app with the config set to the software encoder? If deepstream-app works fine on your side, you can refer to deepstream-app for your Python application.

  #encoder type 0=Hardware 1=Software
  enc-type: 0
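
For reference, a sink group in the deepstream-app config could look roughly like this to stream over RTSP with the software encoder (a sketch based on the reference config files; the ports and bitrate are only examples and should be adjusted to your setup):

    sink0:
      enable: 1
      # 1=FakeSink 2=EglSink 3=File 4=RTSPStreaming
      type: 4
      # 1=h264 2=h265
      codec: 1
      # encoder type 0=Hardware 1=Software
      enc-type: 1
      bitrate: 2000000
      rtsp-port: 8554
      udp-port: 5400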

Here is the image of the pipeline

I have made these changes to the configuration of the encoder:

    if is_aarch64():
        caps1 = Gst.Caps.from_string("video/x-raw, format=I420")
        filter1 = Gst.ElementFactory.make("capsfilter", f"capsfilter_{streamID}")
        filter1.set_property("caps", caps1)
        pipeline.add(filter1)
        queue.link(filter1)
        queue = make_queue()
        pipeline.add(queue)    
        filter1.link(queue)
    if is_aarch64():
        encoder = Gst.ElementFactory.make("x264enc", f'encoder_{streamID}')
        encoder.set_property('bitrate', 790000)
        encoder.set_property('speed-preset', 'ultrafast')
        encoder.set_property('tune', 'zerolatency')

The IP camera that I am using is a 4 MP camera, so the frames are very large. I decreased the frame size to 720p, and after that this message appears less often (it does still appear frequently):

“gstnvtracker: Unable to acquire a user meta buffer. Try increasing user-meta-pool-size”

However, I still cannot access the stream that is transmitted from the Jetson Orin Nano… When I connect my computer to the same network as the Jetson Orin Nano and use VLC to open the stream (rtsp://IP:8554/stream-0, where IP is the IP address of the Jetson Orin Nano), the connection seems to be made, but all I see is a black screen…

This is the message that appears in the terminal when I connect to the IP stream provided by the Jetson Orin Nano:

0:01:02.994146630 6527 0xffff5c314580 WARN rtspstream rtsp-stream.c:4124:gst_rtsp_stream_get_rtpinfo: Could not get payloader stats
0:01:02.994504211 6527 0xffff5c314580 FIXME rtspmedia rtsp-media.c:4201:gst_rtsp_media_suspend: suspend for dynamic pipelines needs fixing
count = 356
0:01:03.033235646 6527 0xffff5c314580 FIXME rtspmedia rtsp-media.c:4201:gst_rtsp_media_suspend: suspend for dynamic pipelines needs fixing
0:01:03.033458950 6527 0xffff5c314580 WARN rtspmedia rtsp-media.c:4227:gst_rtsp_media_suspend: media 0xffff4c08a920 was not prepared
count = 357
count = 358
count = 359
count = 360
0:01:03.141684610 6527 0xffff5c314580 FIXME rtspclient rtsp-client.c:1818:handle_play_request:GstRTSPClient@0xfffe2c004a80 Add support for seek style (null)
0:01:03.141802662 6527 0xffff5c314580 FIXME rtspmedia rtsp-media.c:2711:gst_rtsp_media_seek_full:GstRTSPMedia@0xffff4c08a920 Handle going back to 0 for none live not seekable streams.
0:01:03.151605261 6527 0xfffe20058300 WARN basesink gstbasesink.c:1209:gst_base_sink_query_latency: warning: Pipeline construction is invalid, please add queues.
0:01:03.151668207 6527 0xfffe20058300 WARN basesink gstbasesink.c:1209:gst_base_sink_query_latency: warning: Not enough buffering available for the processing deadline of 0:00:00.020000000, add enough queues to buffer 0:00:00.020000000 additional data. Shortening processing latency to 0:00:00.000000000.
0:01:03.151789172 6527 0xfffe20056800 WARN rtspmedia rtsp-media.c:3014:default_handle_message: 0xffff4c08a920: got warning Pipeline construction is invalid, please add queues. (gstbasesink.c(1209): gst_base_sink_query_latency (): /GstPipeline:media-pipeline/GstMultiUDPSink:multiudpsink0:
Not enough buffering available for the processing deadline of 0:00:00.020000000, add enough queues to buffer 0:00:00.020000000 additional data. Shortening processing latency to 0:00:00.000000000.)

(The “count = 357…” lines are print statements from the program indicating that a new frame was processed.)

If you put the output of x264enc into a file, does the file have the correct content?
You can refer to the x264enc/h264parse/qtmux/filesink example on the page above to save the output to an MP4 file.
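
In your Python pipeline that could look roughly like the sketch below, temporarily replacing the rtph264pay/udpsink tail after the encoder (streamID, pipeline and encoder come from your code; the output path is just an example):

    # Temporary debug tail: write the encoded H.264 stream to an MP4 file
    # instead of payloading it for RTP/UDP.
    parser = Gst.ElementFactory.make("h264parse", f"parser_{streamID}")
    muxer = Gst.ElementFactory.make("qtmux", f"muxer_{streamID}")
    filesink = Gst.ElementFactory.make("filesink", f"filesink_{streamID}")
    filesink.set_property("location", f"/tmp/encoder_check_{streamID}.mp4")  # example path
    filesink.set_property("sync", False)

    for element in (parser, muxer, filesink):
        pipeline.add(element)
    encoder.link(parser)
    parser.link(muxer)
    muxer.link(filesink)
    # Remember to send EOS before stopping the pipeline so qtmux can finalize the file.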

Yes, it does. I have set the pipeline to save the output, and it works well. The problem is when the stream is sent over the network: in that case the stream is very blurry. I have also tried different speed-preset values (speed-preset is one of the properties that you can set on the encoder), and I have set the ‘pass’ property to ‘pass1’ and the ‘tune’ property to ‘zerolatency’.

Currently, the stream is most blurry when there is movement in the camera footage, and it then takes a while to become sharp again… What could be the cause of this?

Can you check the output of “sudo tegrastats” while the program is streaming?

This is the output when the program is streaming:

11-09-2023 13:07:57 RAM 5001/6481MB (lfb 6x1MB) SWAP 247/3240MB (cached 0MB) CPU [56%@1510,48%@1510,51%@1510,52%@1510,50%@1512,60%@1510] EMC_FREQ 11%@2133 GR3D_FREQ 0%@[624,0] NVDEC 524 VIC_FREQ 0%@115 APE 200 CV0@-256C CPU@51.562C SOC2@50.531C SOC0@49.468C CV1@-256C GPU@50.125C tj@51.562C SOC1@49.468C CV2@-256C VDD_IN 7584mW/6386mW VDD_CPU_GPU_CV 3004mW/2170mW VDD_SOC 1805mW/1603mW

Is this message: “gstnvtracker: Unable to acquire a user meta buffer. Try increasing user-meta-pool-size” something to be worried about? Could it be the reason for the distorted stream?

Since your stream is “blurry”, please provide your x264enc properties and parameters. What is the video resolution?

There has been no update from you for a while, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks.

Can you capture a video to show us how blurry the RTSP stream is?

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.