I want to decode H264 NAL unit data to still image data

Hi.

Thank you for answering.
I think NvVideoDecoder processes the following.
Is this correct?

  • NvVideoDecoder has two buffer queues: an “output plane buffer” and a “capture plane buffer”.
  • NAL unit data is enqueued one after another in the “output plane buffer”.
  • NvVideoDecoder accumulates NAL unit data and executes decoding when it becomes decodable.
  • NvVideoDecoder queues the decoding results (YUV data) to the “capture plane buffer”.

And the job of the application is as follows.

  • Queue NAL unit data in order to NvVideoDecoder.
  • Before queuing, be sure to dequeue the previously queued data.
  • In a subthread, monitor the capture plane buffer until data can be dequeued.
  • Once dequeued, save it to a file.
  • Continue monitoring.

And my question is about the following two points.

  • Queue NAL unit data in order to NvVideoDecoder.
  • Before queuing, be sure to dequeue the previously queued data.

Why is the above “dequeue processing” necessary?

thank you.

Hi,
The allocated buffers are re-used. So after dequeuing a buffer, please enqueue it back to receive the next frame. This behavior is the same on both the capture and output planes.

Hi.

Thank you for answering.
My understanding of NvVideoDecoder has deepened.

I am using NvVideoDecoder to decode NAL unit data to RGBA data.
I feel that NvVideoDecoder has a very sophisticated interface.

On the other hand, I would like to implement the decoding process without using the Jetson Multimedia API.
In other words, I would like to implement a similar decoding process using “GStreamer” instead of the “Jetson Multimedia API”.

You said at the beginning of this thread.

For a quick solution, please consider use gstreamer:
Accelerated GStreamer — Jetson Linux Developer Guide documentation 1

Even looking at this guide, I see nothing like the video_decode sample code.
Does GStreamer also have an object like NvVideoDecoder?

My requests are as follows.

  • The application passes NAL unit data to GStreamer one after another.
  • GStreamer decodes when enough data is available (generating RGBA data).
  • The application obtains the RGBA data in a subthread, etc.

thank you.

Hi,
The GStreamer plugins are implemented on top of jetson_multimedia_api. The plugins are public; you can download the source code and check:
https://developer.nvidia.com/embedded/jetson-linux-r3541

Driver Package (BSP) Sources

For your use case, we would suggest using jetson_multimedia_api. If you prefer GStreamer, you can implement a pipeline like:

appsrc ! h264parse ! nvv4l2decoder ! nvvidconv ! video/x-raw,format=RGBA ! appsink

In this pipeline, you can feed the h264 stream to appsrc and get RGBA data from appsink.

Please refer to the sample demonstrating appsrc and appsink:
Latency issue: nvv4l2h265enc accumulates four images before releasing the first - #3 by DaneLLL

Hi,

Please go through the page:
JetPack EULA :: NVIDIA JetPack Documentation

Please let us know if you have any concern. Thanks.

Hi.

I wrote a program to decode NAL unit data into still image data using GStreamer.
But it didn’t work.
Specifically, the new_sample callback is never called.

The outline of the program is as follows.

  • init(): Performs Gstreamer initialization processing.
  • sendH264Data(): Reads an H264 file containing enough NAL unit data to decode one still image, and pushes it with push-buffer.
  • Start Gstreamer main loop

That’s all.
I expected that when I started the GStreamer main loop, the decoding process would run immediately and the new_sample callback would be called.
But that didn’t happen.
I set a breakpoint in the new_sample function, but it never stopped.
When I ran almost the same code on Windows, it worked correctly.

The code is below.

#include <gst/gst.h>
#include <gst/app/gstappsrc.h>
#include <gst/app/gstappsink.h>
#include <sstream>

using namespace std;

GstElement* pipeline;
GstElement* appsrc;
GstElement* appsink;
GMainLoop* loop;


/**
 * new_sample callback
 * 
 */
static GstFlowReturn new_sample(GstAppSink *appsink, gpointer user_data)
{
	//--------------------
	// can't reach here !!
	//--------------------
	
    return GST_FLOW_OK;
}

/**
 * Read H264 file and send to Gstreamer
 */
bool sendH264Data() {

    GError* error = NULL;

    // File open (fixed test path)
    const char filename[] = "/home/ubuntu/work/testMovie/test_head12.h264";
    FILE* h264File = fopen(filename, "rb");
    if (h264File == NULL) {
        return false;
    }

    // Get file size
    fseek(h264File, 0, SEEK_END);
    size_t h264DataSize = ftell(h264File);
    rewind(h264File);

    // Read the whole file. Allocate to the file size: a fixed 16 KB array
    // could overflow on larger inputs.
    uint8_t* naluBuf = (uint8_t*)g_malloc(h264DataSize);
    if (fread(naluBuf, 1, h264DataSize, h264File) != h264DataSize) {
        fclose(h264File);
        g_free(naluBuf);
        return false;
    }
    fclose(h264File);

    // push-buffer: gst_buffer_new_wrapped takes ownership of naluBuf and
    // frees it with g_free when the buffer is released.
    GstBuffer* buffer = gst_buffer_new_wrapped(naluBuf, h264DataSize);
    g_signal_emit_by_name(appsrc, "push-buffer", buffer, &error);
    gst_buffer_unref(buffer); // the action signal does not take our reference
    if (error != NULL) {
        return false;
    }
    
    return true;
}

/**
 * initialization function
 */
bool init() {
    gst_init(NULL, NULL);
    loop = g_main_loop_new(NULL, FALSE);

    GError* error = NULL;
	pipeline = gst_parse_launch("appsrc name=mysource ! decodebin ! videoconvert ! video/x-raw,format=RGBA ! appsink name=mysink", &error);
    if (error) {
        return false;
    }

    appsrc = gst_bin_get_by_name(GST_BIN(pipeline), "mysource");
    appsink = gst_bin_get_by_name(GST_BIN(pipeline), "mysink");
    if (appsrc == NULL || appsink == NULL) {
        return false;
    }

    // Register a callback when a new sample arrives
    GstAppSinkCallbacks callbacks = { NULL, NULL, new_sample, NULL };
    gst_app_sink_set_callbacks(GST_APP_SINK(appsink), &callbacks, NULL, NULL);

    gst_element_set_state(pipeline, GST_STATE_READY);
    gst_element_set_state(pipeline, GST_STATE_PLAYING);

    return true;
}


/**
 * main function
 * 
 */
int main(int argc, char** argv)
{
    // Initialization
    if (init() == false)
    {
        return -1;
    }
	
    // Read file and send data to Gstreamer
    if (sendH264Data() == false)
    {
        gst_element_set_state(pipeline, GST_STATE_NULL);
        gst_object_unref(GST_OBJECT(pipeline));
        return -1;
    }
	
    // Start Gstreamer main loop
	g_main_loop_run(loop);

    // clean up
    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(GST_OBJECT(pipeline));
    g_main_loop_unref(loop);

    g_print("going to exit \n");
    return 0;
}

What am I doing wrong?

thank you.

Hi,
You can run the pipeline in gst-launch-1.0 first to make sure it works:

$ gst-launch-1.0 filesrc location=<your .h264 file> ! decodebin ! videoconvert ! video/x-raw,format=RGBA ! fakesink

appsrc and appsink are not well supported in gst-launch-1.0, so for this test replace them with filesrc and fakesink.

For further debugging, we suggest replacing decodebin with h264parse ! nvv4l2decoder ! nvvidconv, to try linking the plugins one by one.

Hi.
Thank you for your advice.

You can run the pipeline in gst-launch-1.0 first to make sure it works:

The execution results are as follows. It looks like the decoding is working correctly.

ubuntu@linux:~$ gst-launch-1.0 filesrc location=/home/ubuntu/work/testMovie/test_head12.h264 ! decodebin ! videoconvert ! video/x-raw,format=RGBA ! fakesink
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
Redistribute latency...
Redistribute latency...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Got EOS from element "pipeline0".
Execution ended after 0:00:00.139740096
Setting pipeline to NULL ...
Freeing pipeline ...

For further debugging, we suggest replace decodebin with h264parse ! nvv4l2decoder ! nvvidconv. To link the plugins one by one for a try.

	pipeline = gst_parse_launch("appsrc name=mysource ! h264parse ! nvv4l2decoder ! nvvidconv ! videoconvert ! video/x-raw,format=RGBA ! appsink name=mysink", &error);
    if (error) {
        return false;
    }

When I executed it as above, error contained the following message.

no element "nvv4l2decoder"

I checked from the terminal.

ubuntu@linux:~$ gst-inspect-1.0 | grep nvv
ubuntu@linux:~$

I think “nvv4l2decoder” is not installed.
How can I install it?

thank you.

Hi,
Please clean the cache and check if it appears:

$ rm .cache/gstreamer-1.0/registry.aarch64.bin

If it is still absent, we would suggest re-flashing the system. It seems all the NVIDIA plugins are missing from your system.

Hi.

I ran rm.

ubuntu@linux:~$ rm .cache/gstreamer-1.0/registry.aarch64.bin 
ubuntu@linux:~$ gst-inspect-1.0 | grep nvv

(gst-plugin-scanner:78903): GStreamer-WARNING **: 18:07:52.389: Failed to load plugin '/usr/lib/aarch64-linux-gnu/gstreamer-1.0/libcluttergst3.so': /lib/aarch64-linux-gnu/libGLdispatch.so.0: cannot allocate memory in static TLS block

(gst-plugin-scanner:78903): GStreamer-WARNING **: 18:07:52.393: Failed to load plugin '/usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstnvcompositor.so': /lib/aarch64-linux-gnu/libGLdispatch.so.0: cannot allocate memory in static TLS block

(gst-plugin-scanner:78903): GStreamer-WARNING **: 18:07:52.397: Failed to load plugin '/usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstnvvidconv.so': /lib/aarch64-linux-gnu/libGLdispatch.so.0: cannot allocate memory in static TLS block

(gst-plugin-scanner:78903): GStreamer-WARNING **: 18:07:52.432: Failed to load plugin '/usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstqmlgl.so': /lib/aarch64-linux-gnu/libGLdispatch.so.0: cannot allocate memory in static TLS block

(gst-plugin-scanner:78903): GStreamer-WARNING **: 18:07:52.444: Failed to load plugin '/usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstnvvideo4linux2.so': /lib/aarch64-linux-gnu/libGLdispatch.so.0: cannot allocate memory in static TLS block

(gst-plugin-scanner:78903): GStreamer-WARNING **: 18:07:52.450: Failed to load plugin '/usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstnveglglessink.so': /lib/aarch64-linux-gnu/libGLdispatch.so.0: cannot allocate memory in static TLS block

(gst-plugin-scanner:78903): GStreamer-WARNING **: 18:07:52.463: Failed to load plugin '/usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstnvv4l2camerasrc.so': /lib/aarch64-linux-gnu/libGLdispatch.so.0: cannot allocate memory in static TLS block

(gst-plugin-scanner:78903): GStreamer-WARNING **: 18:07:52.468: Failed to load plugin '/usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstgtk.so': /lib/aarch64-linux-gnu/libGLdispatch.so.0: cannot allocate memory in static TLS block

(gst-plugin-scanner:78903): GStreamer-WARNING **: 18:07:52.688: Failed to load plugin '/usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstnvegltransform.so': /lib/aarch64-linux-gnu/libGLdispatch.so.0: cannot allocate memory in static TLS block

(gst-plugin-scanner:78903): GStreamer-WARNING **: 18:07:52.701: Failed to load plugin '/usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstnvjpeg.so': /lib/aarch64-linux-gnu/libGLdispatch.so.0: cannot allocate memory in static TLS block

(gst-plugin-scanner:78903): GStreamer-WARNING **: 18:07:52.726: Failed to load plugin '/usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstnvarguscamerasrc.so': /lib/aarch64-linux-gnu/libGLdispatch.so.0: cannot allocate memory in static TLS block

(gst-plugin-scanner:78903): GStreamer-WARNING **: 18:07:52.771: Failed to load plugin '/usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstopengl.so': /lib/aarch64-linux-gnu/libGLdispatch.so.0: cannot allocate memory in static TLS block

(gst-plugin-scanner:78903): GStreamer-WARNING **: 18:07:52.798: Failed to load plugin '/usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstnvvideosink.so': /lib/aarch64-linux-gnu/libGLdispatch.so.0: cannot allocate memory in static TLS block

(gst-plugin-scanner:78903): GStreamer-WARNING **: 18:07:52.810: Failed to load plugin '/usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstnvvideosinks.so': /lib/aarch64-linux-gnu/libGLdispatch.so.0: cannot allocate memory in static TLS block

(gst-plugin-scanner:78903): GStreamer-WARNING **: 18:07:52.834: Failed to load plugin '/usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstnveglstreamsrc.so': /lib/aarch64-linux-gnu/libGLdispatch.so.0: cannot allocate memory in static TLS block

(gst-plugin-scanner:78903): GStreamer-WARNING **: 18:07:52.846: Failed to load plugin '/usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstnvivafilter.so': /lib/aarch64-linux-gnu/libGLdispatch.so.0: cannot allocate memory in static TLS block
ubuntu@linux:~$ gst-inspect-1.0 | grep nvv
ubuntu@linux:~$

Warnings occurred when I inspected immediately after running rm.
What are these warnings?

When I inspected it again, nothing was found (and no errors).

If it is still absent, we would suggest re-flash the system. It seems like all NVIDIA plugins are missing in your system.

Hmm, I’m in trouble.
Reflashing the system is too disruptive to be possible.
Is it not possible to install only the NVIDIA plugins?

thank you.

Hi,
It is strange that the files are missing. If you would like to use GStreamer, we would suggest re-flashing the system.

As a workaround, you can locate the libs and copy them to your system:
https://docs.nvidia.com/jetson/archives/r35.4.1/DeveloperGuide/text/SD/Multimedia/AcceleratedGstreamer.html#to-build-gstreamer-manually

The libs are listed in section 13.

Hi.

thank you for the advice.
I noticed one thing.
When I switched to a different user and ran gst-inspect, the plugin was found.

ubuntu@linux:~/Desktop$ gst-inspect-1.0 | grep nvv
nvvideosinks:  nv3dsink: Nvidia 3D sink
nvvideosink:  nvvideosink: nVidia Video Sink
nvv4l2camerasrc:  nvv4l2camerasrc: NvV4l2CameraSrc
nvvideo4linux2:  nvv4l2av1enc: V4L2 AV1 Encoder
nvvideo4linux2:  nvv4l2vp9enc: V4L2 VP9 Encoder
nvvideo4linux2:  nvv4l2h265enc: V4L2 H.265 Encoder
nvvideo4linux2:  nvv4l2h264enc: V4L2 H.264 Encoder
nvvideo4linux2:  nvv4l2decoder: NVIDIA v4l2 video decoder
nvvidconv:  nvvidconv: NvVidConv Plugin

My Orin has two users: “ubuntu” and “oss”.
For ubuntu, the plugins can be found as shown above.
With oss, the plugins cannot be found.

How can I make the plugin visible in oss as well?

thank you.

Hi,
We are uncertain why this happens. We will see if other users can share their experience.

Generally, a default user is created on first boot, and that user can run gstreamer commands to use our plugins.

Hi.

I don’t know if it’s related, but my Orin did the following:

  • Originally there was an “ubuntu” user.
  • I added the “oss” user.
  • I enabled “automatic login” for the oss user.
  • After that, the Orin no longer started.

I understand that this issue is known.
I recovered my system after this happened.
I ran the commands below to avoid the problem happening again.

$ sudo apt update
$ sudo apt install --reinstall gdm3 ubuntu-desktop gnome-shell

After that I added the oss user.
As a precaution, the automatic logon settings have not been changed.

Hi.

The code below is simple.
An h264 file is specified in the pipeline string.
When executed, new_buffer is called. This means that decoding is performed.

I would like to read the h264 file in the program and supply it to Gstreamer instead of specifying it in the pipeline.
How should I modify it?

You should just need to change the pipeline string from "filesrc location=test.h264" to "appsrc name=mysource".
You should be able to decode without using "nvv4l2decoder".
pipeline = gst_parse_launch("appsrc name=mysource ! decodebin ! videoconvert ! video/x-raw,format=RGB ! appsink name=mysink", &error);


#include <gst/gst.h>
#include <gst/app/gstappsrc.h>
#include <gst/app/gstappsink.h>
#include <stdint.h>
#include <fstream>

GstElement* pipeline;
GstElement* appsink;

static GstFlowReturn new_buffer(GstAppSink* sink, gpointer user_data)
{
	// Get the decoding result and save it to a bitmap file

    return GST_FLOW_OK;
}

int main(int argc, char* argv[])
{
    gst_init(&argc, &argv);
    GError* error = NULL;

    // Building a pipeline
    pipeline = gst_parse_launch("filesrc location=test.h264 ! decodebin ! videoconvert ! video/x-raw,format=RGB ! appsink name=mysink", &error);
    if (error != NULL) {
        return -1;
    }

    // Get AppSink element
    appsink = gst_bin_get_by_name(GST_BIN(pipeline), "mysink");
    if (appsink == NULL) {
        return -1;
    }

    // Register a callback when a new sample arrives
    GstAppSinkCallbacks callbacks = {NULL, NULL, new_buffer, NULL};
    gst_app_sink_set_callbacks(GST_APP_SINK(appsink), &callbacks, NULL, NULL);

    // Pipeline execution
    GstStateChangeReturn ret = gst_element_set_state(pipeline, GST_STATE_PLAYING);
    if (ret == GST_STATE_CHANGE_FAILURE) {
        return -1;
    }

    // Start of main loop
    GMainLoop* loop = g_main_loop_new(NULL, FALSE);
    g_main_loop_run(loop);

    // Pipeline cleanup
    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(pipeline);

    g_main_loop_unref(loop);

    return 0;
}

Hi,
Please refer to feed_function() in the sample:
Latency issue: nvv4l2h265enc accumulates four images before releasing the first - #3 by DaneLLL

And modify it to feed 128 or 256 KB of the h264 stream in each call. It should work fine.

Hi,
Please try the sample:

#include <cstdlib>
#include <gst/gst.h>
#include <gst/gstinfo.h>
#include <gst/app/gstappsrc.h>
#include <gst/app/gstappsink.h>
#include <glib-unix.h>
#include <dlfcn.h>

#include <cstring>
#include <iostream>
#include <sstream>
#include <thread>

using namespace std;

#define USE(x) ((void)(x))

static GstPipeline *gst_pipeline = nullptr;
static string launch_string;
static GstElement *appsrc_;

GstClockTime timestamp = 0;
static guint size = 131072;
static FILE *fp = nullptr;

static void appsink_eos(GstAppSink * appsink, gpointer user_data)
{
    printf("app sink receive eos\n");
}

static GstFlowReturn new_buffer(GstAppSink *appsink, gpointer user_data)
{
    GstSample *sample = NULL;

    g_signal_emit_by_name (appsink, "pull-sample", &sample,NULL);

    if (sample)
    {
        GstBuffer *buffer = NULL;
        GstCaps   *caps   = NULL;
        GstMapInfo map    = {0};

        caps = gst_sample_get_caps (sample);
        if (!caps)
        {
            printf("could not get snapshot format\n");
        }
        gst_caps_get_structure (caps, 0);
        buffer = gst_sample_get_buffer (sample);
        gst_buffer_map (buffer, &map, GST_MAP_READ);

        printf("map.size = %lu\n", map.size);

        gst_buffer_unmap(buffer, &map);

        gst_sample_unref (sample);
    }
    else
    {
        g_print ("could not make snapshot\n");
    }

    return GST_FLOW_OK;
}

static gboolean feed_function(gpointer user_data) {
    GstBuffer *buffer;
    GstFlowReturn ret;
    GstMapInfo map = {0};
    size_t read_size = 0;

    buffer = gst_buffer_new_allocate (NULL, size, NULL);
    buffer->pts = timestamp;

    gst_buffer_map (buffer, &map, GST_MAP_WRITE);
    read_size = fread(map.data, 1, size, fp); // bytes read; keeps a partial last chunk
    gst_buffer_unmap(buffer, &map);
    gst_buffer_set_size(buffer, read_size);   // shrink the buffer to what was read

    g_signal_emit_by_name (appsrc_, "push-buffer", buffer, &ret);
    gst_buffer_unref(buffer);

    timestamp += 66666666;
    printf("fed one buffer \n");
    if (read_size == 0)
        return G_SOURCE_REMOVE;
    return G_SOURCE_CONTINUE;
}

int main(int argc, char** argv) {
    USE(argc);
    USE(argv);

    gst_init (&argc, &argv);

    GMainLoop *main_loop;
    main_loop = g_main_loop_new (NULL, FALSE);
    ostringstream launch_stream;
    GstAppSinkCallbacks callbacks = {appsink_eos, NULL, new_buffer};

    launch_stream
    << "appsrc name=mysource ! "
    << "h264parse ! nvv4l2decoder ! "
    << "nvvidconv ! video/x-raw,format=NV12 ! "
    << "appsink name=mysink sync=0 ";

    launch_string = launch_stream.str();

    g_print("Using launch string: %s\n", launch_string.c_str());

    GError *error = nullptr;
    gst_pipeline  = (GstPipeline*) gst_parse_launch(launch_string.c_str(), &error);

    if (gst_pipeline == nullptr) {
        g_print( "Failed to parse launch: %s\n", error->message);
        return -1;
    }
    if(error) g_error_free(error);

    appsrc_ = gst_bin_get_by_name(GST_BIN(gst_pipeline), "mysource");
    gst_app_src_set_stream_type(GST_APP_SRC(appsrc_), GST_APP_STREAM_TYPE_STREAM);

    fp = fopen ("/usr/src/jetson_multimedia_api/data/Video/sample_outdoor_car_1080p_10fps.h264", "rb");
    if (fp == nullptr) {
        g_print("failed to open input file\n");
        return -1;
    }
    
    GstElement *appsink_ = gst_bin_get_by_name(GST_BIN(gst_pipeline), "mysink");
    gst_app_sink_set_callbacks (GST_APP_SINK(appsink_), &callbacks, NULL, NULL);

    gst_element_set_state((GstElement*)gst_pipeline, GST_STATE_PLAYING); 

    gboolean ret = G_SOURCE_CONTINUE;
    while (ret == G_SOURCE_CONTINUE) {
        ret = feed_function(nullptr);
        usleep(66666);
    }
    gst_app_src_end_of_stream(GST_APP_SRC(appsrc_));
    g_print("wait for EoS \n");
    GstMessage *msg;
    GstBus *bus = gst_pipeline_get_bus(GST_PIPELINE(gst_pipeline));
    msg = gst_bus_poll(bus, GST_MESSAGE_EOS, GST_CLOCK_TIME_NONE);
    gst_message_unref(msg);
    gst_object_unref(bus);

    gst_element_set_state((GstElement*)gst_pipeline, GST_STATE_NULL);
    gst_object_unref(GST_OBJECT(gst_pipeline));
    g_main_loop_unref(main_loop);

    fclose(fp);
    g_print("going to exit \n");
    return 0;
}

$ g++ -Wall -std=c++11  sample.cpp -o test $(pkg-config --cflags --libs gstreamer-app-1.0)
$ ./test

Hi.

Thank you for providing the sample code!
I tried that, but I still get the error “no element “nvv***””.

“msg = gst_bus_poll(bus, GST_MESSAGE_EOS, GST_CLOCK_TIME_NONE);”
Once this line is executed, it never returns.
And of course, “new_buffer” is never executed.

I modified the pipeline string as below.

Change before:

     launch_stream
     << "appsrc name=mysource ! "
     << "h264parse ! nvv4l2decoder ! "
     << "nvvidconv ! video/x-raw,format=NV12 ! "
     << "appsink name=mysink sync=0 ";

After change:

     launch_stream
     << "appsrc name=mysource ! "
     << "h264parse ! decodebin ! "
     << "videoconvert ! video/x-raw,format=NV12 ! "
     << "appsink name=mysink sync=0 ";

When I ran this, “new_buffer” was executed!

I’m now going to check if “new_buffer” works as expected.
(Sorry, I will close the thread after confirming that it works as expected)

thank you.

Hi,
We suggest running the sample on an AGX Orin developer kit to make sure it works, and then investigating why you cannot use the hardware decoder in your environment. It may be better to re-flash the system.

The default sample should work on all Jetson platforms on JetPack 4 and 5. We suggest checking why it does not work in your environment.