TK1 decoding of h264 stream fails after first image.

I have an application that works correctly on an x86_64 desktop but fails to read more than one image on the TK1.
The x86_64 pipeline is as follows:
tcpclientsrc host=192.168.0.99 port=80 ! matroskademux ! h264parse ! avdec_h264 ! videoconvert ! appsink name=mysink

And the TK1 pipeline is as follows:
tcpclientsrc host=192.168.0.99 port=80 ! matroskademux ! h264parse ! omxh264dec ! nvvidconv ! appsink name=mysink
Since nvvidconv apparently cannot produce BGR output, I’ve also tried:
tcpclientsrc host=192.168.0.99 port=80 ! matroskademux ! h264parse ! omxh264dec ! videoconvert ! appsink name=mysink

On TK1 the first image is decoded, and then nothing more comes through.
The code subscribes to the “new-sample” signal from the appsink, and the callback is as follows:

    // Grab the most recently rendered sample from the appsink.
    GstSample* sample = gst_base_sink_get_last_sample(GST_BASE_SINK(_appsink));
    // g_signal_emit_by_name(_appsink, "pull-sample", &sample, NULL);
    if (sample)
    {
        GstBuffer* buffer;
        GstCaps* caps;
        GstStructure* s;
        GstMapInfo map;
        caps = gst_sample_get_caps(sample);
        if (!caps)
        {
            MO_LOG(debug) << "could not get sample caps";
            gst_sample_unref(sample);
            return GST_FLOW_OK;
        }
        s = gst_caps_get_structure(caps, 0);
        gint width, height;
        gboolean res;
        // Both width and height must be present in the caps.
        res = gst_structure_get_int(s, "width", &width);
        res &= gst_structure_get_int(s, "height", &height);
        // const gchar* format = gst_structure_get_string(s, "format");
        if (!res)
        {
            MO_LOG(debug) << "could not get snapshot dimension\n";
            gst_sample_unref(sample);
            return GST_FLOW_OK;
        }
        buffer = gst_sample_get_buffer(sample);
        if (gst_buffer_map(buffer, &map, GST_MAP_READ))
        {
            // Assumes tightly packed 3-byte BGR data from the upstream converter.
            cv::Mat mapped(height, width, CV_8UC3);
            memcpy(mapped.data, map.data, map.size);
            // push onto a queue to another thread for handling the data.
            m_data.enqueue({mapped, buffer->pts});
            /*image_param.updateData(mapped,
                                   mo::tag::_timestamp = mo::Time_t(buffer->pts * mo::ns),
                                   mo::tag::_context = mo::Context::getCurrent());*/
            sig_update();
            gst_buffer_unmap(buffer, &map);
        }
        // The buffer is owned by the sample, so only the sample is unreffed here.
        gst_sample_unref(sample);
    }
    return GST_FLOW_OK;
}
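
For reference, here is roughly what the pull-sample path (the commented-out line above) looks like in isolation; this is a sketch rather than the exact code in my application, and it needs <gst/app/gstappsink.h> plus linking against gstreamer-app-1.0:

    // Sketch: gst_app_sink_pull_sample() dequeues the sample from the appsink
    // instead of peeking at the last rendered one via the base sink.
    GstSample* sample = gst_app_sink_pull_sample(GST_APP_SINK(_appsink));
    if (sample)
    {
        GstBuffer* buffer = gst_sample_get_buffer(sample);
        GstMapInfo map;
        if (gst_buffer_map(buffer, &map, GST_MAP_READ))
        {
            // ... copy map.data as above ...
            gst_buffer_unmap(buffer, &map);
        }
        gst_sample_unref(sample); // the buffer is owned by the sample
    }
    return GST_FLOW_OK;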

Any suggestions to get this to work on my TK1?

Hi dtmoodie,
Please try

tcpclientsrc host=192.168.0.99 port=80 ! matroskademux ! h264parse ! omxh264dec ! nvvidconv ! 'video/x-raw,format=I420' ! videoconvert ! 'video/x-raw,format=BGR' ! appsink name=mysink
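
If you construct the pipeline in your application rather than with gst-launch, a roughly equivalent sketch is to set the BGR caps on the appsink itself (the launch string below is only an example, and it requires <gst/app/gstappsink.h> and gstreamer-app-1.0):

    // Sketch: build the pipeline and force BGR output at the appsink.
    GError* err = NULL;
    GstElement* pipeline = gst_parse_launch(
        "tcpclientsrc host=192.168.0.99 port=80 ! matroskademux ! h264parse ! "
        "omxh264dec ! nvvidconv ! video/x-raw,format=I420 ! videoconvert ! "
        "appsink name=mysink", &err);
    GstElement* appsink = gst_bin_get_by_name(GST_BIN(pipeline), "mysink");
    GstCaps* caps = gst_caps_from_string("video/x-raw,format=BGR");
    gst_app_sink_set_caps(GST_APP_SINK(appsink), caps);
    gst_caps_unref(caps);
    gst_object_unref(appsink);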

Thank you for looking at this. Trying the above pipeline yields the following:

Successfully created pipeline: tcpclientsrc host=192.168.0.99 port=80 ! matroskademux ! h264parse ! omxh264dec ! nvvidconv ! video/x-raw,format=I420 ! videoconvert ! video/x-raw,format=BGR ! appsink name=mysink
Inside NvxLiteH264DecoderLowLatencyInitNvxLiteH264DecoderLowLatencyInit set DPB and MjstreamingNvMMLiteOpen : Block : BlockType = 261 
TVMR: NvMMLiteTVMRDecBlockOpen: 5107: NvMMLiteBlockOpen 
NvMMLiteBlockCreate : Block : BlockType = 261 
TVMR: cbBeginSequence: 584: BeginSequence  1920x1088, bVPR = 0
TVMR: cbBeginSequence: 826: DecodeBuffers = 2 
TVMR: cbBeginSequence: 850: Display Resolution : (1920x1080) 
TVMR: cbBeginSequence: 851: Display Aspect Ratio : (1920x1080) 
TVMR: cbBeginSequence: 1015: SurfaceLayout = 3
TVMR: cbBeginSequence: 1045: NumOfSurfaces = 6, InteraceStream = 0, InterlaceEnabled = 0, bSecure = 0, MVC = 0 Semiplanar = 1, bReinit = 1 
Allocating new output: 1920x1088 (x 8), ThumbnailMode = 0

This also decodes the first frame and then nothing more.
I should mention:
I’m using GStreamer 1.0, not 0.10.
The following pipeline works with gst-launch-1.0:
tcpclientsrc host=192.168.0.99 port=80 ! matroskademux ! h264parse ! omxh264dec ! videoconvert ! nvhdmioverlaysink

The following pipeline works in my application, albeit slowly:
tcpclientsrc host=192.168.0.99 port=80 ! matroskademux ! h264parse ! avdec_h264 ! videoconvert ! video/x-raw,format=BGR ! appsink name=mysink

So I conclude that the problem is some interaction between omxh264dec and either the appsink or the way I interact with the pipeline. I am, however, stumped as to how to fix it.
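
To narrow this down further, my plan is to attach a pad probe to the decoder’s src pad and log whether buffers keep leaving omxh264dec after the first frame, or whether everything stops at the appsink. A rough sketch (it assumes the decoder is given a name in the launch string, e.g. omxh264dec name=dec, and that _pipeline is the element returned by gst_parse_launch):

    // Diagnostic sketch: count buffers flowing out of the decoder.
    static GstPadProbeReturn count_decoded_buffers(GstPad* pad, GstPadProbeInfo* info, gpointer user_data)
    {
        guint* counter = static_cast<guint*>(user_data);
        GstBuffer* buf = GST_PAD_PROBE_INFO_BUFFER(info);
        g_print("decoded buffer %u, pts=%" GST_TIME_FORMAT "\n",
                ++(*counter), GST_TIME_ARGS(GST_BUFFER_PTS(buf)));
        return GST_PAD_PROBE_OK;
    }

    // After creating the pipeline:
    static guint counter = 0;
    GstElement* dec = gst_bin_get_by_name(GST_BIN(_pipeline), "dec");
    GstPad* srcpad = gst_element_get_static_pad(dec, "src");
    gst_pad_add_probe(srcpad, GST_PAD_PROBE_TYPE_BUFFER, count_decoded_buffers, &counter, NULL);
    gst_object_unref(srcpad);
    gst_object_unref(dec);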

Hi dtmoodie,
The following pipeline should work with a file source:

filesrc location=a.mkv ! matroskademux ! h264parse ! omxh264dec ! nvvidconv ! 'video/x-raw,format=I420' ! videoconvert ! 'video/x-raw,format=BGR' ! appsink name=mysink

Can you save the tcpclientsrc output to a file and give it a try?

Thank you for your feedback.

I’ve saved the output via:
tcpclientsrc host=192.168.0.99 port=80 ! filesink location=out.mkv

And played the data back correctly on my x86 computer with:
filesrc location=out.mkv ! matroskademux ! h264parse ! avdec_h264 ! videoconvert ! appsink name=mysink

However, on my TK1 with the pipeline you suggested, I get the same behavior: only the first frame is decoded.

Hi dtmoodie,
Please share the mkv with us so that we can reproduce the issue.

Thank you for your prompt reply:
https://drive.google.com/file/d/1KinJeacoh5Db6xrJo62ja2-viedKogGh/view?usp=sharing
I have not been able to test whether this mkv is playable via gst-launch, since I can currently only ssh into my TK1; however, I would not be surprised if it worked correctly via that approach.

BTW, I’m at the GPU Technology Conference; would it be possible to talk to someone about this directly?

Hi dtmoodie,

I downloaded your out.mkv file, and it can’t be played in VLC.
Checking the video file shows:
Warning: Invalid or corrupted SegmentHeader master element

Please check your video file and attach it again. Thanks!

Hello,

I’ve uploaded a new file. https://drive.google.com/open?id=1D0HN5M1I71ijvK0ldJOdKR-kWHhp_5aL
I’ve renamed the extension to .bin to signify that this may not be a well-formed mkv file.
The file is simply a filesink dump of the data received from tcpclientsrc, and it can be played on an x86 machine with the following pipeline:

gst-launch-1.0 filesrc location=out.bin ! matroskademux ! h264parse ! avdec_h264 ! videoconvert ! autovideosink

It CANNOT be played with VLC directly.

The stream is created by a Raspberry Pi running the following pipeline:
gst-launch-1.0 -vvv -e v4l2src device=/dev/video0 ! video/x-h264,width=1920,height=1080,framerate=30/1 ! h264parse ! matroskamux streamable=true ! tcpserversink host=192.168.0.99 port=80

This stream format is used because it can be played back natively in Google Chrome via an HTML5 video tag.

Hi dtmoodie,
Please eliminate the muxer and the TCP transport and give us the raw h264 stream so that we can check why it cannot be played by omxh264dec.

gst-launch-1.0 v4l2src device=/dev/video0 num-buffers=300 ! video/x-h264,width=1920,height=1080,framerate=30/1 ! h264parse ! video/x-h264,stream-format=byte-stream ! filesink location=a.h264

As requested, I’ve uploaded a.h264, captured with the pipeline that you provided.

Hi dtmoodie,
I can decode the a.h264 you uploaded on TK1:

$ gst-launch-1.0 filesrc location=a.h264 ! h264parse ! omxh264dec ! nvhdmioverlaysink

In a.h264, only the first frame is an IDR frame and the rest are P frames. For streaming, I would suggest having an IDR frame every 30 or 60 frames; in case some frames are dropped, decoding can then continue from the next IDR frame.
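
If the camera driver on the Pi exposes the h264_i_frame_period V4L2 control (this is an assumption; please check with v4l2-ctl --list-ctrls), you may be able to request a shorter IDR interval from v4l2src through its extra-controls property, for example:

gst-launch-1.0 -vvv -e v4l2src device=/dev/video0 extra-controls="controls,h264_i_frame_period=30" ! video/x-h264,width=1920,height=1080,framerate=30/1 ! h264parse ! matroskamux streamable=true ! tcpserversink host=192.168.0.99 port=80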

Hi DaneLLL,

Thanks for looking into this. I’ll see about changing my IDR settings, but I have no control over the h264 encoding, so unless GStreamer can do some magic without decoding the stream, I doubt I can change that, since that part of the pipeline runs on a Raspberry Pi. I could potentially look into moving my TK1 and removing the Raspberry Pi from the system.

Unfortunately, I can also decode the stream outside of my application with the following pipeline on my TK1 (I mentioned this in my second post):

gst-launch-1.0 tcpclientsrc host=192.168.0.99 port=80 ! matroskademux ! h264parse ! omxh264dec ! videoconvert ! nvhdmioverlaysink

To me this indicates that there is some strange interaction in my application when I use omxh264dec.