I have an application that works correctly on an x86_64 desktop but fails to read more than one image on a TK1.
The x86_64 pipeline is as follows:
tcpclientsrc host=192.168.0.99 port=80 ! matroskademux ! h264parse ! avdec_h264 ! videoconvert ! appsink name=mysink
And the TK1 pipeline is as follows:
tcpclientsrc host=192.168.0.99 port=80 ! matroskademux ! h264parse ! omxh264dec ! nvvidconv ! appsink name=mysink
Since nvvidconv apparently cannot produce BGR output, I’ve also tried:
tcpclientsrc host=192.168.0.99 port=80 ! matroskademux ! h264parse ! omxh264dec ! videoconvert ! appsink name=mysink
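A variant along these lines might also be worth testing (untested; it assumes nvvidconv can emit I420 into system memory, which videoconvert can then convert to BGR):
tcpclientsrc host=192.168.0.99 port=80 ! matroskademux ! h264parse ! omxh264dec ! nvvidconv ! video/x-raw,format=I420 ! videoconvert ! video/x-raw,format=BGR ! appsink name=mysink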
On the TK1, the first image is decoded and then nothing more comes through.
The code subscribes to the appsink’s “new-sample” signal; the callback (signature reconstructed below for completeness) is as follows:
// Called from the "new-sample" callback; signature reconstructed for context.
// m_data, sig_update and MO_LOG are members/macros of the application class.
GstFlowReturn handleNewSample()
{
    GstSample* sample = gst_base_sink_get_last_sample(GST_BASE_SINK(_appsink));
    // g_signal_emit_by_name(_appsink, "pull-sample", &sample, NULL);
    if (sample)
    {
        GstCaps* caps = gst_sample_get_caps(sample);
        if (!caps)
        {
            MO_LOG(debug) << "could not get sample caps";
            gst_sample_unref(sample); // avoid leaking the sample on early return
            return GST_FLOW_OK;
        }
        GstStructure* s = gst_caps_get_structure(caps, 0);
        gint width, height;
        gboolean res = gst_structure_get_int(s, "width", &width);
        res &= gst_structure_get_int(s, "height", &height); // both must succeed
        // const gchar* format = gst_structure_get_string(s, "format");
        if (!res)
        {
            MO_LOG(debug) << "could not get snapshot dimensions";
            gst_sample_unref(sample);
            return GST_FLOW_OK;
        }
        GstBuffer* buffer = gst_sample_get_buffer(sample);
        GstMapInfo map;
        if (gst_buffer_map(buffer, &map, GST_MAP_READ))
        {
            // assumes the negotiated format is 3 bytes per pixel (BGR)
            cv::Mat mapped(height, width, CV_8UC3);
            memcpy(mapped.data, map.data, map.size);
            // push onto a queue to another thread for handling the data
            m_data.enqueue({mapped, buffer->pts});
            /*image_param.updateData(mapped,
                mo::tag::_timestamp = mo::Time_t(buffer->pts * mo::ns),
                mo::tag::_context = mo::Context::getCurrent());*/
            sig_update();
            gst_buffer_unmap(buffer, &map); // only unmap after a successful map
        }
        // the buffer is owned by the sample, so only the sample is unreffed
        gst_sample_unref(sample);
    }
    return GST_FLOW_OK;
}
This also decodes the first frame and then nothing more.
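For comparison, the usual pattern in a “new-sample” handler is to pull the sample out of the appsink’s queue rather than peek at the last rendered one. A minimal sketch, assuming a plain C callback (the function name is mine):

#include <gst/app/gstappsink.h>

// Pulling dequeues the sample from the appsink, releasing upstream buffers;
// gst_base_sink_get_last_sample() only peeks and never drains the queue.
static GstFlowReturn pull_sample_cb(GstAppSink* sink, gpointer /*user_data*/)
{
    GstSample* sample = gst_app_sink_pull_sample(sink);
    if (!sample)
        return GST_FLOW_OK;
    GstBuffer* buffer = gst_sample_get_buffer(sample);
    GstMapInfo map;
    if (buffer && gst_buffer_map(buffer, &map, GST_MAP_READ))
    {
        // ... copy map.data (map.size bytes) out to application memory ...
        gst_buffer_unmap(buffer, &map);
    }
    gst_sample_unref(sample); // also releases the buffer the sample owns
    return GST_FLOW_OK;
}

I don’t know whether the peek-versus-pull difference matters for omxh264dec, but it is one behavioral difference between my code and textbook appsink usage.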
I should mention:
I’m using GStreamer 1.0, not 0.10.
The following pipeline works with gst-launch-1.0:
tcpclientsrc host=192.168.0.99 port=80 ! matroskademux ! h264parse ! omxh264dec ! videoconvert ! nvhdmioverlaysink
The following pipeline works in my application albeit slowly:
tcpclientsrc host=192.168.0.99 port=80 ! matroskademux ! h264parse ! avdec_h264 ! videoconvert ! video/x-raw,format=BGR ! appsink name=mysink
So I conclude that the problem is some interaction between omxh264dec and either the appsink or the way I interact with the pipeline. I am, however, stumped as to how to fix it.
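For reference, my interaction with the pipeline amounts to the usual parse-launch / emit-signals / new-sample wiring. A reconstruction, not my verbatim code (function name is mine, error handling omitted; it connects the sketch callback from above):

#include <gst/gst.h>
#include <gst/app/gstappsink.h>

// Build the TK1 pipeline and hook up the appsink.
// Assumes gst_init() has already been called.
static GstElement* build_pipeline()
{
    GError* err = NULL;
    GstElement* pipeline = gst_parse_launch(
        "tcpclientsrc host=192.168.0.99 port=80 ! matroskademux ! h264parse ! "
        "omxh264dec ! nvvidconv ! appsink name=mysink",
        &err);
    GstElement* appsink = gst_bin_get_by_name(GST_BIN(pipeline), "mysink");
    g_object_set(appsink, "emit-signals", TRUE, NULL); // enable "new-sample"
    g_signal_connect(appsink, "new-sample", G_CALLBACK(pull_sample_cb), NULL);
    gst_element_set_state(pipeline, GST_STATE_PLAYING);
    return pipeline;
}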
And the data played back correctly on my x86 computer with:
filesrc location=out.mkv ! matroskademux ! h264parse ! avdec_h264 ! videoconvert ! appsink name=mysink
However, on my TK1 with the pipeline you suggested, I get the same failure: only the first frame is decoded.
I’ve uploaded a new file. https://drive.google.com/open?id=1D0HN5M1I71ijvK0ldJOdKR-kWHhp_5aL
I’ve renamed the extension to .bin to signify that this may not be a well-formed mkv file.
The file is simply a filesink dump of the data received from tcpclientsrc, and it can be played on an x86 machine with the filesrc pipeline above.
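For reference, the dump itself was produced with a pipeline along these lines (reconstructed from memory):
tcpclientsrc host=192.168.0.99 port=80 ! filesink location=out.mkv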
The stream is created by a Raspberry Pi running the following pipeline:
gst-launch-1.0 -vvv -e \
  v4l2src device=/dev/video0 ! \
  video/x-h264,width=1920,height=1080,framerate=30/1 ! \
  h264parse ! \
  matroskamux streamable=true ! \
  tcpserversink host=192.168.0.99 port=80
This stream format is used because it can be played back natively in Google Chrome via an HTML5 video tag.
In a.h264, only the first frame is an IDR frame and the rest are P-frames. For streaming, I would suggest having an IDR frame every 30 or 60 frames. In case some frames are dropped, decoding can continue from the next IDR frame.
Thanks for looking into this. I’ll see about changing my IDR settings, but I have no control over the H.264 encoding, so unless GStreamer can do some magic without decoding the stream, I doubt I can change that: that part of the pipeline runs on a Raspberry Pi. I could potentially look into moving my TK1 and removing the Raspberry Pi from the system.
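One possibility I have not tested: if the Pi’s camera driver exposes the usual V4L2 controls, the keyframe interval might be settable from the command line without touching the pipeline itself. This assumes the bcm2835 driver’s h264_i_frame_period control is available on my device:

# untested; assumes the bcm2835 V4L2 driver exposes this control
v4l2-ctl -d /dev/video0 --set-ctrl=h264_i_frame_period=30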
Unfortunately, I can also decode the stream outside of my application on my TK1, with the pipeline I mentioned in my second post:
tcpclientsrc host=192.168.0.99 port=80 ! matroskademux ! h264parse ! omxh264dec ! videoconvert ! nvhdmioverlaysink