• Hardware Platform (Jetson / GPU): Jetson Nano
• DeepStream Version: 6.0.1
• JetPack Version (valid for Jetson only): 4.6.1
• TensorRT Version: default
• Issue Type (questions, new requirements, bugs): questions
Thanks for reading.
I want to feed OpenCV Mat frames into a DeepStream pipeline frame by frame.
To do this, I chose to build a GStreamer pipeline in C++ by referring to the deepstream-appsrc-test sample. Did I choose the wrong sample?
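Roughly, my plan is to pack each BGR frame into the planar I420 layout before pushing it into appsrc. Here is a sketch of that packing step (Python with numpy just for illustration; the bgr_to_i420 helper name and the BT.601 full-range coefficients are my own, not from the sample):

```python
import numpy as np

def bgr_to_i420(frame: np.ndarray) -> bytes:
    """Pack an HxWx3 uint8 BGR frame into planar I420 (Y plane, then U, then V)."""
    h, w, _ = frame.shape
    b = frame[:, :, 0].astype(np.float32)
    g = frame[:, :, 1].astype(np.float32)
    r = frame[:, :, 2].astype(np.float32)
    # BT.601 full-range RGB -> YUV
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = -0.169 * r - 0.331 * g + 0.5 * b + 128.0
    v = 0.5 * r - 0.419 * g - 0.081 * b + 128.0
    # Chroma is subsampled 2x2: average each 2x2 block
    u_sub = u.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    v_sub = v.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    planes = (y, u_sub, v_sub)
    return b"".join(np.clip(p, 0, 255).astype(np.uint8).tobytes() for p in planes)

# One 1280x720 frame -> 1280 * 720 * 3 / 2 = 1,382,400 bytes
frame = np.zeros((720, 1280, 3), dtype=np.uint8)
data = bgr_to_i420(frame)
print(len(data))  # 1382400
```

The resulting bytes are what I intend to wrap in a GstBuffer and push into the appsrc element.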
The sample compiles successfully, but a segmentation fault occurs when I run deepstream-appsrc-test, as shown in the picture below.
I suspect the segmentation fault is caused by passing the wrong parameters to deepstream-appsrc-test.
Can anyone point out the correct parameters and the correct way to run this example? Which GStreamer tutorials should I refer to if I want to keep working on passing OpenCV Mat frames to a DeepStream pipeline?
Thank you very much for your patience; I look forward to your reply.
Thanks for your reply.
I have tried the command ./deepstream-appsrc-test 1.yuv 1280 720 1 I420.
However, the video output window shows corrupted content, and the process soon crashes with the error shown in the figure below.
So I assume the first parameter, 1.yuv, is the custom input source stream.
I have also tried various combinations of sample_720p.jpg and sample_720p.mp4 as input.
But after a mosaic video playback window flashed, the process exited.
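To rule out my input file, I also generated a synthetic raw I420 clip myself (a numpy sketch; the gradient pattern and the test_i420.yuv filename are arbitrary choices of mine):

```python
import numpy as np

W, H, FRAMES = 1280, 720, 30

total = 0  # bytes written, for a quick size check
with open("test_i420.yuv", "wb") as f:
    for i in range(FRAMES):
        # Moving horizontal gradient in the luma plane
        y = ((np.arange(W, dtype=np.uint16) + 8 * i) % 256).astype(np.uint8)
        y_plane = np.broadcast_to(y, (H, W))
        # Neutral chroma (128) -> grayscale output
        uv = np.full((H // 2, W // 2), 128, dtype=np.uint8)
        total += f.write(y_plane.tobytes())
        total += f.write(uv.tobytes())  # U plane
        total += f.write(uv.tobytes())  # V plane

print(total)  # 30 frames * 1,382,400 bytes = 41,472,000
```

I then ran it as ./deepstream-appsrc-test test_i420.yuv 1280 720 30 I420 (assuming the fourth argument is the fps, as in the example command).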
1. Here is the full command description: you need to pass the name of a raw I420/NV12/RGBA file, then the raw file's width, height, fps, and format. Note that encoded files such as sample_720p.jpg or sample_720p.mp4 will not work here; the app reads raw video frames.
./deepstream-appsrc-test -h
Usage: ./deepstream-appsrc-test <raw filename> <width> <height> <fps> <format(I420, NV12, RGBA)>
For example, you can start with: ./deepstream-appsrc-test 1.yuv 1280 720 1 I420 (attachment: 1.yuv, 1.3 MB)
2. Please refer to the README for more details, path: /opt/nvidia/deepstream/deepstream/sources/apps/sample_apps/deepstream-appsrc-test/
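If the raw file's size does not match the width/height/format you pass, the pipeline reads garbage and can crash, which matches the symptoms above. A quick sanity check you can run on any candidate file (Python sketch; frames_in is a hypothetical helper name, not part of the sample):

```python
def frames_in(size_bytes: int, width: int, height: int, fmt: str = "I420") -> int:
    """Return the frame count if the file size matches the format, else raise."""
    bytes_per_frame = {
        "I420": width * height * 3 // 2,  # 8-bit luma + 2x2-subsampled chroma
        "NV12": width * height * 3 // 2,  # same size, interleaved chroma
        "RGBA": width * height * 4,       # 4 bytes per pixel
    }[fmt]
    frames, rem = divmod(size_bytes, bytes_per_frame)
    if rem:
        raise ValueError(
            f"{size_bytes} bytes is not a multiple of one "
            f"{width}x{height} {fmt} frame ({bytes_per_frame} bytes)")
    return frames

# One 1280x720 I420 frame is 1,382,400 bytes, which matches the
# attached 1.yuv's reported 1.3 MB, i.e. a single raw frame:
print(frames_in(1382400, 1280, 720, "I420"))  # 1
```

Pass the file's actual size (e.g. from os.path.getsize) and the same width/height/format you give on the command line; a ValueError means the parameters and the file disagree.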