How to multi-process on the TX2

Hi all,

I’d like to multi-process on the TX2.

  1. Capture and render an overlay image.

  2. Save images to disk (SD card / MMC).

  3. Stream the transferred images to a server.

I want to know how to run the above three jobs simultaneously using GStreamer on the TX2.

Is it possible?

Could you let me know your idea?

Thanks & BR,

This multi-processing is needed in order to send pictures from the drone to a server in real time.

If such a solution exists, please let me know your idea.

Thanks & BR,

Hi nwlee, please see this thread about GStreamer tee / nvtee element, which allows you to split data to multiple destinations
(i.e. simultaneous filesink and RTP/RTSP streaming):

https://devtalk.nvidia.com/default/topic/915916/jetson-tx1/simultaneous-camera-capture-and-file-saving-/
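As a rough illustration of the tee idea from that thread: one capture can be split into a file branch and an RTP/UDP branch. The sketch below only assembles and prints the command line (the Jetson-specific elements only run on the TX2 itself), and the file name, host address, and port are placeholders, not values from the linked thread:

```shell
# Sketch: one capture split by tee into a file branch and an RTP/UDP branch.
# nvcamerasrc and omxh264enc are Jetson-specific, so this script just builds
# and prints the command line instead of executing it.
PIPELINE='nvcamerasrc ! tee name=t \
  t. ! queue ! omxh264enc ! h264parse ! matroskamux ! filesink location=capture.mkv \
  t. ! queue ! omxh264enc ! h264parse ! rtph264pay ! udpsink host=192.168.1.241 port=5000'
echo "gst-launch-1.0 -e $PIPELINE"
```

Each branch gets its own queue so that one slow sink does not stall the other branch.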

Hi dusty,

I looked at the linked contents, but they seem to be based on nvcamerasrc.

I want to know the gst command based on v4l2src, because my MIPI driver is based on v4l2src.

What is the difference between nvcamerasrc and v4l2src? I don't know the difference exactly.

Anyway, the gst commands that I used are below.

===================================================================================================

<EO/IR devices capturing and rendering>

gst-launch-1.0 v4l2src device=/dev/video0 ! 'video/x-raw, width=1920, height=1080, framerate=60/1, format=UYVY' ! nvvidconv ! 'video/x-raw(memory:NVMM), width=1920, height=1080, framerate=60/1, format=I420' ! nvoverlaysink sync=false

gst-launch-1.0 v4l2src device=/dev/video1 ! 'video/x-raw, width=1920, height=1080, framerate=60/1, format=UYVY' ! nvvidconv ! 'video/x-raw(memory:NVMM), width=1920, height=1080, framerate=60/1, format=I420' ! nvoverlaysink sync=false

gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw,width=1920,height=1080,format=UYVY ! nvvidconv ! 'video/x-raw(memory:NVMM),format=NV12' ! nvoverlaysink overlay-w=1920 overlay-h=1080 overlay=1 sync=false & gst-launch-1.0 v4l2src device=/dev/video1 ! video/x-raw,width=640,height=480,format=UYVY ! nvvidconv ! 'video/x-raw(memory:NVMM),format=NV12' ! nvoverlaysink overlay-w=640 overlay-h=480 overlay=2 sync=false

mkdir sdcard

sudo mount /dev/mmcblk1p1 /home/nvidia/sdcard

gst-launch-1.0 v4l2src device=/dev/video0 ! 'video/x-raw, width=1920, height=1080, framerate=60/1, format=UYVY' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=I420' ! queue ! omxh264enc bitrate=8000000 ! h264parse ! matroskamux ! filesink location=/home/nvidia/sdcard/EO_test_1207.mp4 -e

gst-launch-1.0 v4l2src device=/dev/video1 ! 'video/x-raw, width=1920, height=1080, framerate=30/1, format=UYVY' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=I420' ! queue ! omxh264enc bitrate=8000000 ! h264parse ! matroskamux ! filesink location=/home/nvidia/sdcard/IR_test_1207.mp4 -e

sudo umount /home/nvidia/sdcard

  • tx2 part -

gst-launch-1.0 v4l2src device=/dev/video0 ! 'video/x-raw, width=1920, height=1080, framerate=60/1, format=UYVY' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=I420' ! queue ! omxh264enc bitrate=20000000 ! 'video/x-h264, stream-format=(string)byte-stream' ! h264parse ! mpegtsmux ! rtpmp2tpay ! udpsink port=5000 async=false sync=false host=192.168.1.241

gst-launch-1.0 v4l2src device=/dev/video1 ! 'video/x-raw, width=1920, height=1080, framerate=60/1, format=UYVY' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=I420' ! queue ! omxh264enc bitrate=20000000 ! 'video/x-h264, stream-format=(string)byte-stream' ! h264parse ! mpegtsmux ! rtpmp2tpay ! udpsink port=5000 async=false sync=false host=192.168.1.241

  • server part -

gst-launch-1.0 udpsrc port=5000 caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)MP2T-ES" ! rtpbin ! rtpmp2tdepay ! tsdemux ! h264parse ! avdec_h264 ! videoconvert ! xvimagesink sync=false -vvv -e

===================================================================================================

Therefore, I'd like to know how to do all of this in one all-in-one pipeline, instead of running each of the above commands simultaneously.

Thank you & Best Regards,

Hi nwlee,

nvcamerasrc is for CSI-based Bayer sensors, and v4l2src is for USB or YUV sensors. The source element does not change what dusty_nv suggested: using "nvtee" can help split data to multiple destinations.

Just replace the source to what you need.
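For example, swapping the source in the tee pipeline might look like the sketch below. The device path and caps are copied from the poster's earlier commands as assumptions, and the command is only assembled and printed here, not verified on a TX2:

```shell
# Sketch: the tee pipeline with nvcamerasrc replaced by v4l2src plus its caps.
# Device path and caps come from the poster's earlier commands; the real
# command must be run on the TX2, so here we only build and print it.
PIPELINE='v4l2src device=/dev/video0 ! video/x-raw,width=1920,height=1080,format=UYVY \
  ! nvvidconv ! "video/x-raw(memory:NVMM),format=NV12" ! tee name=t \
  t. ! queue ! omxh264enc ! filesink location=a.h264 \
  t. ! queue ! nvtee ! nvoverlaysink'
echo "gst-launch-1.0 -e $PIPELINE"
```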

Hi,

Thank you for your support.

But I want to see an example for v4l2src, not nvcamerasrc, because I am confused.

Also, I'd like to know how to capture and render the camera image while transferring it to a server with RTP or RTSP (that is, encoding and decoding), as well as capturing and saving simultaneously.

So, could you let me know about the above?

Thanks & BR,

Hi nwlee,

For rtsp pipeline, please refer to
https://devtalk.nvidia.com/default/topic/1018689/jetson-tx2/vlc-playing-gstreamer-flow/post/5187270/#5187270

As for nvtee, just change following pipeline from nvcamerasrc to v4l2src.

This pipeline has two output sinks: one is a file named a.h264, and the other is nvoverlaysink, which renders to the display.

gst-launch-1.0 nvcamerasrc num-buffers=150 ! tee name=t t. ! queue ! omxh264enc ! filesink location=a.h264 t. ! queue ! nvtee ! nvoverlaysink

I have progressed working with post #4

Please refer to post #4 above.

I have used the v4l2src format, so I don't know the nvcamerasrc format.

Also, the receiving side on the server is gstreamer, not VLC.

Therefore, would you let me know an all-in-one command for v4l2src, with gstreamer receiving on the server?

Please give me a command or source for simultaneous streaming directly.

Thanks & BR,

Hi nwlee,

Sorry that I misunderstood your problem. Below is what I know about your request, please confirm.

gst-launch-1.0 v4l2src device=/dev/video0 ! 'video/x-raw, width=1920, height=1080, framerate=60/1, format=UYVY' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=I420' ! queue ! omxh264enc bitrate=20000000 ! 'video/x-h264, stream-format=(string)byte-stream' ! h264parse ! mpegtsmux ! rtpmp2tpay ! udpsink port=5000 async=false sync=false host=192.168.1.241

gst-launch-1.0 udpsrc port=5000 caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)MP2T-ES" ! rtpbin ! rtpmp2tdepay ! tsdemux ! h264parse ! avdec_h264 ! videoconvert ! xvimagesink sync=false -vvv -e

If I understand correctly, you can use nvtee for the following use case and merge them into one pipeline.

<nvoverlaysink>
<save to disk>
<tx2 as a server>
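Those three sinks could, in principle, hang off a single tee, roughly as sketched below. The device path, caps, bitrates, and host address are copied from earlier posts and are assumptions (the file gets a .mkv extension here because matroskamux writes Matroska); the command is only assembled and printed, not verified:

```shell
# Sketch: one v4l2src capture with three tee branches:
# display (nvoverlaysink), disk (filesink), and UDP streaming (udpsink).
# Jetson elements require the TX2, so we only build and print the command.
PIPELINE='v4l2src device=/dev/video0 ! video/x-raw,width=1920,height=1080,format=UYVY \
  ! nvvidconv ! "video/x-raw(memory:NVMM),format=NV12" ! tee name=t \
  t. ! queue ! nvtee ! nvoverlaysink sync=false \
  t. ! queue ! omxh264enc bitrate=8000000 ! h264parse ! matroskamux \
     ! filesink location=/home/nvidia/sdcard/EO_image.mkv \
  t. ! queue ! omxh264enc bitrate=20000000 ! h264parse ! mpegtsmux ! rtpmp2tpay \
     ! udpsink host=192.168.1.241 port=5000 sync=false'
echo "gst-launch-1.0 -e $PIPELINE"
```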

Hi wayne,

Yes, That’s right!

I want to know how to combine nvoverlaysink + save-to-disk + TX2-as-a-server with the nvtee command in one pipeline.

Is it possible?

Please let me know your idea.

Thanks & BR,

Hi nwlee,

In fact, you can refer to the previous pipeline as a sample and try to modify.
(Sorry for the typo; it should be "tee" instead of "nvtee".)

gst-launch-1.0 nvcamerasrc num-buffers=150 ! tee name=t t. ! queue ! omxh264enc ! filesink location=a.h264 t. ! queue ! nvtee ! nvoverlaysink

As you can see, we add a tee named "t", so this pipeline splits in two: one branch goes to the filesink, and the other (after "t.") goes to the nvoverlaysink. Please try changing "nvcamerasrc" to "v4l2src" and adding the relevant v4l2src parameters.

Hi Wayne,

I've succeeded in combining both the filesink and the overlaysink with the command below.

gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw,width=1920,height=1080,format=UYVY ! nvvidconv ! 'video/x-raw(memory:NVMM),format=NV12' ! tee name=t t. ! queue ! omxh264enc bitrate=8000000 ! h264parse ! matroskamux ! filesink location=/home/nvidia/sdcard/EO_image.mp4 t. ! queue ! nvtee ! nvoverlaysink overlay-w=1920 overlay-h=1080 overlay=1 sync=false & gst-launch-1.0 v4l2src device=/dev/video1 ! video/x-raw,width=640,height=480,format=UYVY ! nvvidconv ! 'video/x-raw(memory:NVMM),format=NV12' ! tee name=t t. ! queue ! omxh264enc bitrate=8000000 ! h264parse ! matroskamux ! filesink location=/home/nvidia/sdcard/IR_image_test.mp4 t. ! queue ! nvtee ! nvoverlaysink overlay-w=640 overlay-h=480 overlay=2 sync=false

Thank you for your support.

But the overlaid images are not saved in the same mp4 file.

Can the overlaid images be saved together in the same file?

Thanks & BR,

Hi nwlee,

What do you mean by "overlaid image is not saved in the same mp4 file"? What is your expectation?

My expectation is to stream the overlaid movies together into one mp4 file.

I just want to know whether that is possible or not.

Currently, the EO and IR movies are each saved to their own mp4 file.

And, I have one more question.

I'd like to add a udpsink branch to the nvtee pipeline.

Would you check my command below?

====================================================================================================
gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw,width=1920,height=1080,format=UYVY ! nvvidconv ! 'video/x-raw(memory:NVMM),format=NV12' ! tee name=t t. ! queue ! omxh264enc bitrate=8000000 ! h264parse ! matroskamux ! filesink location=/home/nvidia/sdcard/EO_image.mp4 t. ! queue ! nvtee ! nvoverlaysink overlay-w=1920 overlay-h=1080 overlay=1 sync=false t. ! queue ! nvtee ! omxh264enc bitrate=20000000 ! 'video/x-h264, stream-format=(string)byte-stream' ! h264parse ! mpegtsmux ! rtpmp2tpay ! udpsink port=5000 async=false sync=false host=192.168.1.237 & gst-launch-1.0 v4l2src device=/dev/video1 ! video/x-raw,width=640,height=480,format=UYVY ! nvvidconv ! 'video/x-raw(memory:NVMM),format=NV12' ! tee name=t t. ! queue ! omxh264enc bitrate=8000000 ! h264parse ! matroskamux ! filesink location=/home/nvidia/sdcard/IR_image.mp4 t. ! queue ! nvtee ! nvoverlaysink overlay-w=640 overlay-h=480 overlay=2 sync=false t. ! queue ! nvtee ! omxh264enc bitrate=20000000 ! 'video/x-h264, stream-format=(string)byte-stream' ! h264parse ! mpegtsmux ! rtpmp2tpay ! udpsink port=5000 async=false sync=false host=192.168.1.237

I think the src element is fine, but the sink element generates the error below.

====================================================================================================
ERROR: from element /GstPipeline:pipeline0/GstUDPSrc:udpsrc0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2865): gst_base_src_loop (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0:
streaming task paused, reason not-linked (-1)
EOS on shutdown enabled -- waiting for EOS after Error
Waiting for EOS...
ERROR: from element /GstPipeline:pipeline0/GstH264Parse:h264parse0: No valid frames found before end of stream
Additional debug info:
gstbaseparse.c(1153): gst_base_parse_sink_event_default (): /GstPipeline:pipeline0/GstH264Parse:h264parse0

Please let me know your idea.
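For reference, the "not-linked" error above is reported by the receiver's udpsrc, and one guess is that rtpbin's request pads do not auto-link under gst-launch; a receiver variant without rtpbin is one thing to try. Another guess: the streaming branch on the TX2 inserts nvtee before omxh264enc, which the earlier working UDP command did not have, so removing it there is also worth trying. Below is only a sketch (caps string copied from the earlier post, assembled and printed, not verified):

```shell
# Sketch: receiver variant without rtpbin; rtpmp2tdepay is linked directly
# after udpsrc. Caps string copied from the earlier post; this script only
# builds and prints the command line for inspection.
RECV='udpsrc port=5000 caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)MP2T-ES" \
  ! rtpmp2tdepay ! tsdemux ! h264parse ! avdec_h264 ! videoconvert ! xvimagesink sync=false'
echo "gst-launch-1.0 -e $RECV"
```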

Thanks & Best Regards,

Hi Wayne,

Could you please help with the contents of post #15?

Thanks & BR,

Continue in https://devtalk.nvidia.com/default/topic/1027500/