I’m developing a camera on the Jetson Nano platform, using GStreamer for the video pipelines. The pipelines look like this:
- camera pipeline: nvarguscamerasrc → interpipesink name=camsrc
- liveview pipeline: interpipesrc listen-to=camsrc → nvvidconv scale down → nvv4l2vp8enc → rtpvp8pay → udpsink
- 1st profile pipeline: interpipesrc listen-to=camsrc → nvv4l2h264enc → rtph264pay ----> rtsp stream mount /video1
- 2nd profile pipeline: interpipesrc listen-to=camsrc → nvvidconv scale → nvv4l2h264enc → rtph264pay ----> rtsp stream mount /video2
- and so on
Things work fine if the camera and liveview pipelines are set to the PLAYING state at startup and never stopped. But this causes high power consumption when the camera is idle (no client connected).
My question is: what is a good way to stop the pipelines (including stopping the image sensor and ISP readout in the nvarguscamerasrc element) when the camera is idle, and to bring them back up when an RTSP client connects?
Thank you for reading & sharing.
In general we launch an RTSP server through test-launch. It is a simple sample based on the RTSP media factory.
For launching a single pipeline you can run something like:
nvarguscamerasrc ! nvv4l2h264enc ! h264parse ! rtph264pay name=pay0 pt=96
We have reference steps in the Jetson Nano FAQ.
Your use case is complicated. One possible solution is to register a signal handler so you are notified when clients connect, and then launch the camera pipeline from the callback. The available signals are listed in the RTSP media factory documentation.
Thank you @DaneLLL .
It looks like my problem is mostly about nvvidconv with the RGBA format, not the interpipe connections. I created the test pipelines below:
livepipe = GST_PIPELINE (gst_parse_launch("interpipesrc listen-to=camsrc is-live=true allow-renegotiation=true stream-sync=restart-ts ! queue max-size-buffers=3 leaky=downstream"
" ! nvvidconv ! video/x-raw(memory:NVMM),format=I420,width=1280,height=720 "
" ! videorate ! video/x-raw(memory:NVMM),framerate=10/1 "
" ! nvv4l2vp8enc bitrate=1000000 ! rtpvp8pay pt=96 ! udpsink host=127.0.0.1 port=5004", &error));
campipe = GST_PIPELINE (gst_parse_launch("nvarguscamerasrc ! video/x-raw(memory:NVMM),width=1932,height=1090,format=NV12,framerate=30/1 "
" ! nvvidconv left=6 right=1926 top=6 bottom=1086 ! video/x-raw(memory:NVMM),width=1920,height=1080,format=RGBA "
" ! nvvidconv ! video/x-raw(memory:NVMM),format=NV12 "
" ! queue leaky=downstream max-size-buffers=3 "
" ! interpipesink name=camsrc caps=video/x-raw(memory:NVMM),format=NV12 sync=true async=false", &error));
When both pipelines are in the PLAYING state, I set livepipe’s state to NULL, wait 10 seconds, and then set it back to PLAYING; at that point a segmentation fault occurs. From gdb:
[Thread 0x7f7dc061e0 (LWP 7194) exited]
[Thread 0x7f7f7fe1e0 (LWP 7191) exited]
[Thread 0x7f7e4071e0 (LWP 7193) exited]
[Thread 0x7f7ec081e0 (LWP 7192) exited]
Liveview state 1
[Thread 0x7f9affd1e0 (LWP 7171) exited]
[Thread 0x7f9b7fe1e0 (LWP 7170) exited]
Failed to query video capabilities: Inappropriate ioctl for device
Opening in BLOCKING MODE
Opening in BLOCKING MODE
[New Thread 0x7f9affd1e0 (LWP 7198)]
NvMMLiteOpen : Block : BlockType = 7
===== NVMEDIA: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 7
[New Thread 0x7f9b7fe1e0 (LWP 7199)]
[New Thread 0x7f7e4071e0 (LWP 7200)]
[Thread 0x7f7e4071e0 (LWP 7200) exited]
[Thread 0x7fad1c81e0 (LWP 7125) exited]
[Thread 0x7f98d831e0 (LWP 7186) exited]
Thread 19 "consumer_thread" received signal SIGSEGV, Segmentation fault.
[Switching to Thread 0x7f9b7fe1e0 (LWP 7199)]
0x0000007fb7ef023c in gst_mini_object_set_qdata () from /usr/lib/libgstreamer-1.0.so.0
#0 0x0000007fb7ef023c in gst_mini_object_set_qdata () from /usr/lib/libgstreamer-1.0.so.0
#1 0x0000007fb42be928 in ?? () from /usr/lib/gstreamer-1.0/libgstnvarguscamerasrc.so
#2 0x0000007fb7d4c014 in ?? () from /usr/lib/libglib-2.0.so.0
#3 0x0000007fb7cab394 in start_thread (arg=0x7fad9c8506) at pthread_create.c:477
#4 0x0000007fb7c06ddc in thread_start () at ../sysdeps/unix/sysv/linux/aarch64/clone.S:78
But if I only use the NV12 format in my campipe, things are fine: I can stop livepipe and campipe and then set them back to PLAYING without issues.
I need the RGBA format at that position to use our own plugins for drawing an OSD; for now the OSD APIs only support RGBA.
Any ideas? Thank you.
PS: I’m using JetPack 4.5.2
It seems the segmentation fault happens in libgstnvarguscamerasrc.so. Since the nvarguscamerasrc plugin is open source, you can add debug prints, rebuild the lib, and replace the prebuilt one to get further information.
Please download the source code from:
L4T Driver Package (BSP) Sources
Thank you for your reply.
I actually found the cause: DMA buffers created by nvv4l2h264enc remain allocated when the pipeline goes to an idle state, but they are freed when the pipeline goes to the NULL state. So I updated my code to set all pipelines to the NULL state when idle. That solved my problem.