Nvarguscamerasrc stops sending data during operation

Hi

We are using nvarguscamerasrc in our gstreamer pipeline. The pipeline itself is very simple:
nvarguscamerasrc sensor-id=<DCL_SENSOR_ID> sensor-mode=0 gainrange="1 16" ispdigitalgainrange="1 1"
! video/x-raw(memory:NVMM), width=(int)<DCL_CAPTURE_WIDTH>, height=(int)<DCL_CAPTURE_HEIGHT>, format=(string)NV12, framerate=(fraction)<DCL_CAPTURE_FRAMERATE>/1
! nvvidconv ! textoverlay name=text_overlay ! video/x-raw,format=I420
! nvvidconv ! nvv4l2vp8enc bitrate=<DCL_RECORD_VIDEO_BITRATE> control-rate=1 ! rtpvp8pay mtu=1400 ! udpsink auto-multicast=true clients=<DCL_UDP_SINK_CLIENTS>

We are only adding a dynamic text overlay to the video and sending it through UDP as a VP8 RTP stream. We start the pipeline from a simple C++ application, and during operation we only update the text of the overlay every half second. We run two of these services on the Jetson Nano.
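For reference, the overlay update is done roughly like the sketch below. This is a simplified stand-in, not our exact service code: videotestsrc replaces nvarguscamerasrc, and the text content is just a placeholder counter.

// Minimal sketch of the overlay-update loop (assumptions: the overlay element is
// named "text_overlay", and the real text comes from application state instead of
// the counter used here).
#include <gst/gst.h>
#include <glib.h>
#include <string>

static gboolean update_overlay_text(gpointer user_data)
{
  GstElement *overlay = GST_ELEMENT(user_data);

  // Placeholder text source; the real service fills this with its own data.
  static int counter = 0;
  std::string text = "update #" + std::to_string(counter++);

  // textoverlay exposes a "text" property that can be changed while playing.
  g_object_set(overlay, "text", text.c_str(), NULL);
  return G_SOURCE_CONTINUE;  // keep the 500 ms timeout running
}

int main(int argc, char *argv[])
{
  gst_init(&argc, &argv);

  // Simplified pipeline; the real one uses nvarguscamerasrc and the encoder chain above.
  GstElement *pipeline = gst_parse_launch(
      "videotestsrc ! textoverlay name=text_overlay ! autovideosink", NULL);
  GstElement *overlay = gst_bin_get_by_name(GST_BIN(pipeline), "text_overlay");

  gst_element_set_state(pipeline, GST_STATE_PLAYING);
  g_timeout_add(500, update_overlay_text, overlay);  // update text every half second

  GMainLoop *loop = g_main_loop_new(NULL, FALSE);
  g_main_loop_run(loop);
  return 0;
}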

After a few hours the receiving end no longer gets any video stream, and if we check the sender, the application's CPU usage has dropped to almost nothing (during regular operation each of them uses ~50% of a CPU core). This only happens with one of the services, but it is pretty random which one fails, sensor 1 or sensor 2.

We checked the usual places like syslog, dmesg, and the output of nvargus-daemon, but there is no error indication.

We are monitoring resource usage, but nothing major stands out: memory usage is very low and stays constant, and the most significant consumers are the applications themselves with their ~50% CPU usage and roughly 1.5% memory usage.

Do you have any idea where we should look to continue the investigation (which log files we should check, or the output of some command)? Do you bundle any specific monitoring for such cases?

Also, is there a way to increase the log level of nvargus-daemon to see more details about what is going on?

Thank you!

Bests,
Peter

hello pettair,

could you please further narrow down the use-case: is the camera stream functional locally?
for example,
please preview the camera stream locally as follows,
$ gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM),width=1920, height=1080, framerate=30/1, format=NV12' ! nvoverlaysink -ev

you may open a terminal and run nvargus-daemon in the foreground to show more logs.
for example,

$ sudo pkill nvargus-daemon
$ sudo nvargus-daemon

Hi Jerry,

Yes, the camera is functional; we are able to receive the real-time view on the other end. When the issue happens I think our application still holds the sensor, but I tried to execute (this time the issue happened with the sensor at index 1):

gst-launch-1.0 nvarguscamerasrc sensor-id=1 ! 'video/x-raw(memory:NVMM),width=1920, height=1080, framerate=30/1, format=NV12' ! nvvidconv ! omxh264enc ! filesink name=file_sink location=test.mp4

Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Framerate set to : 30 at NvxVideoEncoderSetParameterNvMMLiteOpen : Block : BlockType = 4 
===== NVMEDIA: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 4 
H264: Profile = 66, Level = 40 
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute:568 Failed to create CaptureSession
Got EOS from element "pipeline0".
Execution ended after 0:00:00.028551220
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

In the output of:
journalctl -u nvargus-daemon

I see:
nvargus-daemon[14671]: === gst-launch-1.0[15223]: Connection established (7EEB7FE1D0)=== gst-launch-1.0[15223]: CameraProvider initialized (0x7f0c003d60)(Argus) Error AlreadyAllocated: Device 0 (of 1) is in use (in src/api/CameraProviderImpl.cpp, function createCaptureSession(), line 236)

After I stop our application I can start this pipeline and it creates the output file just fine.

Also, the stream on the other interface (sensor index 0) keeps working fine the whole time.

I will try to run nvargus-daemon in the foreground as you suggested to check if it reports anything more.

Bests,
Peter

Hi Jerry,

I tried to execute our test case after running the following:

$ sudo pkill nvargus-daemon
$ sudo nvargus-daemon

But there was no error indication in the output, so we are still trying to verify whether the issue is in our application code or whether we stop receiving data from the camera.

Is there any command or GStreamer callback that we could use to verify that we still receive frames from nvarguscamerasrc?

We are already using g_signal_connect to print out events arriving at the pipeline; is there anything similar we could use to print nvarguscamerasrc stats from within the application?
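One idea we had is to attach a buffer probe on the source pad of nvarguscamerasrc and count the buffers ourselves. A minimal sketch of that idea is below; it assumes the source element is given a name like camera_src in the launch string, so maybe there is a more standard way:

// Sketch only: count buffers leaving nvarguscamerasrc so a watchdog can detect
// when frames stop arriving. Assumes "nvarguscamerasrc name=camera_src ..." in
// the pipeline description.
#include <gst/gst.h>

static guint64 frame_count = 0;

static GstPadProbeReturn count_buffers(GstPad *pad, GstPadProbeInfo *info,
                                        gpointer user_data)
{
  // Called for every buffer pushed out of the source; a separate watchdog
  // could periodically check that frame_count keeps increasing.
  frame_count++;
  return GST_PAD_PROBE_OK;
}

static void attach_probe(GstElement *pipeline)
{
  GstElement *src = gst_bin_get_by_name(GST_BIN(pipeline), "camera_src");
  GstPad *pad = gst_element_get_static_pad(src, "src");
  gst_pad_add_probe(pad, GST_PAD_PROBE_TYPE_BUFFER, count_buffers, NULL, NULL);
  gst_object_unref(pad);
  gst_object_unref(src);
}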

Thanks!

Bests,
Peter

hello pettair,

may I also know which JetPack / L4T release version you’re working with?
you should check the release tag for the details, i.e. $ cat /etc/nv_tegra_release
thanks

Hi Jerry,

We are on 4.4:

$ cat /etc/nv_tegra_release
# R32 (release), REVISION: 4.4, GCID: 23942405, BOARD: t210ref, EABI: aarch64, DATE: Fri Oct 16 19:44:43 UTC 2020

Bests,
Peter

hello pettair,

you may set the silent option to false to generate verbose output.
it’ll show the frame number and timestamp for each captured frame if you enable verbose output.
for example,

$ gst-launch-1.0 nvarguscamerasrc silent=false sensor-id=0 ! 'video/x-raw(memory:NVMM),width=1920, height=1080, framerate=30/1, format=NV12' ! nvvidconv ! 'video/x-raw(memory:NVMM),format=I420' ! fakesink
...

CONSUMER: Acquired Frame: 70, time 603600870341000
CONSUMER: Acquired Frame: 71, time 603600903654000
CONSUMER: Acquired Frame: 72, time 603600937041000
CONSUMER: Acquired Frame: 73, time 603600971120000
CONSUMER: Acquired Frame: 74, time 603601003998000
CONSUMER: Acquired Frame: 75, time 603601037110000

by default it prints a line for each captured frame.
you may revise the sources if you would like to adjust the output.
for example,
gst-nvarguscamera/gstnvarguscamerasrc.cpp

      if (!src->silent)
      {
        /* Make the printed time relative to a baseline; ground_clk is a
           reference timestamp taken earlier in this revised example. */
        guint64 frame_timestamp = iFrame->getTime() - ground_clk;
        guint64 millisec_timestamp = ((frame_timestamp % (1000000000)))/1000000;
        /* Print the frame number plus the relative time in seconds and milliseconds. */
        CONSUMER_PRINT("Acquired Frame: %llu, time sec %llu msec %llu\n",
                   static_cast<unsigned long long>(iFrame->getNumber()),
                   static_cast<unsigned long long>(frame_timestamp / (1000000000)),
                   static_cast<unsigned long long>(millisec_timestamp));
      }

Hi Jerry,

Thanks! With this we were able to verify that nvarguscamerasrc is still receiving frames, so the issue must be somewhere within our own pipeline, most likely around the textoverlay element.

Bests,
Peter