Take pictures with CSI camera on TX2

Hi,

I need help taking and saving pictures from 4K CSI cameras (Leopard IMX274M12) on the TX2. This Leopard camera board can stream six 4K cameras at 30 fps simultaneously on the TX2, but I only use two of them.

I have an external trigger on a GPIO pin that fires 14 times per second, and on every trigger I need to take a picture from both cameras and save it to the SSD as a JPG in 4K resolution.

For now I can only get ~7 pictures/second per camera using Python, OpenCV and GStreamer.
I have experimented with different GStreamer settings, and this is the best I have managed so far.
I run ./jetson_clocks.sh for maximum performance, and when I start my script all my cores are at 100%.
I also checked the SSD speed; it is good and can handle this write rate.
I tried taking pictures from multiple threads in Python. The speed improved to ~11 pictures/second, but very soon my RAM filled up and my program crashed. So now I need some new ideas from you on how to do this.

This is my code:

from sysfs.gpio import Controller, OUTPUT, INPUT, RISING
import cv2
import numpy as np
import threading

#variables
i = 0
compression = 70

#4K resolution for camera
width = 3840
height = 2160

#GPIO
camTrigger = False
Controller.available_pins = [397]
pin = Controller.alloc_pin(397, INPUT)

#camera 1 
gst_str0 = ("nvcamerasrc  sensor-id=1 fpsRange='15 15' ! video/x-raw(memory:NVMM), width=(int){}, height=(int){}, format=(string)I420, framerate=(fraction)30/1 ! nvvidconv ! video/x-raw, format=(string)BGRx ! videoconvert ! video/x-raw, format=(string)BGR ! appsink").format(width, height)
cap0 = cv2.VideoCapture(gst_str0, cv2.CAP_GSTREAMER)

#camera 2
gst_str1 = ("nvcamerasrc  sensor-id=2 fpsRange='15 15' ! video/x-raw(memory:NVMM), width=(int){}, height=(int){}, format=(string)I420, framerate=(fraction)30/1 ! nvvidconv ! video/x-raw, format=(string)BGRx ! videoconvert ! video/x-raw, format=(string)BGR ! appsink").format(width, height)
cap1 = cv2.VideoCapture(gst_str1, cv2.CAP_GSTREAMER)

#save a picture from both cameras every time a GPIO trigger occurs
def savePic():
   while True:
      _, displayBuf1 = cap0.read()
      _, displayBuf2 = cap1.read()

      if camTrigger:
         #distinct file names per camera, otherwise the second write overwrites the first
         cv2.imwrite("/media/nvidia/ssd/cam0_" + str(i) + ".jpg", displayBuf1, [int(cv2.IMWRITE_JPEG_QUALITY), compression])
         cv2.imwrite("/media/nvidia/ssd/cam1_" + str(i) + ".jpg", displayBuf2, [int(cv2.IMWRITE_JPEG_QUALITY), compression])

cam = threading.Thread(target = savePic)
cam.start()

#watch for GPIO triggers
while True:
   state = pin.read()
   if state == 1 and camTrigger == False:
      camTrigger = True
      i = i + 1
   elif state == 0:
      camTrigger = False

cap0.release()
cap1.release()
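One way to keep the multi-threaded version from filling RAM is a bounded queue between a capture thread and a pool of writer threads: when the queue is full, the newest frame is dropped instead of buffered. This is a minimal sketch of the pattern with small byte buffers standing in for the 4K frames; the queue size and worker count are assumptions to tune:

```python
import queue
import threading

MAX_PENDING = 8          # bound on buffered frames; keeps RAM usage flat
frames = queue.Queue(maxsize=MAX_PENDING)
dropped = 0

def writer():
    # consume (filename, image) pairs; None is the shutdown signal
    while True:
        item = frames.get()
        if item is None:
            break
        name, img = item
        # on the TX2 this would be:
        # cv2.imwrite(name, img, [int(cv2.IMWRITE_JPEG_QUALITY), 70])
        frames.task_done()

workers = [threading.Thread(target=writer) for _ in range(2)]
for w in workers:
    w.start()

# producer side: enqueue without blocking, drop when the queue is full
for i in range(100):
    img = b"\x00" * 64   # stand-in for a captured 4K frame
    try:
        frames.put_nowait(("cam0_%d.jpg" % i, img))
    except queue.Full:
        dropped += 1     # better to drop a frame than to run out of RAM

frames.join()
for _ in workers:
    frames.put(None)
for w in workers:
    w.join()
print("dropped frames:", dropped)
```

With real 4K buffers the memory ceiling is roughly MAX_PENDING frames instead of growing without bound, and dropped frames are counted so you can see whether the writers keep up.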

Thank you

Could you try the Argus sample yuvJpeg?

Hi ShaneCCC,

I tried to run the Argus sample yuvJpeg, but no luck; I got this error:

nvidia@tegra-ubuntu:~/argus/samples/yuvJpeg$ g++ main.cpp -o main
main.cpp:29:19: fatal error: Error.h: No such file or directory
compilation terminated.

Also I tried "argus_camera --device=0" in the terminal; the cameras are working fine and I can capture images, but I still don't know how to use this to solve my problem.
Is there any way to use Argus from Python to take pictures?
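One other idea I am considering, since as far as I know there are no official Python bindings for libargus: keep GStreamer but move the JPEG encoding onto the TX2's hardware encoder with nvjpegenc, writing files straight from the pipeline instead of through cv2.imwrite. This is an untested sketch; the exact caps, and whether nvvidconv is needed before nvjpegenc, depend on the L4T release:

```python
def hw_jpeg_cmd(sensor_id, out_dir):
    # Build a gst-launch command that captures 4K frames and JPEG-encodes
    # them on the hardware encoder (nvjpegenc), one file per frame.
    return ("gst-launch-1.0 nvcamerasrc sensor-id={sid} fpsRange='30 30' ! "
            "'video/x-raw(memory:NVMM), width=3840, height=2160, "
            "format=I420, framerate=30/1' ! "
            "nvvidconv ! nvjpegenc ! "
            "multifilesink location={out}/cam{sid}_%05d.jpg"
            .format(sid=sensor_id, out=out_dir))

# Hypothetical usage on the TX2, one process per camera, e.g. via
# subprocess.Popen(hw_jpeg_cmd(1, "/media/nvidia/ssd"), shell=True)
print(hw_jpeg_cmd(1, "/media/nvidia/ssd"))
```

Would a pipeline along these lines avoid the CPU-side videoconvert and JPEG encode entirely?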

Please check the README to build it.

Hi,
Yes, I read that file, but I got stuck on this part:

$ sudo dpkg -i cuda-repo-<distro>_<version>_amd64.deb
#NOTE: Only cuda-8.x supports cross-platform development for aarch64(arm64).

I didn't find CUDA 8 on your site; I only found CUDA 9.2, but it didn't work.

I think you should be able to skip CUDA for this sample.

Hi,
thank you for the help. I can now successfully run the yuvJpeg example, and in 10 seconds I got 128 pictures at 640x480 resolution (~13 fps).
But I need 14 fps at 4K (3840x2160). How can I do that with this example?
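For scale, here is rough arithmetic on what the target means for the storage path, assuming roughly 2 MB per 4K JPEG at quality ~70 (the per-frame size is an assumption; check the size of your actual files):

```python
fps_per_camera = 14
cameras = 2
mb_per_frame = 2.0      # assumed size of one 4K JPEG at quality ~70

write_rate = fps_per_camera * cameras * mb_per_frame
print("sustained write rate: %.0f MB/s" % write_rate)   # 56 MB/s
```

A modern SSD handles this comfortably, which again points at the encode path rather than the disk.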

Part of the output:

nvidia@tegra-ubuntu:~/argus/build/samples/yuvJpeg$ ./argus_yuvjpeg --device=2 --duration=10
Executing Argus Sample: argus_yuvjpeg
Argus Version: 0.96.2 (multi-process)
PRODUCER: Creating output stream
PRODUCER: Launching consumer thread
CONSUMER: Waiting until producer is connected...
PRODUCER: Starting repeat capture requests.
CONSUMER: Producer has connected; continuing.
CONSUMER: Acquired Frame: 1, time 1445775307
CONSUMER: 	Sensor Timestamp: 1445671050000, LUX: 365.309998
CONSUMER: 	IImage(2D): buffer 0 (640x480, 640 stride), 23 20 1e 1b 1b 19 18 18 17 17 17 17
CONSUMER: 	IImage(2D): buffer 1 (320x240, 640 stride), 72 8c 71 8c 74 8b 74 8a 73 88 74 8d
CONSUMER: 	Wrote JPEG: 1.JPG
CONSUMER: Acquired Frame: 5, time 1445838209
CONSUMER: 	Sensor Timestamp: 1445804283000, LUX: 363.971619
CONSUMER: 	IImage(2D): buffer 0 (640x480, 640 stride), 24 20 1e 1c 1b 18 19 18 18 18 18 18
CONSUMER: 	IImage(2D): buffer 1 (320x240, 640 stride), 73 8c 73 8a 74 8a 74 8a 72 8c 75 8c
CONSUMER: 	Wrote JPEG: 5.JPG
CONSUMER: Acquired Frame: 8, time 1445976534
CONSUMER: 	Sensor Timestamp: 1445937549000, LUX: 381.039001
CONSUMER: 	IImage(2D): buffer 0 (640x480, 640 stride), 2b 27 23 1f 1f 1d 1c 1c 1c 1c 1c 1c
CONSUMER: 	IImage(2D): buffer 1 (320x240, 640 stride), 6d 8a 6d 8d 71 8f 70 8c 72 8d 70 8b
...
...
CONSUMER: 	IImage(2D): buffer 0 (640x480, 640 stride), 2d 28 25 22 21 1e 1f 1b 1e 1c 1c 1b
CONSUMER: 	IImage(2D): buffer 1 (320x240, 640 stride), 82 83 7f 84 7e 84 79 85 7c 82 77 83
CONSUMER: 	Wrote JPEG: 275.JPG
CONSUMER: Done.
PRODUCER: Done -- exiting.

Hi MarkZer,
We may not be able to achieve 14 fps at 3840x2160. Please check if running jetson_clocks.sh helps.