Green Screen Live Output from Leopard IMX477 Camera + TX2 Developer Kit

Hi,

I recently bought 4 Leopard Imaging IMX477 cameras, a 6-port Multi Camera Adapter, and a TX2 developer kit. LI sent two .txt files with instructions for installing the necessary camera drivers. I followed every step correctly and expected to get live video output from the IMX477 camera, but I ended up with a green screen. According to Leopard Imaging’s guides, there are several ways to get live video output from the camera:

1) Using nvgstcapture-1.0, I should have gotten the live output. The only important thing here is to make sure there is a camera on the J1 port of the multi camera adapter, which I verified. Running the following command in the terminal should have been enough to get live video output:

nvgstcapture-1.0

But somehow, all I can see is a green screen.

Messages on the terminal:

vid_rend: syncpoint wait timeout
vid_rend: syncpoint wait timeout
vid_rend: syncpoint wait timeout
vid_rend: syncpoint wait timeout
Socket read error. Camera Daemon stopped functioning.....
** Message: <main:5374> Capture completed
** Message: <main:5424> Camera application will now exit

2) Using GStreamer is another way to capture a live image. According to LI’s guides, running the following command should have worked:

gst-launch-1.0 nvcamerasrc fpsRange="20.0 20.0" sensor-id=0 ! 'video/x-raw(memory:NVMM), width=(int)4056, height=(int)3040, format=(string)I420, framerate=(fraction)20/1' ! nvtee ! nvvidconv flip-method=2 ! 'video/x-raw, format=(string)I420' ! xvimagesink -e

But again, the output is nothing but a green screen.

Messages on the terminal:

Received error from camera daemon....exiting....
Socket read error. Camera Daemon stopped functioning.....
Got EOS from element "pipeline0"
Execution ended after 0:00:16.224172704
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

3) Using Argus is another way to do this. After installing the Argus software (which I did), I should have been able to get live video output by running the following command in the terminal:

argus_camera --device=0

This should have given me the output of the camera on the J1 port. The Argus application opens correctly, but no image is displayed when I press the ‘Capture’ button.

Messages on the terminal:

Executing Argus Sample Application (argus_camera)
Argus Version: 0.96.2 (multi-process)
(Argus) Error EndOfFile: Unexpected error in reading socket (in src/rpc/socket/client/ClientSocketManager.cpp, function recvThreadCore(), line 212)
(Argus) Error EndOfFile: Receiving thread terminated with error (in src/rpc/socket/client/ClientSocketManager.cpp, function recvThreadWrapper(), line 315)
(Argus) Error InvalidState: Receive thread is not running cannot send. (in src/rpc/socket/client/ClientSocketManager.cpp, function send(), line 94)
(Argus) Error InvalidState: (propagating from src/rpc/socket/client/SocketClientDispatch.cpp, function dispatch(), line 101)
Segmentation fault (core dumped)

4) Using VideoCapture in OpenCV is also a way to capture images. According to Leopard Imaging’s guides, the following code should have worked:

VideoCapture cap("nvcamerasrc ! video/x-raw(memory:NVMM), width=(int)1280, height=(int)720,format=(string)I420, framerate=(fraction)24/1 ! nvvidconv flip-method=2 ! video/x-raw, format=(string)BGRx ! videoconvert ! 'video/x-raw, format=(string)BGR' ! appsink");

The code compiles, but I get the following error when I run it:

VIDEOIO ERROR: V4L: device nvcamerasrc ! video/x-raw(memory:NVMM), width=(int)1280, height=(int)720,format=(string)I420, framerate=(fraction)24/1 ! nvvidconv flip-method=2 ! video/x-raw, format=(string)BGRx ! videoconvert ! 'video/x-raw, format=(string)BGR' ! appsink: Unable to query the number of channels

I looked it up on the internet and found a suggestion that helped with this particular error: passing CAP_FFMPEG as the second argument of the VideoCapture constructor. I then tried to display the image:

VideoCapture cap("nvcamerasrc ! video/x-raw(memory:NVMM), width=(int)1280, height=(int)720,format=(string)I420, framerate=(fraction)24/1 ! nvvidconv flip-method=2 ! video/x-raw, format=(string)BGRx ! videoconvert ! 'video/x-raw, format=(string)BGR' ! appsink", CAP_FFMPEG);
namedWindow("Test", CV_WINDOW_KEEPRATIO);
bool loop = true;
Mat image;
while(loop){
  bool readImage = cap.read(image);
  imshow("Test", image);
  if(waitKey(10) == 27){
    loop = false;
  }
}
destroyAllWindows();

But I get the following error when I run it (probably because the captured frame is empty, no frame is captured at all, or the colorspace is somehow different):

OpenCV Error: Assertion failed (size.width>0 && size.height>0) in imshow, file /home/nvidia/src/opencv-3.4.0/modules/highgui/src/window.cpp, line 331 terminate called after throwing an instance of 'cv::Exception'
   what(): /home/nvidia/src/opencv-3.4.0/modules/highgui/src/window.cpp:331: error: (-215) size.width>0 && size.height>0 in function imshow

The program has unexpectedly finished

I searched the internet for a solution. Some people said the camera produces raw data and that I somehow need to convert the image to RGB. I tried several colorspace transformations with the cvtColor function using different arguments, but none of them helped in OpenCV either.

I should also mention that the WiFi module of the TX2 Developer Kit stopped working after I installed the camera drivers. I am running the following command to list all installed network drivers:

sudo lshw -C network

Normally I should see a Wireless Interface too, but it no longer appears after the camera driver installation; I can only see the Ethernet interface.

Has anyone faced the same or similar problems and managed to solve them, or can anyone give me some advice?

Hi eray.varyeter,

I believe the driver guide you are using is for the LI-JTX1-MIPI-ADPT (3-camera) adapter board.
We haven’t released the driver for the LI-JTX1-MIPI-ADPT-6CAM (6-camera) adapter board yet. http://leopardimaging.com/driver-support/

Actually, we have a preliminary version of the IMX477 TX2 driver for the LI-JTX1-MIPI-ADPT-6CAM adapter board. It only supports 1080p. Please give it a try.

Hi,

Thanks for your reply. Even though we ordered the 6-port adapter, a 3-port adapter was also shipped to us by mistake, so we have a 3-port adapter on hand as well. I’ll check whether I can get a live image from the cameras using the 3-port adapter, then try the new 6-port driver guide you published to see if it works with the 6-port adapter, and I’ll let you know the results.

When will the driver guide for the 6-port adapter be released officially? We don’t want to have to flash the board again and again, and we are in a bit of a hurry with our project right now.

Hi eray.varyeter,

We may release the driver for 6 cameras in a few weeks or sooner. Sorry, there is no ETA yet.
Do you mean you received the wrong parts? If so, please contact our Sales team (sales@leopardimaging.com) for assistance.

Hi,

I tried the 3-port adapter, and it is working fine now. I can get live images via GStreamer, Argus, and OpenCV as well. Now I am trying to figure out how to select a camera in OpenCV. The command below captures directly from the camera on the J1 port:

VideoCapture cap("nvcamerasrc ! video/x-raw(memory:NVMM), width=(int)1280, height=(int)720,format=(string)I420, framerate=(fraction)24/1 ! nvvidconv flip-method=2 ! video/x-raw, format=(string)BGRx ! videoconvert ! 'video/x-raw, format=(string)BGR' ! appsink");

How can I select the camera I want to capture images from?

edit: I found the way to do it. Adding sensor-id=1 after nvcamerasrc selects the second camera.

My other question is: how can we be informed when you release the official driver for the 6-port adapter? Our goal is to use 4 cameras with the 6-port adapter in the end.

Hi eray.varyeter,

Please let us know your Email address. We will put it in our list and give you an update once we get the driver ready.