Our setup is as follows:
-) Jetson TX2 Development Kit
-) D3 Engineering Jetson SERDES Sensor Interface Card (https://store.d3engineering.com/product/designcore-nvidia-jetson-serdes-card/)
-) 4x D3RCM-IMX390-953 Rugged Camera Module
The Sensor Interface Card provides a DS90UB960 4:1 MIPI hub. D3 currently supports only one camera on the hub; however, we would like to make full use of it.
Therefore, we modified the device tree, adding the vc-id property for virtual-channel support, and can now capture from all cameras simultaneously using the following command:
v4l2-ctl --set-fmt-video=width=1936,height=1100,pixelformat=RG12 --stream-mmap -d /dev/videoX --set-ctrl bypass_mode=0
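To illustrate the kind of change involved, here is a hypothetical, heavily simplified device tree fragment. The node names, unit addresses, and structure are examples only, not our actual sources; the one grounded detail is the vc-id mode property, which gives each sensor its own MIPI virtual channel behind the DS90UB960:

```dts
/* Hypothetical, simplified sketch -- real node names, unit addresses and
 * the full set of mode properties come from the D3/NVIDIA device tree
 * sources. Each sensor's mode node is given a unique virtual channel so
 * all four IMX390 streams can share one CSI port behind the DS90UB960. */
imx390_a@1b {
    mode0 {
        vc-id = "0";   /* camera 0 -> virtual channel 0 */
        /* ... other mode properties unchanged ... */
    };
};
imx390_b@1c {
    mode0 {
        vc-id = "1";   /* camera 1 -> virtual channel 1 */
    };
};
/* cameras 2 and 3 follow the same pattern with vc-id = "2" / "3" */
```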
Now we want to display the captured streams, and this is where our problem starts:
We cannot use a GStreamer pipeline with v4l2src because it only supports 8-bit Bayer formats, so we tried libargus instead.
The command
gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! nvvidconv ! xvimagesink
correctly displays /dev/video0; however, the command
gst-launch-1.0 nvarguscamerasrc sensor-id=1 ! nvvidconv ! xvimagesink
opens the device but does not display the stream.
Does libargus support this kind of configuration?
Does this behavior indicate an error in our device tree?
Is this a problem caused by using virtual channels?
I’m an embedded software engineer with D3 and I’m quite familiar with your hardware and software configuration.
It’s nice to hear that you’ve modified the source to support virtual channels. We’re planning to officially support virtual channels on both the serdes card you have and our Xavier 16-channel serdes card in our next BSP release. If you are at liberty to share your modifications, I’d love to see them, as they may help us bring official support more quickly.
Which D3 BSP did you base your work on? We very recently released 2.0.0, which supports the latest JetPack 4.2.2. If possible, I recommend switching to 2.0.0. I can provide you with the binaries and source code; the source code is also readily available on GitHub.
I believe the problem you have is related to a timing issue with nvargus-daemon. What we have seen is that the first pipeline starts successfully, but when the second pipeline starts, the first stops displaying frames. We have raised this issue with Nvidia and are working to find the root cause. Fortunately, we discovered a workaround: enabling the infinite camera timeout mode of nvargus-daemon allows multiple streams to run successfully.
An easy way to permanently enable infinite camera timeout is to execute:
sudo sed -i '/^\[Service\]$/ a Environment="enableCamInfiniteTimeout=1"' /etc/systemd/system/nvargus-daemon.service
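If you prefer to see what that sed invocation does before running it with sudo, here is the same edit exercised on a scratch copy of the unit file. The [Service] contents below are placeholders; the real file is /etc/systemd/system/nvargus-daemon.service:

```shell
# Work on a scratch copy; the real unit file lives at
# /etc/systemd/system/nvargus-daemon.service.
cat > nvargus-daemon.service <<'EOF'
[Unit]
Description=Argus daemon (placeholder contents)

[Service]
ExecStart=/usr/sbin/nvargus-daemon
EOF

# Same edit as the command above: insert the Environment line
# directly after the line that reads exactly "[Service]".
sed -i '/^\[Service\]$/ a Environment="enableCamInfiniteTimeout=1"' \
    nvargus-daemon.service

cat nvargus-daemon.service
```

After editing the real unit file, run sudo systemctl daemon-reload and then restart nvargus-daemon (or reboot) so the new environment takes effect.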