I’m using IMX274 camera modules from Leopard Imaging, along with their driver patches for JetPack 3.1, on the TX1.
With the patches applied, the kernel rebuilt, and the TX1 updated, nvgstcapture works when switching between sensors, and I can open two instances of the argus_camera demo app, one for each sensor.
But when I run the argus_multiSensor demo app, I see the following in /var/log/syslog:
Sep 7 02:07:46 tegra-ubuntu kernel: [ 684.290421] tegra_mipi_cal 700e3000.mipical: DSIB_MIPI_CAL_CONFIG_2 0x68 0x00000000
Sep 7 02:07:46 tegra-ubuntu kernel: [ 684.298806] tegra_mipi_cal 700e3000.mipical: DSIC_MIPI_CAL_CONFIG_2 0x70 0x00000000
Sep 7 02:07:46 tegra-ubuntu kernel: [ 684.307186] tegra_mipi_cal 700e3000.mipical: DSID_MIPI_CAL_CONFIG_2 0x74 0x00000000
Sep 7 02:07:46 tegra-ubuntu argus_daemon: SCF: Error Timeout: (propagating from src/services/capture/CaptureServiceEvent.cpp, function wait(), line 59)
Sep 7 02:07:46 tegra-ubuntu argus_daemon: Error: Camera HwEvents wait, this may indicate a hardware timeout occured,abort current/incoming cc
Sep 7 02:07:46 tegra-ubuntu argus_daemon: launchCC abort cc 104 session 2
Sep 7 02:07:46 tegra-ubuntu argus_daemon: SCF: Error Timeout: (propagating from src/api/Session.cpp, function capture(), line 830)
Sep 7 02:07:46 tegra-ubuntu argus_daemon: (Argus) Error Timeout: Failed to submit first capture request (propagating from src/api/CaptureSessionImpl.cpp, function submitCaptureRequests(), line 311)
Sep 7 02:07:46 tegra-ubuntu argus_daemon: (Argus) Error Timeout: (propagating from src/api/CaptureSessionImpl.cpp, function threadFunction(), line 777)
Sep 7 02:07:46 tegra-ubuntu argus_daemon: PowerServiceCore:handleRequests: timePassed = 678
Sep 7 02:07:47 tegra-ubuntu kernel: [ 684.567252] nvmap_alloc_handle: PID 2875: argus_multisens: WARNING: All NvMap Allocations must have a tag to identify the subsystem allocating memory.Please pass the tag to the API call NvRmMemHanldeAllocAttr() or relevant.
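For anyone trying to reproduce this, the relevant entries can be pulled out of the syslog with a grep along these lines (the log path and message prefixes are assumed from the output above; adjust if your system logs elsewhere):

```shell
# Filter the argus_daemon and MIPI calibration messages out of the syslog,
# keeping only the most recent ones.
grep -E 'argus_daemon|tegra_mipi_cal' /var/log/syslog | tail -n 40
```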
The demo app is supposed to display video from one sensor on screen and write the other sensor's stream to disk. The on-screen preview goes gray, and the program never exits. Only two JPEGs are written (Argus_0000.jpg and Argus_0001.jpg), and both are black.
In our custom application, we based things on the GstVideoEncode and ArgusMultiSensor samples to stream multiple camera modules simultaneously into our software for later processing and compression.
We see similar timeouts when we run with 2 cameras; with 1 camera our software runs as it did on earlier JetPacks (2.4). On previous JetPacks, 2 cameras worked without issue.
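As a basic sanity check before Argus is involved (assuming the sensors enumerate under the standard V4L2 /dev/videoN naming), both modules should show up as video devices:

```shell
# Count the V4L2 capture nodes; with two IMX274 modules attached there
# should be at least two entries (e.g. /dev/video0 and /dev/video1).
ls /dev/video* 2>/dev/null | wc -l
```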
Are there changes required in how multiple simultaneous streams are handled with JP3.1? Is there a driver issue? It doesn't seem driver-related, since two argus_camera apps running simultaneously show that video can be pushed through the system, but because it's a timeout I'm not totally sure.