We are developing a camera system based on the NVIDIA Jetson Xavier NX computing platform. The system has three imagers connected to the MIPI interface of the Jetson. The sensor used on the image boards is a Sony IMX334 for which the driver has been adapted to work with the Jetson Xavier NX.
The intended configuration is 4K (3840x2160) at 25 fps for each imager. The image data is accessed from a C++ application using the NVIDIA Argus library.
JetPack version used: 4.4
The image acquisition generally works well, but there are sporadic degradations of the image quality, as can be seen in the attached image (crop: left good, right bad).
The images were recorded with the same setup, only on different days. The right image clearly shows pixel-level artefacts: straight lines are no longer straight but have small steps, text is less readable, and the overall image quality appears significantly degraded.
The same effect can be observed with different systems and settings, and it can also disappear again after some time. So far it has not been possible to reliably reproduce this behavior; it occurs at random.
Since the problem occurs not only in our application but also in images recorded by other means (e.g. GStreamer), and since it appears to be independent of the application configuration, it can be assumed that it does not originate from the application.
Additionally, capturing the raw Bayer pattern instead of debayered color images and performing the debayering in software always results in images of good quality.
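For reference, the kind of software debayering used for this cross-check can be sketched roughly as follows. This is a minimal bilinear demosaic in Python/NumPy assuming an RGGB pixel order; the actual implementation and the sensor's real pixel phase may differ.

```python
import numpy as np

def debayer_bilinear(raw):
    """Minimal bilinear demosaic of an RGGB Bayer frame (H x W) to RGB.

    Illustrative sketch only -- not the production debayering code.
    """
    h, w = raw.shape
    planes = np.zeros((3, h, w), dtype=np.float32)
    masks = np.zeros((3, h, w), dtype=np.float32)
    # RGGB: R at even/even, G at the two mixed parities, B at odd/odd.
    offsets = {0: [(0, 0)], 1: [(0, 1), (1, 0)], 2: [(1, 1)]}
    for c, offs in offsets.items():
        for oy, ox in offs:
            planes[c, oy::2, ox::2] = raw[oy::2, ox::2]
            masks[c, oy::2, ox::2] = 1.0

    def fill(plane, mask):
        # Average the valid samples in each 3x3 neighborhood (bilinear
        # interpolation over the mosaic), keeping the original samples.
        pp = np.pad(plane, 1, mode="reflect")
        pm = np.pad(mask, 1, mode="reflect")
        num = np.zeros_like(plane)
        den = np.zeros_like(plane)
        for dy in range(3):
            for dx in range(3):
                num += pp[dy:dy + h, dx:dx + w]
                den += pm[dy:dy + h, dx:dx + w]
        filled = num / np.maximum(den, 1.0)
        return np.where(mask > 0, plane, filled)

    return np.stack([fill(planes[c], masks[c]) for c in range(3)], axis=-1)
```

Because this path bypasses the ISP entirely, artefact-free results from it point at the hardware debayering/processing stage rather than the sensor data itself.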
Combining these two observations, we conclude that the degraded image quality originates in the image processing of the ISP integrated into the Jetson Xavier NX. Based on our current understanding, the faulty ISP output might be caused by an incorrect configuration (e.g. device tree, clock multiplier, line length, pixel clock, ...), an overload of ISP resources (e.g. operating it out of spec), or a malfunction of the ISP itself.
Three imagers at 4K and 25 fps should not generate more data than the ISP can process; there are examples of setups with more cameras and higher frame rates. The device tree configuration might influence the behavior of the ISP, but so far no change in image quality has been observed when varying these settings.
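For context, the timing-related settings mentioned above live in the sensor mode nodes of the device tree. A hypothetical fragment (property names follow the Tegra camera driver conventions; the values are placeholders, not our actual configuration) might look like:

```dts
mode0 { /* 3840x2160 @ 25 fps */
    mclk_khz = "24000";
    num_lanes = "4";
    tegra_sinterface = "serial_a";

    active_w = "3840";
    active_h = "2160";
    mode_type = "bayer";
    pixel_phase = "rggb";
    csi_pixel_bit_depth = "12";

    /* Must be consistent with the sensor's register settings;
       mismatches here are a known source of capture problems. */
    line_length = "4400";
    pix_clk_hz = "297000000";

    min_framerate = "25";
    max_framerate = "25";
};
```

Part of our question is whether inconsistencies between such mode properties and the actual sensor timing could cause the ISP to miscompute in the way described.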
- What could cause such image artefacts?
- How is the ISP affected by the device tree settings?
- How is the ISP affected by settings in the ISP configuration file?
- Are there other factors that could influence ISP performance?
Many thanks in advance for your support.