Degraded image quality issue on Xavier NX: ISP debayer/demosaic/Bayer-RGB conversion


We are developing a camera system based on the NVIDIA Jetson Xavier NX computing platform. The system has three imagers connected to the MIPI interface of the Jetson. The sensor used on the image boards is a Sony IMX334 for which the driver has been adapted to work with the Jetson Xavier NX.

The intended configuration is 4K (3840x2160) with 25 fps for each imager. The image data is accessed from a C++ application using the NVIDIA Argus library.

Jetpack version used: 4.4

Problem Description

The image acquisition works well in general, but there are sporadic degradations of the image quality, as can be seen in the attached image (crop: left good, right bad).

The images have been recorded with the same setup, but on different days. It can be clearly seen that the right image has artefacts at the pixel level. As a result, straight lines are no longer straight but show small steps, text is less readable, and the overall image quality appears significantly degraded.

The same effect can be observed with different systems and settings, and the degradation can also disappear again after some time. So far, it has not been possible to reproduce this behavior reliably; it occurs only randomly.

Further Deductions

As the problem occurs not only in our application but also in images recorded by other means (e.g. GStreamer), and it seems to be independent of the application configuration, it can be assumed that it does not originate from the application.

Additionally, using the raw Bayer pattern as input instead of debayered color images, and performing the debayering in software, always results in images of good quality.

Combining these two observations, it can be concluded that the degraded image quality originates from the image processing of the ISP integrated into the Jetson Xavier NX. Based on our current understanding, the incorrect computation in the ISP might be caused by a misconfiguration (e.g. device tree, clock multiplier, line length, pixel clock, …), an overflow of ISP resources (e.g. using it out of spec), or a malfunction of the ISP.

The use of three imagers at 4K and 25 fps should not generate more data than the ISP can process; there are examples with considerably more cameras and higher frame rates. The device tree configuration might influence the behavior of the ISP, but so far no change in image quality could be observed when changing those settings.


  • What can be the cause for such image artefacts?
  • How is the ISP affected by the device tree settings?
  • How is the ISP affected by settings in the ISP configuration file?
  • Are there other factors which could influence the ISP performance?

Many thanks for the support in advance.

Try disabling sharpening from the argus_camera UI. Otherwise, you may need to contact a camera partner for the tuning process.

As stated above:
We tried it with all available settings. We also tried to use/modify ISP files to fix this issue.
It does not depend on the settings; it happens sporadically even without changing any setting.
The only thing that helps is to disable the ISP Bayer-to-RGB conversion and do it in software.
However, performance-wise this is prohibitive.

We built our own camera and do not work with a partner. The camera is intended to ship in high volumes soon.

Since this problem is this severe, I would assume that it is known at NVIDIA. Could you investigate further?

What is the badge info in your device tree?

		modules {
			module0 {
				badge = "e3326_front_P5V27C";
				position = "rear";
Here is this part from the device tree:

modules {
	cam_module0: module0 {
		badge = "dc_top_camera";
		position = "top";
		orientation = "1";
		cam_module0_drivernode0: drivernode0 {
			pcl_id = "v4l2_sensor";
			devname = "imx334 30-0037";
			proc-device-tree = "/proc/device-tree/i2c@3180000/i2c_mux@70/i2c@0/cam_imx334_a@37";
		};
	};
	cam_module1: module1 {
		badge = "dc_center_camera";
		position = "center";
		orientation = "1";
		cam_module1_drivernode0: drivernode0 {
			pcl_id = "v4l2_sensor";
			devname = "imx334 31-0037";
			proc-device-tree = "/proc/device-tree/i2c@3180000/i2c_mux@70/i2c@1/cam_imx334_b@37";
		};
	};
	cam_module2: module2 {
		badge = "dc_bottom_camera";
		position = "bottom";
		orientation = "1";
		cam_module2_drivernode0: drivernode0 {
			pcl_id = "v4l2_sensor";
			devname = "imx334 32-0037";
			proc-device-tree = "/proc/device-tree/i2c@3180000/i2c_mux@70/i2c@2/cam_imx334_c@37";
		};
	};
};

There is no problem with the badge that would cause an incorrect ISP setting to be loaded.
If you don't want tuning from a partner, I would suggest using a YUV sensor, or an external ISP as a YUV sensor module has.