Understanding Pinmux and DTB for Xavier AGX with TX2 Interface + 2ea Alvium 1800 Cameras

I am working with a Xavier AGX Dev Kit with a TX2 camera interface board, trying to modify the device tree to allow dual camera detection on JetPack 5.0.2. Camera 1 works; however, I get a timeout on I2C bus 7 (sudo i2cdetect -y -r 7). The hardware works fine with JetPack 4.6.1. I understand the pinmux and have the spreadsheet to modify it, but the exact method to do this correctly and to generate and install the device tree eludes me. Is there a tutorial that can guide my steps to achieve success?

I have been checking the sensor programming guide:

https://docs.nvidia.com/jetson/archives/l4t-archived/l4t-3261/index.html#page/Tegra%20Linux%20Driver%20Package%20Development%20Guide/camera_sensor_prog.47.1.html#

I still can’t get second-camera detection to work on JetPack 5.0.2 with the 5.0.1-beta-1 driver. I get timeouts on bus 7 (I2C8).

Comparing the working JetPack 4.6.1 + 4.0.0 AV driver combination against the JetPack 5.0.2 rev 1 + 5.0.1-beta-1 driver combination, I have discovered the following anomaly:

The I2C7 (GEN8 I2C) timeout in ‘jetpack 5.0.2 + AV’s beta driver’ is caused by a pinmux configuration mismatch. The pinmux register values should match the working 4.0.0 driver. Expected pinmux register values in ‘jetpack 5.0.2 + AV’s beta driver’:
Reg: 0x0c302018 Val: 0x00000440 → gen8_i2c_sda_pdd2
Reg: 0x0c302020 Val: 0x00000440 → gen8_i2c_scl_pdd1

But the actual pinmux register values in ‘jetpack 5.0.2 + AV’s beta driver’ are:
Reg: 0x0c302018 Val: 0x00000540 → gen8_i2c_sda_pdd2
Reg: 0x0c302020 Val: 0x00000540 → gen8_i2c_scl_pdd1
Bit 8 (E_LPDR) should be 0, but it is 1 in the ‘jetpack 5.0.2 + AV’s beta driver’ case, and that is causing the I2C7 timeout and the failure of Camera-2 detection/probe in AV’s driver.
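To confirm this on a running board, the two pad control registers can be read back directly. Below is a minimal sketch (not from AV or NVIDIA, just an illustration) that reads both registers through /dev/mem and reports bit 8; the addresses are the ones quoted above. It assumes root access and that the kernel permits /dev/mem mappings of this range; if that is restricted on your build, busybox devmem or devmem2 can do the same read.

#!/usr/bin/env python3
# Minimal sketch: read the GEN8_I2C pad control registers via /dev/mem and
# report the E_LPDR bit (bit 8). Addresses are the ones quoted above.
# Run as root; assumes /dev/mem access to this range is permitted.
import mmap, os, struct

PINMUX_REGS = {
    "gen8_i2c_sda_pdd2": 0x0C302018,
    "gen8_i2c_scl_pdd1": 0x0C302020,
}
PAGE = mmap.PAGESIZE

fd = os.open("/dev/mem", os.O_RDONLY | os.O_SYNC)
try:
    for name, addr in PINMUX_REGS.items():
        base = addr & ~(PAGE - 1)              # mmap offset must be page-aligned
        with mmap.mmap(fd, PAGE, prot=mmap.PROT_READ, offset=base) as mem:
            (val,) = struct.unpack_from("<I", mem, addr - base)
        lpdr = (val >> 8) & 1                  # bit 8 = E_LPDR
        print(f"{name}: 0x{val:08x}  E_LPDR={lpdr}")
finally:
    os.close(fd)

On the failing JetPack 5.0.2 system this should print 0x00000540 / E_LPDR=1 for both pads, and 0x00000440 / E_LPDR=0 on the working 4.6.1 system.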

The device trees are identical between the working JetPack 4.6.1 setup and the JetPack 5.0.2 setup that times out.
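The device tree the kernel actually booted with can also be cross-checked on each system from /proc/device-tree. The sketch below reads the nvidia,lpdr cells for the two pads; the node path is an assumption based on the snippet quoted further down (pinmux@2430000/gen8_i2c_pinctrl/...) and may need adjusting if your tree lays the nodes out differently. Properties there are big-endian 32-bit cells, and TEGRA_PIN_DISABLE encodes as 0.

#!/usr/bin/env python3
# Sketch: check what the live device tree says about nvidia,lpdr for the
# GEN8 I2C pads. The node path is an assumption based on the DT snippet
# in this thread; adjust it to match your tree.
import pathlib, struct

BASE = pathlib.Path("/proc/device-tree/pinmux@2430000/gen8_i2c_pinctrl")

for pin in ("gen8_i2c_scl", "gen8_i2c_sda"):
    prop = BASE / pin / "nvidia,lpdr"
    if not prop.exists():
        print(f"{pin}: nvidia,lpdr not found under {prop.parent}")
        continue
    (value,) = struct.unpack(">I", prop.read_bytes())   # DT cells are big-endian
    state = "TEGRA_PIN_DISABLE" if value == 0 else "TEGRA_PIN_ENABLE"
    print(f"{pin}: nvidia,lpdr = {value} ({state})")

If both systems print TEGRA_PIN_DISABLE here while the register readback above still differs, the DTB contents themselves are not the culprit.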

The GEN8 I2C pinmux configuration in AV’s device tree is exactly the same in the working 4.0.0 driver and in the latest beta driver.
Pinmux for GEN8 I2C in AV’s 4.0.0 driver
pinmux@2430000 {
	gen8_i2c_pinctrl: gen8_i2c_pinctrl {
		gen8_i2c_scl {
			nvidia,pins = "gen8_i2c_scl_pdd1";
			nvidia,schmitt = <TEGRA_PIN_DISABLE>;
			nvidia,lpdr = <TEGRA_PIN_DISABLE>;
			nvidia,enable-input = <TEGRA_PIN_ENABLE>;
			nvidia,io-high-voltage = <TEGRA_PIN_DISABLE>;
			nvidia,tristate = <TEGRA_PIN_DISABLE>;
			nvidia,pull = <TEGRA_PIN_PULL_NONE>;
		};
		gen8_i2c_sda {
			nvidia,pins = "gen8_i2c_sda_pdd2";
			nvidia,schmitt = <TEGRA_PIN_DISABLE>;
			nvidia,lpdr = <TEGRA_PIN_DISABLE>;
			nvidia,enable-input = <TEGRA_PIN_ENABLE>;
			nvidia,io-high-voltage = <TEGRA_PIN_DISABLE>;
			nvidia,tristate = <TEGRA_PIN_DISABLE>;
			nvidia,pull = <TEGRA_PIN_PULL_NONE>;
		};
	};
};
Pinmux for GEN8 I2C in AV’s latest beta driver
pinmux@2430000 {
	gen8_i2c_pinctrl: gen8_i2c_pinctrl {
		gen8_i2c_scl {
			nvidia,pins = "gen8_i2c_scl_pdd1";
			nvidia,schmitt = <TEGRA_PIN_DISABLE>;
			nvidia,lpdr = <TEGRA_PIN_DISABLE>;
			nvidia,enable-input = <TEGRA_PIN_ENABLE>;
			nvidia,io-high-voltage = <TEGRA_PIN_DISABLE>;
			nvidia,tristate = <TEGRA_PIN_DISABLE>;
			nvidia,pull = <TEGRA_PIN_PULL_NONE>;
		};
		gen8_i2c_sda {
			nvidia,pins = "gen8_i2c_sda_pdd2";
			nvidia,schmitt = <TEGRA_PIN_DISABLE>;
			nvidia,lpdr = <TEGRA_PIN_DISABLE>;
			nvidia,enable-input = <TEGRA_PIN_ENABLE>;
			nvidia,io-high-voltage = <TEGRA_PIN_DISABLE>;
			nvidia,tristate = <TEGRA_PIN_DISABLE>;
			nvidia,pull = <TEGRA_PIN_PULL_NONE>;
		};
	};
};
In both device tree configs, ‘nvidia,lpdr = <TEGRA_PIN_DISABLE>’ is the same. But in the jetpack 5.0.2 + AV’s beta driver scenario, this is not reflected in the pinmux register (bit 8 E_LPDR should be 0, but the pinmux register shows bit 8 as 1).
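As a quick runtime experiment (a test, not a fix), one could clear bit 8 in those two registers directly and then re-run i2cdetect on bus 7 to see whether the timeout goes away. The sketch below assumes root and a writable /dev/mem; any change made this way is lost at the next boot, so the real correction still has to go into whatever pinmux configuration is programming that bit.

#!/usr/bin/env python3
# Sketch of a volatile test: clear E_LPDR (bit 8) in the two GEN8_I2C pad
# registers and print old/new values. Addresses and the expected value
# 0x00000440 come from the comparison earlier in this thread. Requires root
# and a writable /dev/mem; the change does not survive a reboot.
import mmap, os, struct

REGS = (0x0C302018, 0x0C302020)   # gen8_i2c_sda_pdd2, gen8_i2c_scl_pdd1
PAGE = mmap.PAGESIZE

fd = os.open("/dev/mem", os.O_RDWR | os.O_SYNC)
try:
    for addr in REGS:
        base = addr & ~(PAGE - 1)
        with mmap.mmap(fd, PAGE, prot=mmap.PROT_READ | mmap.PROT_WRITE,
                       offset=base) as mem:
            off = addr - base
            (old,) = struct.unpack_from("<I", mem, off)
            struct.pack_into("<I", mem, off, old & ~(1 << 8))   # clear E_LPDR
            (new,) = struct.unpack_from("<I", mem, off)
            print(f"0x{addr:08x}: 0x{old:08x} -> 0x{new:08x}")
finally:
    os.close(fd)

After this, sudo i2cdetect -y -r 7 should show whether Camera-2’s I2C address responds once E_LPDR is cleared.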

Can anyone explain why the register values are not as expected?
