Jetson Nano DevBoard not pulling output pin to 0

We have an external device that we need to trigger via GPIOs. I'm developing on the Jetson Nano QSPI-SD B01 DevBoard.

We have two pins for this initial work. One is an input; the external device drives it high (3.3V) or low, and we don't want the Jetson applying any internal pull. The other pin is an output and is active low; the external device has a 100K pull-up on it.

I currently have the pinmux (with the help of the community, see: Jetson Nano (SD DevBoard) 32.5.1 PinMux Changes?) set to use GPIO01 for the output and GPIO13 for the input. These are my current settings for the pins:

cam_af_en_ps5 {
    nvidia,pins = "cam_af_en_ps5";
    nvidia,function = "rsvd2";
    nvidia,pull = <TEGRA_PIN_PULL_NONE>;
    nvidia,tristate = <TEGRA_PIN_DISABLE>;
    nvidia,enable-input = <TEGRA_PIN_DISABLE>;
};

and

pe6 {
    nvidia,pins = "pe6";
    nvidia,function = "rsvd0";
    nvidia,pull = <TEGRA_PIN_PULL_NONE>;
    nvidia,tristate = <TEGRA_PIN_DISABLE>;
    nvidia,enable-input = <TEGRA_PIN_ENABLE>;
};
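For reference, this is roughly how I've been checking that the pinmux actually applied at runtime. The pinctrl device name (700008d4.pinmux) and the tegra_pinctrl_reg path are what I'd expect on this board but may differ between L4T releases, so treat them as assumptions:

    # List pinctrl devices first; the device name used below is an assumption, adjust to whatever shows up
    ls /sys/kernel/debug/pinctrl/
    # Dump the per-pin configuration the kernel actually applied for the two pins
    sudo grep -i -E "cam_af_en|pe6" /sys/kernel/debug/pinctrl/700008d4.pinmux/pinconf-pins
    # Raw Tegra pinmux/pad-control register dump (path may vary by release)
    sudo grep -i cam_af_en /sys/kernel/debug/tegra_pinctrl_reg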

I'm using libgpiod, initially interacting with the GPIOs through the gpioget and gpioset commands.
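Roughly, the sort of commands I'm running look like the sketch below. The line offset 38 for pe6/GPIO13 is my own arithmetic (port E × 8 + bit 6) rather than something I've confirmed, so it's worth cross-checking against gpioinfo:

    # List the lines on the main Tegra GPIO controller and see which ones are already claimed
    gpioinfo gpiochip0
    # Read the input pin (pe6 / GPIO13); offset 38 is assumed, verify against gpioinfo
    gpioget gpiochip0 38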

When I set the GPIO01 (cam_af_en_ps5) pin low with the command gpioset -m time -s 1 -l 0 149=1, the expected outcome doesn't happen. I hooked a multimeter up between ground and GPIO01, and instead of being pulled down to 0V the pin only drops to about 1.5V. Then, when the one-second drive finishes, it doesn't return to 3.3V; it just stays at 1.5V until I release the GPIO line.
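Just to spell out the flag semantics as I understand them: with -l the line is treated as active-low, so writing the logical value 1 should drive the physical pin to 0V. If I've read the gpioset documentation correctly, the same physical request without the flag would be:

    # Drive line 149 to a physical low for one second, without the active-low remapping
    gpioset -m time -s 1 0 149=0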

I also see the input pin getting stuck at 1.5V, when it should read either 3.3V or 0V.

As you can see from the pinmux settings, I disabled the internal pulls and tristate for both pins.

Some additional information: we have used the exact same device with the PiGPIO library on a Raspberry Pi and seen the expected results, so the external device seems to be functioning fine. The logic seems fine as well, because if I use the I2C pins on the DevBoard instead of GPIO01 and GPIO13 we see the expected results. The only difference I can see between the GPIO pins and the I2C pins is that the I2C pins have the 3.3V tolerance option and the GPIO pins do not. If I manually add nvidia,io-high-voltage = <TEGRA_PIN_ENABLE>; to the pinmux settings I can flash the device, but on boot it gets stuck in a weird state (different from my previous forum question).

Has anyone seen a similar issue before? Could someone give me a hand with this?

We're checking whether there is anything on the hardware side to be aware of when connecting the GPIO pins; we will update soon.

Hi, please check the 40-pin docs in DLC first.

https://developer.nvidia.com/jetson-nano-developer-kit-40-pin-expansion-header-gpio-usage-considerations-applications-note
