Jetson AGX serial connection

Hello, I have a Jetson Xavier AGX running JetPack 4.5.1, and I am trying to connect a Microstrain IMU sensor using the 5V/GND/UART TX+RX pins on the expansion header. After closer inspection, I believe the IMU communicates using the RS-232 protocol, and I am not sure whether the AGX can establish such a connection, and if so, how to do it. Any help would be appreciated.

Hi general-laser,

This link is about serial communication, and you can find other useful links within that topic.

Technically, RS-232 uses a D-Sub connector and runs over a wide range of voltages. The UARTs of the Xavier run only at 3.3V (TTL) logic levels, and a true RS-232 PHY would damage the 3.3V interface if connected. The protocol itself is the same, though: speeds, stop bits, parity, use of CTS/RTS, and so on are 100% compatible with what an RS-232 link would use. If you have a D-Sub connector, or if the logic level is not 3.3V, then you will need a hardware adapter or level shifter.

Thank you for the answers. This is general-laser; I just had to switch accounts. We managed to get a level shifter to provide the correct voltage levels for RS-232, but after trying it with two different Xavier AGX units we still fail to communicate with the device properly. We did try an RS232-to-USB adapter, and also a USB-to-UART adapter combined with the level shifter, and both ways worked fine, but connecting the level shifter to the pins on the expansion header does not. Is there anything further we have to do on the Xavier to be able to communicate with the IMU?

Both UARTs must have the same settings:

  • speed
  • stop bits
  • parity
  • flow control

If the communication speed exceeds 115200 (the default is 115200 8N1 without hardware flow control), then both sides will need to use two stop bits.
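
In case it helps, here is a minimal sketch of configuring a port for 115200 8N1 with no hardware flow control using the standard Linux termios interface. "/dev/ttyTHS0" is used purely as an example device name (substitute whichever node maps to your expansion-header UART), and the CSTOPB line marks where two stop bits would be enabled instead:

```c
/*
 * Minimal sketch: open a UART and set 115200 8N1, no hardware flow control.
 * "/dev/ttyTHS0" is only an example device name.
 */
#define _DEFAULT_SOURCE
#include <fcntl.h>
#include <stdio.h>
#include <termios.h>
#include <unistd.h>

int main(void)
{
    int fd = open("/dev/ttyTHS0", O_RDWR | O_NOCTTY);
    if (fd < 0) { perror("open"); return 1; }

    struct termios tio;
    if (tcgetattr(fd, &tio) != 0) { perror("tcgetattr"); return 1; }

    cfmakeraw(&tio);                  /* raw mode: no echo or line translation */
    cfsetispeed(&tio, B115200);       /* input speed  */
    cfsetospeed(&tio, B115200);       /* output speed */

    tio.c_cflag &= ~CSIZE;
    tio.c_cflag |= CS8;               /* 8 data bits */
    tio.c_cflag &= ~PARENB;           /* no parity */
    tio.c_cflag &= ~CSTOPB;           /* one stop bit (use |= CSTOPB for two) */
    tio.c_cflag &= ~CRTSCTS;          /* no hardware flow control */
    tio.c_cflag |= CLOCAL | CREAD;    /* ignore modem lines, enable receiver */

    if (tcsetattr(fd, TCSANOW, &tio) != 0) { perror("tcsetattr"); return 1; }

    /* read()/write() on fd now exchange raw bytes with the device. */
    close(fd);
    return 0;
}
```

Whatever tool or library you use, the point is that these four settings on the Jetson side must match what the IMU expects.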

I have found level shifters are sometimes problematic, but I don’t know if that is related to this particular case. Timing can change, but more often the pull-up/down resistors might need to change if levels are not holding (use an oscilloscope to see if the Jetson side is truly 3.3V and if the other side is what it needs to be). Make sure to check not only whether the voltages go high enough, but also whether they go low enough.

Is it possible to change or at least find out the serial buffer size of the Xavier for the UART? Also, if I need to add an additional stop bit to the Xavier, could you give me some pointers as to how to do that? Thanks

There might be more than one buffer associated with this. One would be a tiny buffer in the hardware itself, which is not adjustable, while there may also be another buffer in software. Unfortunately, someone else will have to provide details on that, as I don’t know about customization related to this.

All of this is complicated by the ability to use the integrated UARTs either with an older “standard” driver (e.g., as a 16550A) or with the Tegra High Speed driver (for example, “/dev/ttyS1” uses the older driver, whereas “/dev/ttyTHS1” is the same hardware, but using the DMA-capable Tegra High Speed driver). Much of the information you might normally get via setserial -a <device> does not apply when using the THS driver. I do not know the proper way of gathering and modifying specifications for this. Someone from NVIDIA can probably provide more information on reading and setting the THS settings, especially any buffer size associated with it.
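
That said, if the THS driver honors the standard termios flags (which I believe it does, but treat that as an assumption), the extra-stop-bit part of the question can be handled from user space. A rough sketch, again with "/dev/ttyTHS0" as a placeholder device name:

```c
/*
 * Sketch: request two stop bits and read the setting back, assuming the
 * Tegra High Speed driver honors standard termios flags.
 * "/dev/ttyTHS0" is only an example device name.
 */
#include <fcntl.h>
#include <stdio.h>
#include <termios.h>
#include <unistd.h>

int main(void)
{
    int fd = open("/dev/ttyTHS0", O_RDWR | O_NOCTTY);
    if (fd < 0) { perror("open"); return 1; }

    struct termios tio;
    if (tcgetattr(fd, &tio) != 0) { perror("tcgetattr"); return 1; }

    tio.c_cflag |= CSTOPB;            /* request two stop bits */
    if (tcsetattr(fd, TCSANOW, &tio) != 0) { perror("tcsetattr"); return 1; }

    /* Read the attributes back to confirm what the driver actually applied. */
    if (tcgetattr(fd, &tio) == 0)
        printf("stop bits: %d\n", (tio.c_cflag & CSTOPB) ? 2 : 1);

    close(fd);
    return 0;
}
```

This does not tell you anything about buffer sizes; the hardware FIFO is fixed, and any driver-level buffering is outside what termios exposes.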

We used an oscilloscope and measured that on the Xavier, 3.3V seems to be logic 0 on TX and 0V logic 1, which, as I understand it, is uncommon for UART and might be causing some issues, unless I am missing something or our measurements are wrong? Otherwise the settings with regard to speed, stop bits, etc. are all the same between the two. I’m just at a loss as to where the issue could lie.

I couldn’t tell you about the specifics at the protocol analyzer level, but if the pull-up/down is somehow wrong, then that would of course be a good reason for a partial failure. I am uncertain from the description whether TX or RX is failing to reach 3.3V, but if the signals look like valid 3.3V on both TX and RX, then you might try a protocol analyzer. If the signal is not correctly transitioning between low and high, then that would be an immediate cause of failure. In the failure case, if you describe exactly how low and how high the signal goes on both TX and RX, then someone from NVIDIA could verify whether the levels are valid.

Note: If rise and fall times and timing are out of spec, then this too could be an issue. A protocol analyzer would answer this if the signal otherwise looks to be achieving correct 1 and 0 voltages.