I am trying to connect an external PCIe x1 device to the Jetson AGX Xavier devkit through the PCIe x8 expansion slot. The device is recognized by the Jetson and shows up in “lspci”. The issue is that the BAR0 and BAR2 values are off when the driver loads: the driver expects 32-bit values and reads 0x4000_0000 and 0x4000_0200 for BAR0 and BAR2 respectively, but “lspci -vv” shows the device’s memory regions as 0x1f_4000_0000 and 0x1f_4000_0200 for Regions 0 and 2. My understanding is that these regions should always have a 32-bit address, since bits [2:1] of the BAR register are 00b, which marks a 32-bit memory BAR. What I am wondering is:
- Why is the memory region placed at an address above 4 GB when the BAR type field says it should be 32-bit?
- How can I change the memory region’s start address so that it is addressable with 32 bits?