Regarding remote display versus local display in X11…anything running in the X environment generates a stream of events and requests, and that stream is what the X server deals with; graphics rendering through the video card is essentially a side effect of the server processing those events.
Locally driven displays have a fairly direct route to the video card; when displaying to a remote system the same events must instead cross the network (and typically ssh encryption) before being processed. Remote display can still be fast if the traffic is mostly control or vector operations; if you are rendering bitmaps, though, you might effectively be sending an event for every pixel.
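As a rough back-of-the-envelope illustration of why bitmaps hurt so much over the wire (the byte counts below are assumptions for illustration, not actual X protocol measurements), compare the cost of a few vector draw requests against shipping the finished pixels:

```python
# Rough cost comparison: vector draw requests vs. raw bitmap pixels.
# The per-request size is an illustrative assumption, not a measured
# X protocol figure.

def draw_request_bytes(n_requests: int, bytes_per_request: int = 28) -> int:
    """Approximate wire cost of sending n small draw requests."""
    return n_requests * bytes_per_request

def bitmap_bytes(width: int, height: int, bytes_per_pixel: int = 4) -> int:
    """Wire cost of shipping the same area as uncompressed pixels."""
    return width * height * bytes_per_pixel

# Painting a 1920x1080 screen with 100 rectangle requests:
vector_cost = draw_request_bytes(100)   # a few kilobytes
# Shipping one finished 1920x1080 frame as raw pixels instead:
pixel_cost = bitmap_bytes(1920, 1080)   # roughly 8 MB per frame

print(vector_cost, pixel_cost)
```

Even with generous assumptions, the bitmap path is several orders of magnitude more data per frame, which is why control-heavy applications feel fine over the network while pixel-heavy ones crawl.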
Differences do not stop there though. When running a program on a remote Jetson and displaying locally on a PC, none of the actual rendering libraries (the things a GPU talks to) run on the Jetson…these offload to the desktop PC. If you have hardware acceleration on the Jetson, it no longer participates; rendering via GPU instead goes through the desktop PC and its libraries. Not knowing this can be a big shock. In some cases the PC won't be very fast, and in others (such as CUDA) you might find that the PC's 1080Ti is doing the CUDA work instead of the Jetson (if you were not aware of this and think the Jetson is running that fast, you're in for an unpleasant surprise). And if the desktop PC does not have the correct version of CUDA, the program will simply fail.
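A quick sanity check for where rendering will actually happen is to look at the DISPLAY value the program sees. The helper below is a hypothetical heuristic based on common conventions (":0" style for local displays, "localhost:10.0" as the usual ssh -X forwarding pattern), not a definitive test:

```python
# Heuristic: classify a DISPLAY string as local, ssh-forwarded, or remote.
# ":0"/":1" style displays are local; "localhost:10.0" is the typical
# "ssh -X" forwarding form (rendering ends up on the viewing PC);
# "host:0" targets another machine's X server directly.
# These are common conventions, not a guarantee.

def display_location(display: str) -> str:
    if not display:
        return "none (no X display available)"
    host, _, _screen = display.partition(":")
    if host == "":
        return "local (rendering on this machine's GPU)"
    if host in ("localhost", "127.0.0.1"):
        return "forwarded (likely ssh -X; rendering happens on the viewing PC)"
    return f"remote X server on {host} (rendering happens there, not here)"

print(display_location(":0"))
print(display_location("localhost:10.0"))
```

So if you ssh -X into the Jetson and see something like "localhost:10.0", the GL and CUDA libraries the program links against will be exercised on the PC end, exactly the situation described above.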
Somewhere in the middle, if you are serious about using the Jetson's computing power yet want to display on a PC, you'll need some form of virtual desktop (e.g., a VNC server). The Jetson renders to a virtual screen which has no actual hardware connected, but the GPU and CUDA neither know nor care…the remote PC is then updated from this virtual desktop (as framebuffer images) instead of via X events. In that case the PC would not need CUDA of its own; the Jetson does the CUDA work and displays correctly no matter what the PC configuration is (this is even operating-system agnostic).
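The shape of that setup can be sketched as follows. The display number, program name, and launcher are assumptions for illustration; a real deployment would pair this with a virtual X server (such as Xvfb or a headless Xorg configuration) and a VNC server exporting that display:

```python
# Sketch: point a program at a virtual X display on the Jetson so the
# Jetson's own GPU/CUDA does the rendering. The display number ":99" is
# an illustrative assumption, as is the program name below.
import os

def virtual_display_env(display_number: int = 99) -> dict:
    """Build an environment so a child process renders to the virtual display."""
    env = dict(os.environ)
    env["DISPLAY"] = f":{display_number}"  # virtual screen, no monitor attached
    return env

env = virtual_display_env(99)
print(env["DISPLAY"])

# A launcher would then run something like (hypothetical program name):
#   subprocess.run(["my_cuda_app"], env=env)
# while a VNC server attached to :99 streams the resulting framebuffer
# to the PC, which needs only a VNC viewer, no CUDA or GL of its own.
```

The key point the sketch captures is that the program's DISPLAY never leaves the Jetson, so all rendering libraries (and CUDA) run there; only finished images cross the network.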