To help understand the problem I’ll give some background first. It may seem irrelevant at first, but it is the basis for the question.
X was designed as a distributed system by a group of M.I.T. students. As such, the X server can run on any PC or workstation, and what gets rendered is an interpretation of X “events”. Each event is basically a numeric code from a table (perhaps with data arguments). When you type on a keyboard, or when a program updates graphics, one or more events are sent.
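If you are curious what these events look like, one way to watch them (assuming the standard X utility xev, from the x11-utils package, is installed) is:

# Opens a small test window and prints a description of every X event
# (mouse motion, key presses, focus changes, etc.) delivered to it:
xev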
One type of remote software involves running an X application on a given computer, e.g., a Jetson, but redirecting events to another host (e.g., your host PC). In that case the GPU of the Jetson is never involved; the program itself sends and receives events between itself and the host PC (perhaps via ssh forwarding of X events, e.g., using the “ssh -X” or “ssh -Y” command and then running a GUI program on that command line).
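As a concrete sketch (the user and hostname are placeholders), event forwarding looks like this:

# From the host PC; "-Y" is the trusted variant of "-X":
ssh -Y someuser@jetson-hostname
# In that same shell, now on the Jetson; ssh sets $DISPLAY to something
# like "localhost:10.0", and the window appears on the host PC:
glxgears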
Another type of remote software is essentially not remote at all, but sends and receives copies of X event results rather than the events themselves. This requires a special server. The default server renders the result of events into a framebuffer which is expected to have a monitor physically attached, but the GUI program itself never really cares about that. The normal X server sees events and then updates the buffer, and if there happens to be a monitor watching the buffer, everyone is happy.
Now if we want to forward to another computer, but we want the work done on the Jetson, it means we have to use the Jetson’s framebuffer, but we can’t expect a monitor to be attached. That implies the X server itself needs the option to keep rendering even if there is no monitor. The first issue is that the server has to know what the monitor resolution is (and other things) before it can properly arrange a framebuffer (imagine if a missing monitor resulted in a single, very long line of pixels; or just defaulted to some antique device). This is what a virtual desktop server starts with: an ordinary X server, plus the ability to artificially tell the server to “pretend” it has a monitor with a given configuration.
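As an illustration only (device and identifier names are invented, and the exact mechanism differs by driver and release), the “pretend monitor” idea looks something like this in xorg.conf:

Section "Screen"
    Identifier "Default Screen"
    Device     "Tegra0"          # illustrative device name
    SubSection "Display"
        Depth   24
        Virtual 1920 1080        # framebuffer geometry even with no monitor
    EndSubSection
EndSection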
If you have that virtual monitor, then it becomes possible to send the results of events to another computer, and a special program on that other computer can display the results (it’s basically just compressed realtime pixel data). No event is ever sent in that case.
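For example, if the virtual desktop server happens to be VNC-flavored (an assumption; your particular software may differ), the client side is just a pixel viewer:

# From the host PC; VNC display :1 maps to TCP port 5901 by convention:
vncviewer jetson-hostname:1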
There is a log file for the Jetson side’s server if and only if the GPU there is what interprets the events. If this happens to be a virtual server, then there is a log for the virtual server (named based on the “$DISPLAY” context) on the Jetson with that server. If there is a regular server running on the Jetson, and one forwards events instead of the result of events, then there is no log on the Jetson. The host PC you are using at the time of receiving the data, if it uses X, will have its own X server log. This log will not tell you anything about the virtual desktop application…that is just another X application; it works just like Gimp or Blender. It is true that the virtual desktop client you run from the host PC could have its own log, but it won’t be an X log (X logs are about events and configuration of the buffer).
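The mapping between “$DISPLAY” and log name is direct; e.g. (illustrative values):

echo $DISPLAY    # e.g., ":0" for the first server, ":1" for a second
# DISPLAY ":0" logs to /var/log/Xorg.0.log; ":1" logs to /var/log/Xorg.1.log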
So yes, on the Jetson end, there will be an “Xorg.#.log” for every server which runs. There can be multiple servers. It might be that a local monitor runs on one server, and the virtual server is a different one, whereby they both have logs; or it might be that the monitor is plugged in such that the virtual server is the framebuffer the server services…then there will be only one log, and it won’t care whether or not the results of X rendering are also copied to your virtual desktop on another computer.
You might examine “echo $DISPLAY” in these cases (illustrative outputs are sketched after the list):
- A locally attached monitor and keyboard.
- Your host PC, but not within the virtual desktop client (this is the host PC’s context, not the Jetson’s).
- Your host PC, whereby you run this within a terminal that is itself within the virtual desktop client (this might match the locally attached monitor and keyboard; it depends on whether there is a second server versus sharing the local and remote via the same framebuffer).
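Illustrative values only (yours will differ):

# Locally attached monitor and keyboard on the Jetson:
echo $DISPLAY    # typically ":0"
# On the host PC, outside the virtual desktop client (the host's context):
echo $DISPLAY    # the host PC's own server, often ":0" there
# In a terminal inside the virtual desktop client (the Jetson's context):
echo $DISPLAY    # ":0" if sharing the local server, ":1" if a second server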
Check “ls -ltr /var/log/Xorg.*.log”, and note these are sorted by time; if two logs are created at almost the same time, then it implies you have two servers.
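Sample output (timestamps and sizes are made up) for a Jetson running both a normal and a virtual server:

ls -ltr /var/log/Xorg.*.log
# -rw-r--r-- 1 root root 41213 Jan  1 09:58 /var/log/Xorg.0.log
# -rw-r--r-- 1 root root 33870 Jan  1 09:58 /var/log/Xorg.1.log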
Something important to note is that if you run locally only, then you know the GPU is being used locally, and this is what might limit frame rate. If you are running remotely via forwarding of events (e.g., “ssh -X” or “ssh -Y”), then you are limited by the host PC’s GPU, not the Jetson’s, and you are further limited by the networking between the two. If you are using a virtual desktop client, then the frame rate is basically whatever the Jetson is doing, but the display on the host PC might be further limited by networking (if the Jetson updates 1500 frames per second, and all pixels change in each frame, then that is an enormous amount of data for a gigabit network…events are tiny so far as data size, whereas a framebuffer is a lot of data…compression helps, but compression depends on the particular data).
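A back-of-envelope calculation (uncompressed, with illustrative numbers) shows why networking dominates:

# 1920 x 1080 pixels x 4 bytes/pixel ≈ 8.3 MB per frame
# 8.3 MB/frame x 60 frames/s        ≈ 498 MB/s ≈ 4 Gbit/s
# Gigabit Ethernet carries roughly 125 MB/s, so even 60 FPS of full-frame
# uncompressed updates exceeds the wire; compression and partial updates
# are what make remote desktops feasible at all.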
One question always exists if you have installed a virtual desktop server: is the virtual server using the GPU, or is it instead reverting to software rendering? To answer that, from your virtual desktop on the host PC, you might run the following command (whether from the virtual desktop or local to the Jetson, you might first need to “sudo apt-get install mesa-utils”):
glxinfo | egrep -i '(version|nvidia)'
The output of this command will change depending on whether the GPU is used or if rendering falls back to software. If it names an NVIDIA driver, you’re in good shape; if it names Mesa, or some other software renderer, then you won’t get full performance until the virtual server is changed to use the GPU. Even if it is the NVIDIA GPU, you might find the remote end is slower due to networking. Perhaps, if you have a local monitor which is using the virtual desktop, and if that desktop shows the NVIDIA GPU being used, the local display could run at a much higher frame rate than the virtual desktop client. Maybe.
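Illustrative output fragments (version numbers are invented):

# GPU accelerated, NVIDIA driver in use:
#   OpenGL vendor string: NVIDIA Corporation
#   OpenGL version string: 4.6.0 NVIDIA 32.3.1
# Software rendering fallback (Mesa):
#   OpenGL version string: 3.1 Mesa 20.0.8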