JetPack 3.2.1
I have a TX2 application that renders its display output with GLX, and also runs an off-screen Pbuffer renderer that does some shader-based image manipulation, but through EGL.
This works well from application start-up. The off-screen renderer produces an output image which is read back through a PBO mapping with memcpy. Further CPU-based modifications are then applied and the image is passed to the display system, where it outputs exactly what I would expect to see.
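For reference, the readback path does roughly this (a minimal sketch, assuming desktop GL bound through EGL; the function and variable names here are illustrative, not the real ones, and error checking is trimmed):

#define GL_GLEXT_PROTOTYPES
#include <GL/gl.h>
#include <GL/glext.h>
#include <cstring>

void ReadBackFrame(unsigned char* cpuImage, int width, int height)
{
    GLuint pbo = 0;
    glGenBuffers(1, &pbo);
    glBindBuffer(GL_PIXEL_PACK_BUFFER, pbo);
    glBufferData(GL_PIXEL_PACK_BUFFER, width * height * 4, nullptr, GL_STREAM_READ);
    // With a PBO bound to GL_PIXEL_PACK_BUFFER, the last argument is an offset into the PBO.
    glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, nullptr);
    if (void* src = glMapBuffer(GL_PIXEL_PACK_BUFFER, GL_READ_ONLY))
    {
        memcpy(cpuImage, src, width * height * 4);
        glUnmapBuffer(GL_PIXEL_PACK_BUFFER);
    }
    glBindBuffer(GL_PIXEL_PACK_BUFFER, 0);
    glDeleteBuffers(1, &pbo);
}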
The thread doing the Pbuffer rendering is, from time to time, killed off when it's no longer needed, but the application keeps running. At this point the thread's C++ class cleans up all of its buffers, textures, surfaces and contexts.
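The teardown does roughly this (again a sketch; the method and member names are mine, not the real ones):

#define GL_GLEXT_PROTOTYPES
#include <GL/gl.h>
#include <GL/glext.h>
#include <EGL/egl.h>

void cHiddenContextEGL::Destroy()  // hypothetical name for the cleanup method
{
    // GL objects are deleted while the context is still current on this thread.
    glDeleteBuffers(1, &m_pbo);
    glDeleteTextures(1, &m_texture);
    // Then the context is released from the thread and the EGL objects destroyed.
    eglMakeCurrent(m_display, EGL_NO_SURFACE, EGL_NO_SURFACE, EGL_NO_CONTEXT);
    eglDestroySurface(m_display, m_surface);
    eglDestroyContext(m_display, m_context);
}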
The thread is then rebuilt later and goes through the exact same startup sequence as the first time.
The calls to get the display, initialise EGL, bind the API, choose a Pbuffer-compatible configuration, create a Pbuffer surface and finally create a context all return without error (0x3000 = EGL_SUCCESS), just like the first time.
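In code, the start-up sequence is roughly this (a sketch; error handling trimmed, and the attribute lists are representative rather than the exact ones I use):

#include <EGL/egl.h>

bool CreatePbufferContext(int width, int height)
{
    EGLDisplay dpy = eglGetDisplay(EGL_DEFAULT_DISPLAY);
    EGLint major = 0, minor = 0;
    eglInitialize(dpy, &major, &minor);
    eglBindAPI(EGL_OPENGL_API);  // or EGL_OPENGL_ES_API, whichever the app binds

    const EGLint cfgAttribs[] = {
        EGL_SURFACE_TYPE, EGL_PBUFFER_BIT,
        EGL_RED_SIZE, 8, EGL_GREEN_SIZE, 8, EGL_BLUE_SIZE, 8, EGL_ALPHA_SIZE, 8,
        EGL_NONE
    };
    EGLConfig cfg = nullptr;
    EGLint numCfg = 0;
    eglChooseConfig(dpy, cfgAttribs, &cfg, 1, &numCfg);

    const EGLint pbAttribs[] = { EGL_WIDTH, width, EGL_HEIGHT, height, EGL_NONE };
    EGLSurface surf = eglCreatePbufferSurface(dpy, cfg, pbAttribs);
    EGLContext ctx  = eglCreateContext(dpy, cfg, EGL_NO_CONTEXT, nullptr);

    // Everything above reports EGL_SUCCESS on both the first and second run.
    // This is the call that segfaults the second time through:
    return eglMakeCurrent(dpy, surf, surf, ctx) == EGL_TRUE;
}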
But the call to make the context current blows up with the stack trace below. I've no idea how to debug this any further. The graphics debugger doesn't help: since the app uses a mix of EGL and GLX, it stops the session as soon as the first GLX call is made (in the display subsystem).
To be clear, this all works, once, but on the second attempt it segfaults.
Are there any methods I can use to resolve this issue? Is it reasonable to use EGL and GLX in the same application?
Here’s the stack trace:
[Switching to Thread 0x7f499df1e0 (LWP 14164)]
0x0000007f882f1cf8 in ?? () from /usr/lib/aarch64-linux-gnu/tegra/libnvidia-eglcore.so.28.2.1
(gdb) bt
#0 0x0000007f882f1cf8 in ?? () from /usr/lib/aarch64-linux-gnu/tegra/libnvidia-eglcore.so.28.2.1
#1 0x0000007f882f510c in ?? () from /usr/lib/aarch64-linux-gnu/tegra/libnvidia-eglcore.so.28.2.1
#2 0x0000007f883072cc in ?? () from /usr/lib/aarch64-linux-gnu/tegra/libnvidia-eglcore.so.28.2.1
#3 0x0000007f882edf4c in ?? () from /usr/lib/aarch64-linux-gnu/tegra/libnvidia-eglcore.so.28.2.1
#4 0x0000007f883043f8 in ?? () from /usr/lib/aarch64-linux-gnu/tegra/libnvidia-eglcore.so.28.2.1
#5 0x0000007f882ee604 in ?? () from /usr/lib/aarch64-linux-gnu/tegra/libnvidia-eglcore.so.28.2.1
#6 0x0000007f882f66f8 in ?? () from /usr/lib/aarch64-linux-gnu/tegra/libnvidia-eglcore.so.28.2.1
#7 0x0000007f88304c30 in ?? () from /usr/lib/aarch64-linux-gnu/tegra/libnvidia-eglcore.so.28.2.1
#8 0x0000007f8736195c in ?? () from /usr/lib/aarch64-linux-gnu/tegra-egl/libGLESv1_CM_nvidia.so.1
#9 0x0000007f9712b664 in ?? () from /usr/lib/aarch64-linux-gnu/tegra-egl/libEGL_nvidia.so.0
#10 0x0000007f9712c6d8 in ?? () from /usr/lib/aarch64-linux-gnu/tegra-egl/libEGL_nvidia.so.0
#11 0x0000007f97133074 in ?? () from /usr/lib/aarch64-linux-gnu/tegra-egl/libEGL_nvidia.so.0
#12 0x0000007f9735dfbc in eglMakeCurrent () from /usr/lib/aarch64-linux-gnu/tegra-egl/libEGL.so.1
#13 0x0000007f8ef663d4 in cHiddenContextEGL::Create (this=0x7f1829a660, nRequestedWidth=1280, nRequestedHeight=720) at cFusion.cpp:374
I snipped off the stack frames that would mean little to anyone else.
Thanks,
Ratbert.