GStreamer Plugin Custom Properties and Modes Behavior

Hi All,

I need some guidance on the behavior of our 3D LUT image-processing GStreamer plugin when real-time mode changes occur. Let's say I have a processing mode and a pass-through mode. Should I reroute the plugin's in/out pads to bypass the plugin, or should I leave the pad routing alone and have the plugin do no processing, so the image data comes in and goes right back out?

My second question: if another process is creating the 3D LUT for the plugin to use, what is the best way to make the plugin aware that the new 3D LUT is available? Can I have a custom property where I set the CUDA memory location where the new 3D LUT is stored, or is there a better way?

Thanks in advance for all your ideas.

Hi,
Please check what we have suggested:
https://devtalk.nvidia.com/default/topic/1067096/jetson-nano/most-efficient-method-to-collapse-merge-multiple-3d-luts-into-one/post/5405067/#5405067

We are not sure, but running the 3D LUT in GStreamer may be limited. You can probably try tegra_multimedia_api instead.

Hi,
Please share the reason that you would like to move the OpenGL ES application to gstreamer.

Per
https://devtalk.nvidia.com/default/topic/1067094/jetson-nano/possible-to-use-3d-textures-for-3d-lut-implementation-or-/post/5405419/#5405419
3D texture should work fine on L4T releases. Have you seen any error in running the 3D LUT application on Jetson Nano?

We are using the GStreamer media framework for our player, and we use plugins (i.e., elements) to extend GStreamer to perform 3D LUT processing in the image chain. Unfortunately, GStreamer does not support OpenGL 3D textures for OpenGL ES versions. So this question is geared toward a GStreamer expert who can help get 3D textures implemented some other way.

Hi,
Here is a sample of decoding into NvBuffer through a gstreamer pipeline:
https://devtalk.nvidia.com/default/topic/1058086/deepstream-sdk/how-to-run-rtp-camera-in-deepstream-on-nano/post/5369243/#5369243
For local video playback, you can replace ‘rtspsrc ! rtph264depay’ with ‘filesrc ! qtdemux’.
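For example, a local-playback variant of that pipeline might look like the following; the elements after qtdemux (h264parse, nvv4l2decoder, nvoverlaysink) are assumptions that depend on your container and L4T release, so treat this as a sketch rather than a tested command:

```
gst-launch-1.0 filesrc location=video.mp4 ! qtdemux ! h264parse ! \
    nvv4l2decoder ! nvoverlaysink
```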

When we get the NvBuffer fd, we can call the below APIs to get an EGLImage:

/**
* Creates an `EGLImage` instance from `dmabuf-fd`.
*
* @param[in] display `EGLDisplay` object used during the creation of `EGLImage`.
*                    If NULL, the nvbuf_utils API uses its own EGLDisplay instance.
* @param[in] dmabuf_fd `DMABUF FD` of buffer from which `EGLImage` to be created.
*
* @returns `EGLImageKHR` for success, `NULL` for failure
*/
EGLImageKHR NvEGLImageFromFd (EGLDisplay display, int dmabuf_fd);

/**
* Destroys an `EGLImage` object.
*
* @param[in] display `EGLDisplay` object used to destroy `EGLImage`.
*                    If NULL, the nvbuf_utils API uses its own EGLDisplay instance.
* @param[in] eglImage `EGLImageKHR` object to be destroyed.
*
* @returns 0 for success, -1 for failure
*/
int NvDestroyEGLImage (EGLDisplay display, EGLImageKHR eglImage);

If it is possible to apply an OpenGL 3D texture to an EGLImage, this sample should be a good reference. FYR.