Hi, I am trying to display video by connecting two cameras to a Jetson Xavier board. I checked the example that binds an EGLStream to a GL_TEXTURE_EXTERNAL_OES texture; in this case, can I pass multiple textures from the two cameras to the fragment shader at the same time?
Hi,
Please share which type of camera you are using: a Bayer sensor, a YUV sensor, or a USB camera? A possible solution is to capture frame data into NvBuffer and then composite the frames into a single video plane through the hardware converter. This uses jetson_multimedia_api, and we have samples in
/usr/src/jetson_multimedia_api
Thank you for the reply. I built the Argus camera application source code to display multi-camera images. What I want is to sample the two camera textures from the fragment shader. The fragment shader code in the sample is as below.
static const char frgSrc[] =
"#version 300 es\n"
"#extension GL_OES_EGL_image_external : require\n"
"precision highp float;\n"
"uniform samplerExternalOES texSampler;\n"
"in vec2 vTexCoord;\n"
"out vec4 fragColor;\n"
"void main() {\n"
"    fragColor = texture(texSampler, vTexCoord);\n"
"}\n";
I bind the camera texture in code with glBindTexture(GL_TEXTURE_EXTERNAL_OES, it->m_consumer->getStreamTextureID()); but I wonder if there is a way to pass two textures.
Hi,
Please check if this helps:
c++ - How can I add multiple textures to my openGL program? - Stack Overflow
And there are samples in
/usr/src/nvidia/graphics_demos
Please take a look and see if there is either sample similar to your use-case.
Hi,
Please also check this sample:
examples/06-multitexture.cpp at master · openglredbook/examples · GitHub
Thanks for your reply. I've tried multitexture development but still no success. I have tried the method below.
- EGLStream consumer

// camera 1
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_EXTERNAL_OES, m_streamTexture0);
if (!eglStreamConsumerGLTextureExternalKHR(Composer::getInstance().getEGLDisplay(), m_eglStream))
{
ORIGINATE_ERROR("Unable to connect GL as consumer");
}
glBindTexture(GL_TEXTURE_EXTERNAL_OES, 0);
…
// camera 2
glActiveTexture(GL_TEXTURE1);
glBindTexture(GL_TEXTURE_EXTERNAL_OES, m_streamTexture1);
if (!eglStreamConsumerGLTextureExternalKHR(Composer::getInstance().getEGLDisplay(), m_eglStream))
{
ORIGINATE_ERROR("Unable to connect GL as consumer");
}
glBindTexture(GL_TEXTURE_EXTERNAL_OES, 0);
- Shader code
static const char frgSrc[] =
"#version 300 es\n"
"#extension GL_OES_EGL_image_external : require\n"
"#extension GL_ARB_explicit_uniform_location : require\n"
"precision highp float;\n"
"layout(location = 2) uniform samplerExternalOES texSampler1;\n"
"layout(location = 3) uniform samplerExternalOES texSampler2;\n"
"in vec2 vTexCoord;\n"
"out vec4 fragColor;\n"
"void main() {\n"
"    fragColor = texture(texSampler1, vTexCoord) * 0.5 + texture(texSampler2, vTexCoord) * 0.5;\n"
"}\n";
- Rendering
…
glUniform1i(2, 0); // sampler uniforms take the texture unit index (0, 1, ...), not the GL_TEXTUREi enum
glUniform1i(3, 1);
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_EXTERNAL_OES, m_consumer0->getStreamTextureID());
glActiveTexture(GL_TEXTURE1);
glBindTexture(GL_TEXTURE_EXTERNAL_OES, m_consumer1->getStreamTextureID());
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
…
However, the same texture image was passed to the fragment shader's samplerExternalOES uniforms texSampler1 and texSampler2. I'm reviewing the code for any mistakes I made.
Is this still an issue that needs support? Can any result be shared? Thanks
Hi,
If multiple textures do not work, we suggest compositing the frames into a single video plane so that there is only a single texture.
Thank you for your continued support. I have been continually trying to blend the video input from the two cameras. I want to ask: how do I composite frames into one video plane?
Hi,
You can call NvBufferComposite(). Please refer to the 13_multi_camera sample.
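To illustrate the suggestion, here is a hedged sketch of side-by-side composition with NvBufferComposite(), modeled on the 13_multi_camera sample. It is not runnable off a Jetson board, and the struct and field names (NvBufferCompositeParams, src_comp_rect, dst_comp_rect, NVBUFFER_COMPOSITE) should be verified against nvbuf_utils.h on your L4T release; the dmabuf FDs are assumed to come from your existing capture path.

```cpp
// Sketch only: API and field names follow nvbuf_utils.h as used by the
// 13_multi_camera sample; check them against your L4T release.
#include "nvbuf_utils.h"
#include <cstring>

// Composite two camera dmabuf frames side by side into one destination
// buffer (which must be at least 2*w x h), so the result can be rendered
// as a single external texture.
int compositeTwoCameras(int srcFds[2], int dstFd,
                        uint32_t w, uint32_t h /* size of each source */)
{
    NvBufferCompositeParams params;
    memset(&params, 0, sizeof(params));

    params.composite_flag = NVBUFFER_COMPOSITE;
    params.input_buf_count = 2;

    for (uint32_t i = 0; i < 2; ++i) {
        // Read each full source frame...
        params.src_comp_rect[i].top = 0;
        params.src_comp_rect[i].left = 0;
        params.src_comp_rect[i].width = w;
        params.src_comp_rect[i].height = h;
        // ...and place camera i in the left/right half of the destination.
        params.dst_comp_rect[i].top = 0;
        params.dst_comp_rect[i].left = i * w;
        params.dst_comp_rect[i].width = w;
        params.dst_comp_rect[i].height = h;
    }

    // Hardware (VIC) composition of both sources into dstFd.
    return NvBufferComposite(srcFds, dstFd, &params);
}
```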
This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.