A question about an error in the "frontend" sample

Open two terminals and run the frontend sample simultaneously in each one.

  1. Running the frontend sample on the first terminal succeeds.

====================================================
nvidia@nvidia-desktop:~$ ./frontend
[INFO] (NvEglRenderer.cpp:110) Setting Screen width 640 height 480
PRODUCER: Creating output stream
PRODUCER: Launching consumer thread
Opening in BLOCKING MODE
Opening in BLOCKING MODE
NvMMLiteOpen : Block : BlockType = 8
===== NVMEDIA: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 8
892744264
842091865
NVMEDIA: H265 : Profile : 1
[6304136.880325] enc0: Waiting until producer is connected…
Opening in BLOCKING MODE
Opening in BLOCKING MODE
NvMMLiteOpen : Block : BlockType = 8
Opening in BLOCKING MODE
===== NVMEDIA: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 8
892744264
842091865
Opening in BLOCKING MODE
NvMMLiteOpen : Block : BlockType = 8
===== NVMEDIA: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 8
892744264
842091865
[6304137.005511] enc2: Waiting until producer is connected…
NVMEDIA: H265 : Profile : 1
[6304137.005708] enc1: Waiting until producer is connected…
NVMEDIA: H265 : Profile : 1
PRODUCER: Starting repeat capture requests.
[6304137.016602] enc0: Producer has connected; continuing.
[6304137.017171] enc1: Producer has connected; continuing.
[6304137.020688] enc2: Producer has connected; continuing.
NVMEDIA_ENC: bBlitMode is set to TRUE
NVMEDIA_ENC: bBlitMode is set to TRUE
NVMEDIA_ENC: bBlitMode is set to TRUE

  2. Running the frontend sample on the second terminal fails.

========================================================================
nvidia@nvidia-desktop:~$ ./frontend
[INFO] (NvEglRenderer.cpp:110) Setting Screen width 640 height 480
Error generated. main.cpp, runArgusProducer:97 Failed to get ICaptureSession interface

  3. JetPack info

========================================================================
nvidia@nvidia-desktop:~$ cat /etc/nv_tegra_release
R32 (release), REVISION: 5.1, GCID: 26202423, BOARD: t186ref, EABI: aarch64, DATE: Fri Feb 19 16:50:29 UTC 2021

  4. In the second frontend sample run, both NANO+IMX219 and TX2 show the same failure message.
    How can I run more than one frontend sample?

Hi,
If you have multiple cameras, you would need to modify the sample to launch another sensor. Please refer to how the option is implemented in 10_camera_recording (a sketch follows the option listing):

  -i        Set camera index [Default 0]
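
For reference, here is a minimal sketch of how such a camera-index option could be wired into the Argus producer side. It assumes the r32.x Argus API; the cameraIndex variable and the function name are illustrative, not code from the sample:

    #include <Argus/Argus.h>
    #include <cstdint>
    #include <cstdio>
    #include <vector>

    using namespace Argus;

    // Create a capture session for the sensor selected by index, similar in
    // spirit to the "-i" option of 10_camera_recording.
    static CaptureSession *createSessionForIndex(CameraProvider *provider,
                                                 uint32_t cameraIndex)
    {
        ICameraProvider *iProvider = interface_cast<ICameraProvider>(provider);
        if (!iProvider)
            return NULL;

        std::vector<CameraDevice *> devices;
        iProvider->getCameraDevices(&devices);
        if (cameraIndex >= devices.size())
        {
            fprintf(stderr, "Camera index %u out of range (%zu devices found)\n",
                    cameraIndex, devices.size());
            return NULL;
        }

        // With a single sensor, a second process requesting the same device
        // will typically fail at this point, which matches the
        // "Failed to get ICaptureSession interface" error shown above.
        return iProvider->createCaptureSession(devices[cameraIndex]);
    }

With two physical cameras, one frontend instance could then be pointed at index 0 and the other at index 1.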

I use only one sensor on both NANO+IMX219 and TX2.
I want to run multiple frontend samples with one sensor.

Hi,
This case is not supported in the default frontend sample. You would need to customize the sample to create multiple OutputStream/consumer pairs and apply the TensorRT engine to each consumer, for example as sketched below.
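
If the goal is to keep everything in one process, a rough sketch of adding one more EGL output stream to the existing Argus producer could look like the following (r32.x Argus API assumed; addExtraStream, the width/height parameters, and the extra consumer that would attach to the stream are illustrative):

    #include <Argus/Argus.h>
    #include <cstdint>

    using namespace Argus;

    // Create one additional output stream on the existing capture session and
    // enable it on the repeating capture request, so a further consumer thread
    // (e.g. an autofocus analyzer) can attach to it in the same process.
    static OutputStream *addExtraStream(CaptureSession *session, Request *request,
                                        uint32_t width, uint32_t height)
    {
        ICaptureSession *iSession = interface_cast<ICaptureSession>(session);
        IRequest *iRequest = interface_cast<IRequest>(request);
        if (!iSession || !iRequest)
            return NULL;

        UniqueObj<OutputStreamSettings> settings(
            iSession->createOutputStreamSettings(STREAM_TYPE_EGL));
        IEGLOutputStreamSettings *iSettings =
            interface_cast<IEGLOutputStreamSettings>(settings);
        if (!iSettings)
            return NULL;
        iSettings->setPixelFormat(PIXEL_FMT_YCbCr_420_888);
        iSettings->setResolution(Size2D<uint32_t>(width, height));

        OutputStream *stream = iSession->createOutputStream(settings.get());
        if (stream)
            iRequest->enableOutputStream(stream);  // frames from every capture
                                                   // now also land on this stream
        return stream;
    }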

We disabled inference with “ENABLE_TRT := 0”.
The OutputStream/consumer you said needs customization is provided by libargus.
Which parts or classes of libargus do I need to customize?

Hi,
Not sure about your use case. Do you need multiple encoding threads? After disabling ENABLE_TRT, there are three encoding threads when running the frontend sample (a fourth could be added the same way, as sketched after the snippet):

    VideoEncodeStreamConsumer consumer1("enc0", "output1.h265", Size2D<uint32_t>(640, 480));
    VideoEncodeStreamConsumer consumer2("enc1", "output2.h265", Size2D<uint32_t>(1280, 720));
    VideoEncodeStreamConsumer consumer3("enc2", "output3.h265", Size2D<uint32_t>(1920, 1080));
    consumers.push_back(&consumer1);
    consumers.push_back(&consumer2);
    consumers.push_back(&consumer3);
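
A fourth consumer would follow the same pattern; the "enc3"/"output4.h265" names and the resolution below are only illustrative, and the producer side also has to create and connect a matching fourth OutputStream (as in the sketch above):

    // Hypothetical fourth consumer, mirroring the three above.
    VideoEncodeStreamConsumer consumer4("enc3", "output4.h265", Size2D<uint32_t>(640, 480));
    consumers.push_back(&consumer4);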

We are using multiple processes for different functions (encoding, autofocus, recording, and other purposes).
Could you explain how to customize the sample so that it can be used by multiple processes?

Hi,
For this use case we would suggest using JetPack 4.6 (r32.6.1), so that you can use the functions for passing NvBuffer between processes. Please take a look at the discussion in
How to share the buffer in process context?
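
For orientation, the usual Linux mechanism behind sharing a dmabuf-backed NvBuffer between processes is passing its file descriptor over an AF_UNIX socket with SCM_RIGHTS. The sketch below shows only that generic fd-passing part; the socket setup, the NvBuffer creation, and whatever the linked topic recommends specifically for JetPack 4.6 are outside it, and the function names are illustrative:

    #include <sys/socket.h>
    #include <sys/uio.h>
    #include <cstring>

    // Send a single dmabuf fd (e.g. one obtained from NvBufferCreateEx in the
    // capture process) to the peer process over a connected AF_UNIX socket.
    static bool send_dmabuf_fd(int sock, int fd)
    {
        char payload = 'F';                      // sendmsg needs >= 1 data byte
        struct iovec iov = { &payload, sizeof(payload) };

        union {                                   // correctly aligned cmsg buffer
            struct cmsghdr align;
            char buf[CMSG_SPACE(sizeof(int))];
        } ctrl = {};

        struct msghdr msg = {};
        msg.msg_iov = &iov;
        msg.msg_iovlen = 1;
        msg.msg_control = ctrl.buf;
        msg.msg_controllen = sizeof(ctrl.buf);

        struct cmsghdr *cmsg = CMSG_FIRSTHDR(&msg);
        cmsg->cmsg_level = SOL_SOCKET;
        cmsg->cmsg_type = SCM_RIGHTS;            // kernel installs a duplicate fd
        cmsg->cmsg_len = CMSG_LEN(sizeof(int));  // in the receiving process
        memcpy(CMSG_DATA(cmsg), &fd, sizeof(int));

        return sendmsg(sock, &msg, 0) == sizeof(payload);
    }

    // Receive the descriptor in the other process; it refers to the same dmabuf,
    // and using it there with the NvBuffer APIs is the cross-process case that
    // JetPack 4.6 is said to support.
    static int recv_dmabuf_fd(int sock)
    {
        char payload;
        struct iovec iov = { &payload, sizeof(payload) };

        union {
            struct cmsghdr align;
            char buf[CMSG_SPACE(sizeof(int))];
        } ctrl = {};

        struct msghdr msg = {};
        msg.msg_iov = &iov;
        msg.msg_iovlen = 1;
        msg.msg_control = ctrl.buf;
        msg.msg_controllen = sizeof(ctrl.buf);

        if (recvmsg(sock, &msg, 0) <= 0)
            return -1;

        struct cmsghdr *cmsg = CMSG_FIRSTHDR(&msg);
        if (!cmsg || cmsg->cmsg_level != SOL_SOCKET || cmsg->cmsg_type != SCM_RIGHTS)
            return -1;

        int fd = -1;
        memcpy(&fd, CMSG_DATA(cmsg), sizeof(int));
        return fd;
    }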
