Enable Night mode in Jetson Nano

Do you mean you would like to encode only GREY data (Y plane) into the H264 stream? The input format to the encoder is NV12 or YUV420, so a possible solution is to clean the U/V plane before feeding the buffer to the encoder.

Do you mean you would like to encode only GREY data (Y plane) into the H264 stream?

Yes @DaneLLL

The input format to the encoder is NV12 or YUV420, so a possible solution is to clean the U/V plane before feeding the buffer to the encoder.

We have cleaned the UV buffer, and it is stored in a character array.
Can we use Raw2NvBuffer to copy from the character array to an NvBuffer? Or could you please provide some insights on copying from a character array to an NvBuffer?

Could you please provide some inputs on how we can feed the buffer to the encoder (H264)?

Note: We are using the 10_camera_recording MMAPI sample.

Hi,
Please port nvbuff_do_clearchroma() from 12_camera_v4l2_cuda to 10_camera_recording. The function cleans the U/V plane(s), and then you shall get the encoded stream in GREY.

@DaneLLL Thanks for the info

We have ported nvbuff_do_clearchroma() into the 10_camera_recording sample and called it as follows.

if (!nvbuff_do_clearchroma(dmabuf_fd))
    ERROR_H("Failed to clear chroma");

We were able to capture the encoded grey data (Y plane), but not completely.
While playing the encoded (H264) file, the first frames still come out as encoded YUV data, and only after that does it switch to encoded grey (Y plane) data.

Could you please confirm whether the issue is due to the following:
CHECK_ERROR(m_VideoEncoder->output_plane.qBuffer(v4l2_buf, dmabuf)); is called twice inside ConsumerThread::threadExecute() to queue buffers to the encoder, and we have applied our logic (GREY8) only just before the second call?

Could you please provide us some inputs on capturing the complete encoded grey (Y plane) data, given the above details?

Hi,
Yes, please also put the logic of cleaning the UV plane before the first qBuffer. After the camera captures a frame, the plane is re-filled with valid data, so you need to clean it for every frame.

@DaneLLL thanks for the reply

Yes, please also put the logic of cleaning the UV plane before the first qBuffer. After the camera captures a frame, the plane is re-filled with valid data, so you need to clean it for every frame.

As of now, we have placed the logic of cleaning the UV plane before both the first and second qBuffer calls, and we have successfully captured the encoded grey data.

Could you please clarify why there are two qBuffer calls in the 10_camera_recording MMAPI sample, so that we can apply the logic only once?

Hi,
All frame buffers are queued into the encoder before calling dqBuffer. It works like:

for(buffer_index < buffer_number)
{
    stream->acquireBuffer();
    qBuffer_to_encoder;
}
while ()
{
    dqBuffer_from_encoder;
    stream->releaseBuffer();

    stream->acquireBuffer();
    qBuffer_to_encoder;
}

You can change it to:

while ()
{
    if (all_buffers_have_queued)
    {
        dqBuffer_from_encoder;
        stream->releaseBuffer();
    }
    stream->acquireBuffer();
    qBuffer_to_encoder;
}

@DaneLLL Thanks for the input

while ()
{
    if (all_buffers_have_queued)  /* Didn't get this part of the code */
    {
        dqBuffer_from_encoder;
        stream->releaseBuffer();
    }
    stream->acquireBuffer();
    qBuffer_to_encoder;
}

Could you please provide some clarity on all_buffers_have_queued in the above code? Do we have to add any logic for it in the 10_camera_recording MMAPI sample?

Hi,
If there are 5 buffers, please call qBuffer to enqueue them to the encoder. After all 5 buffers are queued, you can call dqBuffer to dequeue the buffers from the encoder.

After initializing the encoder, the encoder does not own any buffer, so please avoid calling dqBuffer at the beginning; you have to enqueue the buffers first.

@DaneLLL Thanks for the reply

while (true)
{
    for (int bufferIndex = 0; bufferIndex < MAX_ENCODER_FRAMES; bufferIndex++) /* MAX_ENCODER_FRAMES = 5 */
    {
        stream->acquireBuffer();
        qBuffer_to_encoder;
    }
    dqBuffer_from_encoder;
    stream->releaseBuffer();
}

Could you please confirm whether the logic of the above code is the same as you mentioned earlier, or provide us some insights if there is any issue?

Hi,
We suggest applying your idea to the sample and giving it a try. Ideally it should work. If it doesn't work as you expect, please follow the default sample and put nvbuff_do_clearchroma() in both the for loop and the while loop.

@DaneLLL Thanks for the response

We suggest applying your idea to the sample and giving it a try. Ideally it should work. If it doesn't work as you expect, please follow the default sample and put nvbuff_do_clearchroma() in both the for loop and the while loop.

We will try and update.

We have observed color elements at the bottom of the frames in the encoded grey data, even though we have cleaned the UV data by calling nvbuff_do_clearchroma() as suggested.
Could you please let us know how to fix this issue?

Note: As of now, we have called nvbuff_do_clearchroma() in both the for loop and the while loop in the 10_camera_recording MMAPI sample.

Hi,
Not sure, but the plane may not be completely cleaned. Please try something like:

            NvBufferParams par;
            NvBufferGetParams(dmabuf_fd, &par);
            void *ptr_uv;
            NvBufferMemMap(dmabuf_fd, 1, NvBufferMem_Write, &ptr_uv);
            NvBufferMemSyncForCpu(dmabuf_fd, 1, &ptr_uv);
            memset(ptr_uv, 0x80, par.psize[1]);
            NvBufferMemSyncForDevice(dmabuf_fd, 1, &ptr_uv);
            NvBufferMemUnMap(dmabuf_fd, 1, &ptr_uv);

This is to clean the UV plane in NV12.

@DaneLLL Thanks for the response

Not sure, but the plane may not be completely cleaned. Please try something like:

            NvBufferParams par;
            NvBufferGetParams(dmabuf_fd, &par);
            void *ptr_uv;
            NvBufferMemMap(dmabuf_fd, 1, NvBufferMem_Write, &ptr_uv);
            NvBufferMemSyncForCpu(dmabuf_fd, 1, &ptr_uv);
            memset(ptr_uv, 0x80, par.psize[1]);
            NvBufferMemSyncForDevice(dmabuf_fd, 1, &ptr_uv);
            NvBufferMemUnMap(dmabuf_fd, 1, &ptr_uv);

We will try it out and update.

We suggest applying your idea to the sample and giving it a try. Ideally it should work. If it doesn't work as you expect, please follow the default sample and put nvbuff_do_clearchroma() in both the for loop and the while loop.
On this issue, we tried the following logic, as suggested earlier, to avoid calling the UV cleaning function twice (in the for loop and the while loop) in the 10_camera_recording MMAPI sample:

 while (!m_gotError)
 {
        for (int bufferIndex = 0; bufferIndex < MAX_ENCODER_FRAMES; bufferIndex++) /* MAX_ENCODER_FRAMES = 5 */
        {
                printf("\n Inside the encoder function\n");
                v4l2_buf.index = bufferIndex;
                Buffer* buffer = stream->acquireBuffer(); /* stream->acquireBuffer(); */
                /* Convert Argus::Buffer to DmaBuffer and queue into v4l2 encoder */
                DmaBuffer *dmabuf = DmaBuffer::fromArgusBuffer(buffer);
                CHECK_ERROR(m_VideoEncoder->output_plane.qBuffer(v4l2_buf, dmabuf)); /* qBuffer_to_encoder; */
        }
        NvBuffer *share_buffer;
        CHECK_ERROR(m_VideoEncoder->output_plane.dqBuffer(v4l2_buf, NULL,
                                                          &share_buffer, 10/*retry*/)); /* dqBuffer_from_encoder; */

        DmaBuffer *dmabuf = static_cast<DmaBuffer*>(share_buffer);
        stream->releaseBuffer(dmabuf->getArgusBuffer()); /* stream->releaseBuffer(); */
 }

But we got the following error logs:

PRODUCER: Launching consumer thread
Opening in BLOCKING MODE
NvMMLiteOpen : Block : BlockType = 4
===== NVMEDIA: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 4
875967048
842091865
create video encoder return true
H264: Profile = 100, Level = 50
Inside the encoder function
PRODUCER: Starting repeat capture requests.
Inside the encoder function
Inside the encoder function
Inside the encoder function
Inside the encoder function
Inside the encoder function
Inside the encoder function
[ERROR] (NvV4l2ElementPlane.cpp:256) Output Plane:Error while Qing buffer: Device or resource busy
Error generated. main_1.cpp, threadExecute:470 m_VideoEncoder->output_plane.qBuffer(v4l2_buf, dmabuf) failed
Error generated. /usr/src/jetson_multimedia_api/argus/samples/utils/Thread.cpp, threadFunction:132 (propagating)
[ERROR] (NvV4l2ElementPlane.cpp:178) Capture Plane:Error while DQing buffer: Broken pipe

Could you please provide some insights, based on these error logs, into where we went wrong in the above logic?

Hi,
It seems better to keep the code the same as 10_camera_recording: have the for loop first and then the while loop. The code looks confusing after moving the for loop into the while loop. We would suggest keeping the code the same as 10_camera_recording.

@DaneLLL Thanks for the response

It seems better to keep the code the same as 10_camera_recording: have the for loop first and then the while loop. The code looks confusing after moving the for loop into the while loop. We would suggest keeping the code the same as 10_camera_recording.

We will follow the suggestion.

We have been running the 10_camera_recording MMAPI sample in non-blocking mode with the change below in the encoder, and the following errors appeared while running the sample application.

m_VideoEncoder = NvVideoEncoder::createVideoEncoder("enc0", O_NONBLOCK);

PRODUCER: Launching consumer thread
Opening in O_NONBLOCKING MODE
NvMMLiteOpen : Block : BlockType = 4
===== NVMEDIA: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 4
875967048
842091865
H264: Profile = 100, Level = 50
PRODUCER: Starting repeat capture requests.
Error generated. multimediaapihandler.cpp, encoderCapturePlaneDqCallback:667 Failed to dequeue buffer from encoder capture plane
[ERROR] (NvV4l2ElementPlane.cpp:256) Output Plane:Error while Qing buffer: Device or resource busy
Error generated. multimediaapihandler.cpp, threadExecute:556 m_VideoEncoder->output_plane.qBuffer(v4l2_buf, NULL) failed
Error generated. /home/nano/videomanager/argus/samples/utils/Thread.cpp, threadFunction:132 (propagating)

Could you please let us know how to fix this issue so that we can run the sample in non-blocking mode?
Note: for multithreading, we need to run the sample in non-blocking mode.

Hi krishnaprasad.k,

To enable a "night mode" by giving the image a grayscale appearance, you could try the saturation property of the nvarguscamerasrc element. For example:

gst-launch-1.0 nvarguscamerasrc sensor-id=0 saturation=0 ! nvvidconv ! autovideosink

Jafet Chaves,
Embedded SW Engineer at RidgeRun
Contact us: support@ridgerun.com
Developers wiki: https://developer.ridgerun.com/
Website: www.ridgerun.com

Hi,
For encoding in non-blocking mode, please refer to 01_video_encode. It requires a polling thread. Please note that CPU usage is higher compared to blocking mode, due to the polling thread.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.