Why is the data we fill into NvBuffer wrong?

Hi team,
we need to capture video, encode it to H.264 and save it as MP4, but we get a wrong image when we play back the MP4 file: the colors of the image are pink.
we use a memory block to hold the NV12 data; read_video_frame reads from that block and fills an NvBuffer. When I use write_video_frame to dump the data, it is pink too, so it seems the data is already wrong before it is queued to the v4l2_buffer. Can anyone help me check the read code below?

int ArgusCameraCapture::read_video_frame(NvBuffer &buffer)
{

    raw_buf encode_buf_data;
    // Block until a captured NV12 frame is available in the queue.
    while (true)
    {
        bool queue_ret = encode_raw_buf_queue->wait_dequeue_timed(encode_buf_data, 1000);

        if (!queue_ret)
        {
            usleep(1 * 1000);

            continue;
        }
        else
        {
            break;
        }
    }
    char *buf = encode_buf_data.mipi_data;

    // // write_test
    // if (test1)
    // {
    //     u1fstream->write(encode_buf_data.mipi_data, 2896 * 1880 * 1.5);
    //     test1=0;
    // }

    // Copy each plane (Y first, then the interleaved UV plane) from the NV12 block into the NvBuffer.
    for (int encode_i = 0; encode_i < buffer.n_planes; encode_i++)
    {
        NvBuffer::NvBufferPlane &plane = buffer.planes[encode_i];

        // Payload bytes per row, without the stride padding.
        std::streamsize bytes_to_read = plane.fmt.bytesperpixel * plane.fmt.width;

        char *data = (char *)plane.data;

        plane.bytesused = 0;

        int index = 0;
        for (int encode_j = 0; encode_j < plane.fmt.height; encode_j++)
        {
            // Copy one row, then advance the destination by the plane stride (pitch).
            memcpy(data, buf + index, bytes_to_read);
            index += bytes_to_read;
            data += plane.fmt.stride;
        }

        plane.bytesused = plane.fmt.stride * plane.fmt.height;
    }

    encode_mempool_queue->enqueue(encode_buf_data);
    // if (test2){
    //     write_video_frame(ufstream, buffer);
    //     test2=0;
    // }
    return 0;
}

We use the camera_unit_sample example to capture the video data (that part works) and the 15_multivideo_encode example to encode to H.264.
For the write tests in the code above, the data dumped at test1 is OK but the data dumped at test2 is wrong…

Hi,
Orin Nano does not have a hardware encoder, so you would need to copy the frame data from NvBufSurface to a CPU buffer and then send it to the software encoder. Please check read_dmabuf() for reading the frame data from NvBufSurface:

/usr/src/jetson_multimedia_api/samples/common/classes/NvUtils.cpp

And please check

Software Encode in Orin Nano — NVIDIA Jetson Linux Developer Guide documentation
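
For reference, the per-plane copy done by read_dmabuf() boils down to roughly the following sketch (based on the public NvBufSurface API in nvbufsurface.h; the helper name copy_plane_to_cpu is only illustrative):

#include <cstring>
#include "nvbufsurface.h"

// Sketch: copy one plane of an NvBufSurface (identified by its dmabuf fd) into a
// tightly packed CPU buffer, honoring the surface pitch, similar to what
// read_dmabuf()/dump_dmabuf() in NvUtils.cpp do.
static int copy_plane_to_cpu(int dmabuf_fd, unsigned int plane, unsigned char *dst)
{
    NvBufSurface *surf = nullptr;
    if (NvBufSurfaceFromFd(dmabuf_fd, (void **)&surf) != 0)
        return -1;

    if (NvBufSurfaceMap(surf, 0, plane, NVBUF_MAP_READ) != 0)
        return -1;
    NvBufSurfaceSyncForCpu(surf, 0, plane);

    const NvBufSurfacePlaneParams &p = surf->surfaceList[0].planeParams;
    unsigned char *src = (unsigned char *)surf->surfaceList[0].mappedAddr.addr[plane];

    // Copy the payload bytes of each row and skip the pitch padding in the source.
    for (unsigned int row = 0; row < p.height[plane]; ++row)
    {
        memcpy(dst, src, p.width[plane] * p.bytesPerPix[plane]);
        dst += p.width[plane] * p.bytesPerPix[plane];
        src += p.pitch[plane];
    }

    NvBufSurfaceUnMap(surf, 0, plane);
    return 0;
}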

We are on the Orin NX platform. We also tried V4L2_MEMORY_DMABUF for the encoder output_memory_type, but the image is wrong too: the image is spliced.

Hi,
This topic was created in the Orin Nano category. Next time please make sure you create the topic in the correct category.

Argus + hardware encoding is demonstrated in the 10_argus_camera_recording sample. Please check the sample and give it a try.

Because we need to set ISP parameters, and we could not find that setting in the 10_argus_camera_recording sample, we use the camera_unit_sample sample instead; it includes camera_unit_sample_ctrls.cpp, which is what we need. We can successfully capture and save NV12M data, but the encoded image is wrong.

Hi,
It looks like the pitch of the NvBufSurface is not properly taken into account in the application. Please check whether the saved NV12M data is valid in a YUV viewer.
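
For a quick check, a pitch-aware dump like the sketch below (essentially what write_video_frame() in NvUtils.cpp does) produces packed NV12M that a YUV viewer can display at width x height; the helper name dump_nvbuffer_planes is only illustrative:

#include <cstdint>
#include <fstream>
#include "NvBuffer.h"

// Sketch: write each NvBuffer plane without the stride padding, so the output
// file is packed NV12M data that a YUV viewer can open directly.
static void dump_nvbuffer_planes(std::ofstream &out, NvBuffer &buffer)
{
    for (uint32_t i = 0; i < buffer.n_planes; ++i)
    {
        NvBuffer::NvBufferPlane &plane = buffer.planes[i];
        const std::streamsize row_bytes = plane.fmt.bytesperpixel * plane.fmt.width;
        char *data = (char *)plane.data;

        for (uint32_t row = 0; row < plane.fmt.height; ++row)
        {
            out.write(data, row_bytes);   // payload only, no pitch padding
            data += plane.fmt.stride;     // jump to the next row in the buffer
        }
    }
}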

The fix was to move int index = 0; so it is declared before the first for loop (the per-plane loop). It works now.
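
In other words, with the source block holding contiguous NV12 (the Y plane followed by the UV plane), index must be declared once before the plane loop so the read offset keeps advancing into the UV data instead of restarting at the Y data for every plane. A sketch of the fixed loop:

    // Fixed copy loop: 'index' lives outside the plane loop so the offset into the
    // contiguous NV12 source accumulates across planes.
    int index = 0;
    for (int encode_i = 0; encode_i < buffer.n_planes; encode_i++)
    {
        NvBuffer::NvBufferPlane &plane = buffer.planes[encode_i];
        std::streamsize bytes_to_read = plane.fmt.bytesperpixel * plane.fmt.width;
        char *data = (char *)plane.data;

        for (int encode_j = 0; encode_j < plane.fmt.height; encode_j++)
        {
            memcpy(data, buf + index, bytes_to_read);
            index += bytes_to_read;       // keeps growing into the UV plane
            data += plane.fmt.stride;     // destination rows are pitch-aligned
        }
        plane.bytesused = plane.fmt.stride * plane.fmt.height;
    }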

