So, as you can see, the camera gives me UYVY images, which I convert with the VIC into a format the encoder can read.
But I'm struggling to transfer the data to the encoder.
I'm using V4L2 (and I don't intend to use GStreamer, due to other constraints). In my flow I use NvBufferTransform() to change the color format, but then I struggle to send the data to the encoder, and there is no example that does exactly that.
The buffer between the camera and the VIC, and the one between the VIC and NVENC, are DMA buffers, to get the best performance.
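For reference, the conversion step currently looks roughly like this (a minimal sketch, assuming NV12 pitch-linear as the encoder input format; the function names and buffer creation parameters are mine, not taken from a sample):

```cpp
// Sketch only: create an NV12 destination DMA buffer and let the VIC convert
// the camera's UYVY frame into it. Width/height and fd names are placeholders.
#include <cstring>
#include <nvbuf_utils.h>

int create_nv12_dmabuf(int width, int height, int *dst_fd)
{
    NvBufferCreateParams params;
    memset(&params, 0, sizeof(params));
    params.width = width;
    params.height = height;
    params.payloadType = NvBufferPayload_SurfArray;
    params.layout = NvBufferLayout_Pitch;           // pitch-linear for the encoder
    params.colorFormat = NvBufferColorFormat_NV12;  // assumed encoder input format
    params.nvbuf_tag = NvBufferTag_NONE;
    return NvBufferCreateEx(dst_fd, &params);
}

int convert_uyvy_to_nv12(int cam_dmabuf_fd, int dst_dmabuf_fd)
{
    NvBufferTransformParams tp;
    memset(&tp, 0, sizeof(tp));
    tp.transform_flag = NVBUFFER_TRANSFORM_FILTER;
    tp.transform_filter = NvBufferTransform_Filter_Smart;
    // The VIC performs the UYVY -> NV12 conversion between the two DMA buffers.
    return NvBufferTransform(cam_dmabuf_fd, dst_dmabuf_fd, &tp);
}
```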
That example uses libargus, while I'm using V4L2. I had already seen it (like the other examples!), so I don't really understand why you pointed me to it.
To be more precise about what I'm trying to do: I just want to send the output DMA buffer from the VIC to the input of NVENC.
I thought it was just an FD to pass along... but obviously it seems to be more complicated. Do I need to set up multiple DMA buffers for the output of the VIC?
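From what I can tell from the samples, when the encoder's output plane is set up with V4L2_MEMORY_DMABUF you hand it one DMA buffer per slot, so the VIC needs a small pool of destination buffers rather than a single one. A hedged sketch of what I mean (the buffer count and names are my own assumptions):

```cpp
// Hedged sketch: set up the NVENC output plane for DMABUF memory and create one
// NV12 destination buffer per slot. setOutputPlaneFormat()/setCapturePlaneFormat()
// are assumed to have been called already, as in 01_video_encode.
#include "NvVideoEncoder.h"
#include <nvbuf_utils.h>

#define NUM_ENC_BUFFERS 6

int create_nv12_dmabuf(int width, int height, int *dst_fd); // from the earlier sketch

int setup_encoder_output_plane(NvVideoEncoder *enc, int width, int height,
                               int vic_out_fd[NUM_ENC_BUFFERS])
{
    // Tell the encoder that its output (uncompressed) plane will receive
    // buffers by fd instead of mapped/allocated memory.
    if (enc->output_plane.setupPlane(V4L2_MEMORY_DMABUF, NUM_ENC_BUFFERS,
                                     false, false) < 0)
        return -1;

    // One DMA buffer per output-plane slot, all filled by the VIC in turn.
    for (int i = 0; i < NUM_ENC_BUFFERS; i++)
    {
        if (create_nv12_dmabuf(width, height, &vic_out_fd[i]) < 0)
            return -1;
    }
    return 0;
}
```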
I've made some modifications without a proper result. Following the examples I am able to encode from a DMA buffer, but I'm clearly getting the DMA buffer handoff between the VIC and NVENC wrong.
Apparently the "output plane" of NVENC is a circular buffer, whereas the VIC just pushes its data out to a single memory area (using NvBufferTransform()).
I think the crux of my struggle is "only" to replace the read_video_file function from example 01.
But I really don't know how to handle this.
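My rough idea of the loop that would take the place of the read_video_file call is something like this (pure sketch, reusing the names from the snippets above, error handling stripped; the first few frames would still have to be queued directly by index before any dequeue, as example 01 does):

```cpp
// Rough sketch of the per-frame feed loop replacing the file-read step:
// dequeue a free output-plane slot, let the VIC fill that slot's DMA buffer from
// the camera frame, then queue the fd back to NVENC.
#include <cstring>
#include <nvbuf_utils.h>
#include "NvVideoEncoder.h"

static int feed_one_frame(NvVideoEncoder *enc, int cam_dmabuf_fd, int *vic_out_fd)
{
    struct v4l2_buffer v4l2_buf;
    struct v4l2_plane planes[MAX_PLANES];
    NvBuffer *nvbuf = NULL;   // the NvBuffer pointer itself is not needed here

    memset(&v4l2_buf, 0, sizeof(v4l2_buf));
    memset(planes, 0, sizeof(planes));
    v4l2_buf.m.planes = planes;

    // Get back a slot the encoder has finished reading.
    if (enc->output_plane.dqBuffer(v4l2_buf, &nvbuf, NULL, 10) < 0)
        return -1;

    int dst_fd = vic_out_fd[v4l2_buf.index];

    // VIC: convert the captured UYVY frame into this slot's NV12 buffer.
    NvBufferTransformParams tp;
    memset(&tp, 0, sizeof(tp));
    tp.transform_flag = NVBUFFER_TRANSFORM_FILTER;
    tp.transform_filter = NvBufferTransform_Filter_Smart;
    if (NvBufferTransform(cam_dmabuf_fd, dst_fd, &tp) < 0)
        return -1;

    // Hand the converted buffer to NVENC by fd; bytesused must be non-zero,
    // otherwise the encoder interprets the buffer as end-of-stream.
    v4l2_buf.m.planes[0].m.fd = dst_fd;
    v4l2_buf.m.planes[0].bytesused = 1;
    return enc->output_plane.qBuffer(v4l2_buf, NULL);
}
```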
I have read all the examples and docs related to the Jetson Linux API.
I will try to integrate this into my code to see if everything is OK, because there is a small problem: the patch seems to freeze a few moments after launching the app. The application keeps running, but the rendering is frozen.
I suspect something related to the JetPack version, but I'm still not sure. (I'm working on 4.4.1, with the camera on the MIPI CSI interface.)
OK, I've solved the problem with the patch that adds the encoder to the 12_camera_v4l2_cuda sample by adding a "printf" at the beginning of the "if (fds[0].revents & POLLIN) {" block.
There might be a synchronization problem behind this, so it is probably not the best solution.
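If it really is a sync issue, the only candidate I can point at is the cache sync that 12_camera_v4l2_cuda performs on the captured buffer before handing it to the VIC; a reminder sketch with my own variable names, in case skipping that call explains the timing-dependent freeze:

```cpp
// Reminder sketch: 12_camera_v4l2_cuda syncs the CPU-written camera buffer to the
// device before NvBufferTransform(), because the VIC reads it via DMA. 'cam_fd'
// and 'cam_mapped_ptr' are placeholders for the capture buffer's fd and mapping.
#include <nvbuf_utils.h>

int sync_camera_buffer_for_vic(int cam_fd, void **cam_mapped_ptr)
{
    // Flush CPU caches for plane 0 so the VIC sees the freshly captured pixels.
    return NvBufferMemSyncForDevice(cam_fd, 0, cam_mapped_ptr);
}
```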
I have adapted the code to my use case, but I'm still getting an error.
I can't get any encoded file because it catches a nullptr BEFORE anything happens (so before it even reads the fd). I don't know what else I can do... I've tried several things...
I can confirm that the fd is transferred...
What is the shared buffer? Do I have to use it?
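For reference, the capture-plane dequeue callback I am using follows the shape of the one in 01_video_encode; sketched below with my simplifications, and with my (unconfirmed) understanding of the shared_buffer argument in the comments:

```cpp
// Sketch of the capture-plane dequeue callback, following 01_video_encode.
// As far as I understand, 'shared_buffer' is only non-NULL when the plane was fed
// with an NvBuffer owned by another element via qBuffer(); when buffers are passed
// by plain dmabuf fd it stays NULL and can be ignored.
#include <fstream>
#include "NvVideoEncoder.h"

static bool
encoder_capture_plane_dq_callback(struct v4l2_buffer *v4l2_buf, NvBuffer *buffer,
                                  NvBuffer *shared_buffer, void *arg)
{
    std::ofstream *out = static_cast<std::ofstream *>(arg); // hypothetical user arg

    if (!v4l2_buf)      // dequeue failed before a valid encoded buffer existed
        return false;   // returning false stops the DQ thread

    if (buffer->planes[0].bytesused == 0)   // zero bytes means the encoder hit EOS
        return false;

    // Write the encoded chunk; in the real code the buffer must then be re-queued
    // on the capture plane (enc->capture_plane.qBuffer(*v4l2_buf, NULL)).
    out->write(reinterpret_cast<char *>(buffer->planes[0].data),
               buffer->planes[0].bytesused);
    return true;
}
```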
Now it works fine. But I'm a little worried: I do encode frames, but when I play the result back in VLC, the content is not fluid; some frames seem to be dropped during encoding.
Hi,
Please run 00_video_decode to decode the h264 stream and check whether the stream is valid. If you also observe non-fluid playback there, it is very likely that the source is dropping frames.