I would like to use the Jetson Nano to do GPU-based H.264/H.265 encoding.
I saw in another thread that FFmpeg is not supported on the Jetson Nano and that GStreamer should be used instead.
My first question is: will GPU-based FFmpeg be supported on the Jetson Nano in the future?
Second, is it possible to have a concrete example with GStreamer, and the steps to follow, in order to encode a video in H.264/H.265? That would be very helpful!
Assume I have a video called “input.mov” and want to encode it as “output.mp4”…
You wouldn’t want to use the Nano’s GPU for video compression/decompression, because the Nano has dedicated video encoder and decoder hardware that handles it (freeing up the GPU). These are also supported through V4L2 in the L4T Multimedia API.
That’s correct: Jetson uses dedicated hardware video encoder/decoder blocks that don’t use the GPU/CUDA the way nvcuvid does, and they are driven by different software (GStreamer/V4L2).
Hi Malik, have you first been able to run the H.264 encoding example pipelines from the L4T Accelerated GStreamer User Guide? Once you have run those OK, you can modify them for your particular file format.
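For reference, a basic encode pipeline in the style of the user guide looks something like this (a minimal sketch, untested here; the element names vary by L4T release, e.g. omxh264enc on older releases vs. nvv4l2h264enc on newer ones):

```shell
# Encode 300 frames of a test pattern to H.264 and mux into an MP4 file.
# nvvidconv copies the raw frames into NVMM memory for the HW encoder;
# -e sends EOS on Ctrl-C so qtmux can finalize the file.
gst-launch-1.0 -e videotestsrc num-buffers=300 ! \
  'video/x-raw,width=1280,height=720,framerate=30/1' ! \
  nvvidconv ! 'video/x-raw(memory:NVMM),format=NV12' ! \
  nvv4l2h264enc bitrate=8000000 ! h264parse ! \
  qtmux ! filesink location=test.mp4
```

If that runs, you know the hardware encoder path is working before you start swapping in your own sources and sinks.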
Typically you can search online for other GStreamer pipelines that use the specific formats you want, as GStreamer has a large user base, and then add the relevant bits to the NVIDIA-accelerated pipelines from the user guide.
Hi Bilal, when playing video files from disk, you need to use a demuxer element to parse the video file. For mov and mp4’s, you use the qtdemux element.
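For example, a playback pipeline for a .mov containing H.264 might look like this (an untested sketch; nvv4l2decoder is the V4L2 decoder element on recent L4T releases, omxh264dec on older ones):

```shell
# Demux the MOV container, parse the H.264 stream, decode it in HW, and display.
gst-launch-1.0 filesrc location=input.mov ! qtdemux ! \
  h264parse ! nvv4l2decoder ! nvoverlaysink
```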
First you need to decode the MOV file so that you can then re-encode it as MP4, so it would be a combination of those two pipelines. I think something like this may work, although needs testing:
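Something along these lines, assuming the .mov actually contains an H.264 stream (hypothetical and untested; on older L4T the elements would be omxh264dec/omxh264enc instead):

```shell
# Decode the H.264 stream out of the MOV in HW, re-encode it with the HW
# encoder, and mux the result into an MP4 container.
gst-launch-1.0 -e filesrc location=input.mov ! qtdemux ! \
  h264parse ! nvv4l2decoder ! \
  nvv4l2h264enc bitrate=8000000 ! h264parse ! \
  qtmux ! filesink location=output.mp4
```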
Also refer to the Video Transcode section of the L4T Accelerated GStreamer User Guide, starting on page 32.
If all you want to do is change the MOV container to MP4 without re-encoding, that may be easier with another tool or pipeline that doesn’t need the hardware codec (since re-encoding may not be needed), but I’m not sure about this.
Hi Dusty_nv,
Can you tell me how to change the encode framerate to 30 fps? When I use the Jetson Nano encoding samples, the encoder output stream is always 25 fps.
And can we assume the dedicated HW encoder/decoder is faster than the GPU/CUDA? If the main purpose of the Nano is this encoding, which one should we use? Is FFmpeg GPU/HW support working on the Nano?
I think you may be able to just remux without a decoder and encoder in there. ffmpeg will let you use the copy codec to do this as well. I used to use it on a Pi Zero to remux video streams, and it only used about 10% CPU (and no GPU) for 1080p30 H.264 RTSP streams. Just try:
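A sketch of the stream-copy remux (assuming input.mov already contains a codec the MP4 container accepts, such as H.264):

```shell
# Copy the compressed streams into a new MP4 container; no decoding or
# encoding happens, so it is fast and uses neither GPU nor HW codec.
ffmpeg -i input.mov -c copy output.mp4
```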
I just tested that and it seems to work fine on a sample. I think you might also be able to just rename the .mov to .mp4, since the containers are the same internally. @DaneLLL, is this correct?
Edit: you can remux WebM or Matroska like this:
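For example (a sketch; stream copy works as long as the target container supports the source codec):

```shell
# Copy VP8/VP9 (and audio) streams from a WebM/Matroska file into an MP4
# container without re-encoding.
ffmpeg -i input.webm -c copy output.mp4
```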
VP8 and VP9 are supported by the .mp4 container, so you can probably follow that chart to convert quite a bit of content without ever needing to re-encode.
can we assume the dedicated HW encoder/decoder is faster than GPU/CUDA?
It’s dedicated hardware, so whether or not it’s “faster”, it allows you to do other things with the GPU/CPU at the same time. It’s better, for sure. AFAIK ffmpeg decode is supported on the Nano, but not encode, and you have to add these apt repos to your lists:
I see that the Jetson V4L2 interface only supports 8 ROIs per frame. For my application I need a QP map per macroblock, similar to the Video Codec SDK: there you set NV_ENC_RC_PARAMS::qpMapMode to NV_ENC_QP_MAP_DELTA when initializing rate control, then pass a QP delta map along with each video frame in the NV_ENC_PIC_PARAMS struct, where NV_ENC_PIC_PARAMS::qpDeltaMap is an array holding a signed 8-bit QP delta for each macroblock.
How can I access a QP map like that on the Jetson Nano or AGX?