H.264/H.265 encoding using Jetson Nano GPU

Hello,

I would like to use the Jetson Nano to do GPU-based H.264/H.265 encoding.

I saw in another thread that FFmpeg is not supported on the Jetson Nano and that GStreamer should be used instead.

My first question is: will GPU-based FFmpeg be supported on the Jetson Nano in the future?

Second, is it possible to have a concrete example with GStreamer, and the steps to follow, in order to encode a video in H.264/H.265? That would be very helpful!

Assume I have a video called “input.mov” and want to encode it as “output.mp4”…

Thanks in advance for your help,

Regards,

Hi Malik, ffmpeg doesn’t use the Nano’s hardware encoder or decoder; you can run it, but it will be CPU-only.

Recommend looking at the L4T Accelerated GStreamer User Guide for example pipelines for encoding video: https://developer.nvidia.com/embedded/dlc/l4t-accelerated-gstreamer-guide-32-1

It would be nice if ffmpeg with GPU acceleration could be supported, as many people use ffmpeg rather than GStreamer.

You wouldn’t want to use the Nano’s GPU for video compression/decompression, because the Nano has dedicated video encoder and decoder hardware for that (freeing up the GPU). These are also supported through V4L2 in the L4T Multimedia API.
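As an untested sketch of driving that dedicated hardware from GStreamer (the `nvvidconv` and `nvv4l2h264enc` element names assume an L4T 32.x install; on older releases `omxh264enc` plays the same role), encoding a test source might look like:

```shell
# Hypothetical example: encode 300 frames of test video with the
# hardware encoder exposed through V4L2 (nvv4l2h264enc).
gst-launch-1.0 videotestsrc num-buffers=300 ! \
  'video/x-raw, format=(string)I420, width=(int)640, height=(int)480' ! \
  nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)I420' ! \
  nvv4l2h264enc ! h264parse ! qtmux ! filesink location=test_v4l2.mp4 -e
```

The `num-buffers` property just limits the run to a few seconds of test video so the pipeline terminates on its own.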

I can’t link against libnvcuvid.so, so can you confirm that the NVIDIA Video Codec SDK is not supported on the Jetson Nano?

Hi Dusty_nv,

Thanks for your reply and the link to the Accelerated GStreamer User Guide.

There is something I don’t really understand.

Assuming I have a video input.mov and I want an encoded video output.mp4,

how can I modify the following command line with the input and output videos above?

PS: I tried to replace ‘videotestsrc’ with ‘input.mov’ and ‘location=output.mp4’, but it didn’t work out.

gst-launch-1.0 videotestsrc !
'video/x-raw, format=(string)I420, width=(int)640,
height=(int)480' ! omxh264enc !
'video/x-h264, stream-format=(string)byte-stream' ! h264parse !
qtmux ! filesink location=test.mp4 -e

thanks,

That’s correct; Jetson uses a different hardware video encoder/decoder that doesn’t use the GPU/CUDA like nvcuvid does, and it has a different software stack (GStreamer/V4L2).

Can someone please help with a concrete example that I can try,

with input.mov, output.mp4, and

gst-launch-1.0 videotestsrc !
'video/x-raw, format=(string)I420, width=(int)640,
height=(int)480' ! omxh264enc !
'video/x-h264, stream-format=(string)byte-stream' ! h264parse !
qtmux ! filesink location=test.mp4 -e

Thanks in advance for your help,

Hi Malik, have you first been able to run the H.264 encoding example pipelines from the L4T Accelerated GStreamer User Guide? Once you have run those OK, you can modify them for your particular file format.

Typically you can search online for other GStreamer pipelines that use the specific formats you want, as GStreamer has a large user base, and then add the relevant bits to the NVIDIA-accelerated pipelines from the user guide.

Hi dusty_nv,

thanks for your reply.

Yes when I used the following example :

gst-launch-1.0 videotestsrc !
'video/x-raw, format=(string)I420, width=(int)640,
height=(int)480' ! omxh264enc !
'video/x-h264, stream-format=(string)byte-stream' ! h264parse !
qtmux ! filesink location=test.mp4 -e

It worked, but now I want to use a “real” video.

I tried to replace videotestsrc with a real video called “input.mov”, but it didn’t work.

So I wanted to know whether I am using the example correctly, and if not, whether it’s possible to have an example with “input.mov” instead of videotestsrc.

Thanks,
Bilal

Hi Bilal, when playing video files from disk, you need to use a demuxer element to parse the video file. For MOV and MP4 files, you use the qtdemux element.

See this example from the user guide:

gst-launch-1.0 filesrc location=<filename.mp4> ! \
 qtdemux name=demux demux.video_0 ! queue ! h264parse ! omxh264dec ! \
 nveglglessink -e

Hi Dusty_nv,

Thanks for this example, but if I understood correctly, this example is for decoding, whereas I want to do encoding…

Do you think the following can work? Is the syntax correct?

gst-launch-1.0 filesrc location=<input.mov> !
'video/x-raw, format=(string)I420, width=(int)640,
height=(int)480' ! omxh264enc !
'video/x-h264, stream-format=(string)byte-stream' ! h264parse !
qtmux ! filesink location=output.mp4 -e

thanks
Bilal

First you need to decode the MOV file so that you can then re-encode it as MP4, so it would be a combination of those two pipelines. I think something like this may work, although needs testing:

gst-launch-1.0 filesrc location=filename.mov ! \
 qtdemux name=demux demux.video_0 ! queue ! h264parse ! omxh264dec ! \
 omxh264enc ! 'video/x-h264, stream-format=(string)byte-stream' ! \
 h264parse ! qtmux ! filesink location=filename.mp4 -e
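If that pipeline fails to link, one guess (an untested sketch; element names from L4T 32.x) is that the decoder’s NVMM output needs an explicit nvvidconv conversion before re-encoding:

```shell
# Hypothetical variant: omxh264dec outputs NVMM buffers, so insert
# nvvidconv before the encoder in case direct linking fails.
gst-launch-1.0 filesrc location=filename.mov ! \
 qtdemux name=demux demux.video_0 ! queue ! h264parse ! omxh264dec ! \
 nvvidconv ! omxh264enc ! 'video/x-h264, stream-format=(string)byte-stream' ! \
 h264parse ! qtmux ! filesink location=filename.mp4 -e
```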

Also refer to the Video Transcode section of the L4T Accelerated GStreamer User Guide, starting on page 32.

If all you want to do is change the MOV container to MP4 format without re-encoding, that may be easier to do with another tool or pipeline that doesn’t need the hardware codec (because re-encoding may not be needed), but I’m not sure about this.

Can someone tell me why, when I use the Jetson Nano to encode H.264 video with the framerate set to 30, the encoded stream is always 25 fps?

Hi Dusty_nv,
Can you tell me how to change the encode framerate to 30 fps? When I use the Jetson Nano encode samples, the encoder output stream is always 25 fps.

Thanks,
GCE

Replied at
https://devtalk.nvidia.com/default/topic/1051566/jetson-nano/jetson-nano-encode-h264-video-framerate-can-not-change-/post/5337441/#5337441

And can we assume the dedicated HW encoder/decoder is faster than the GPU/CUDA? What if the main purpose of the Nano is this encoding; which one should we use? Is ffmpeg GPU HW support working on the Nano?

Thanks!

I think you may be able to just remux without a decoder and encoder in there. ffmpeg will let you use the copy codec to do this as well. I used to use it on a Pi Zero to remux video streams, and it only used about 10% CPU (and no GPU) for 1080p30 H.264 RTSP streams. Just try:

sudo apt install ffmpeg
ffmpeg -i video.mov -c copy video.mp4

If the video within is H.264 or H.265, this should work. Often there is no need for encoding at all if you’re just changing containers.
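One way to check what is actually inside the container first (an untested sketch; video.mov is a placeholder, and it assumes ffprobe ships alongside ffmpeg):

```shell
# Print the codec of the first video stream (e.g. "h264" or "hevc").
ffprobe -v error -select_streams v:0 \
  -show_entries stream=codec_name \
  -of default=noprint_wrappers=1:nokey=1 video.mov
```

If it prints h264 or hevc, the copy-codec remux above should work without re-encoding.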

In gstreamer I think it would be something like:

gst-launch-1.0 filesrc location=video.mov ! qtdemux ! qtmux ! filesink location=filename.mp4 -e

I just tested that and it seems to work fine on a sample. I think you might also be able to just rename the .mov to .mp4, since the containers are the same internally. @DaneLLL, is this correct?

Edit: You can remux WebM or Matroska files like this:

gst-launch-1.0 filesrc location=big-buck-bunny_trailer.webm ! matroskademux ! qtmux ! filesink location=bbb.mp4 -e

Since VP8 and VP9 are supported by the .mp4 container, you can probably follow that chart to convert quite a bit of material without ever needing to re-encode.

can we assume the dedicated HW encoder/decoder is faster than GPU/CUDA?

It’s dedicated hardware, so whether or not it’s “faster”, it allows you to do other things with the GPU/CPU at the same time. It’s better, for sure. AFAIK ffmpeg decode is supported on the Nano, but not encode, and you have to add these apt repos to your lists:

I see that the Jetson V4L2 interface only supports 8 ROIs per frame. For my application I need a QP map per macroblock, similar to the Video Codec SDK. There we set NV_ENC_RC_PARAMS::qpMapMode to NV_ENC_QP_MAP_DELTA when initializing the rate control, then pass the QP delta map along with the video frame to the encoder within the NV_ENC_PIC_PARAMS struct, where NV_ENC_PIC_PARAMS::qpDeltaMap is an array holding a signed 8-bit QP delta for each macroblock.

How can I access the QP map on the Jetson Nano or AGX?