H.264 encoding/decoding

Hi NVIDIA community,

I want to use the ZED 2i camera as a USB camera, capture frames, encode them with H.264, and send them over a WebSocket to another device on the local network.

I can do this with JPEG encoding, but with H.264 it isn't working, so I would appreciate detailed, step-by-step guidance on how to use H.264 from my Python code.

I use a Jetson AGX Orin, JetPack 5.0.2, ZED SDK 4.0.8, Python 3.8.x, streaming in HD720 at 30 fps. Is any other info needed?

I don't want to use GStreamer at all.
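For reference, the JPEG path that already works looks roughly like the sketch below (a minimal sketch, not my exact code; the camera index, resolution, and WebSocket URL are placeholders):

```python
import asyncio
import cv2
import websockets

async def stream_jpeg():
    # ZED 2i opened as a plain USB/UVC camera (device index is a placeholder)
    cap = cv2.VideoCapture(0)
    cap.set(cv2.CAP_PROP_FRAME_WIDTH, 1280)
    cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 720)

    async with websockets.connect("ws://192.168.1.50:8765") as ws:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            # JPEG-encode on the CPU and push the raw bytes over the WebSocket
            ok, buf = cv2.imencode(".jpg", frame, [cv2.IMWRITE_JPEG_QUALITY, 80])
            if ok:
                await ws.send(buf.tobytes())

asyncio.run(stream_jpeg())
```

I only want to replace the JPEG encoding step with hardware H.264 encoding and keep the rest of the pipeline as it is.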

Hi,
If the camera supports the V4L2 interface, you can try this sample, which captures frame data into NvBufSurface and encodes it to an H.264 stream:

/usr/src/jetson_multimedia_api/samples/12_v4l2_camera_cuda/
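As a quick check before trying the sample, you can verify from Python whether the camera enumerates as a V4L2 device by opening it explicitly through the V4L2 backend (a minimal sketch; the device index is a placeholder):

```python
import cv2

# Try to open the camera explicitly through the V4L2 backend
cap = cv2.VideoCapture(0, cv2.CAP_V4L2)
print("Opens via V4L2:", cap.isOpened())
cap.release()
```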

We are not familiar with streaming through WebSocket. We would need other users to share their experience with streaming out the H.264 stream.

Also, JetPack 5.0.2 is old. We would suggest upgrading to 5.1.3 or 6.0 GA.

Thanks for replying,

  1. Do I have to upgrade JetPack? If yes, which is better, JetPack 5.1 or 6?

  2. Please guide me on how to upgrade JetPack, because this will be my first time doing it.

  3. Is the Multimedia API installed with JetPack, or do I have to install it separately?

  4. If I do local streaming, would using WebRTC be better, and do you have any info related to this?

I think my ZED 2i camera doesn't support V4L2, so I want to use OpenCV in Python to capture frames from the camera and then encode them with H.264. I see this can be done with the hardware-accelerated H.264 encoder; can you also assist me with how to do this?

Please help with each question, because I'm quite confused by all of this.

Hi,
Please have a host PC running Ubuntu 20.04 or 22.04, so that you can install the system image and SDK components through SDK Manager. Please check:

Download and Run SDK Manager — SDK Manager 2.1.0 documentation
Install Jetson Software with SDK Manager — SDK Manager 2.1.0 documentation

If you can capture the frame data into a cv::Mat, you can try cv2.VideoWriter() as in these samples to save to a video file or stream out through UDP:
Displaying to the screen with OpenCV and GStreamer - #9 by DaneLLL
Stream processed video with OpenCV on Jetson TX2 - #5 by DaneLLL
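As a rough sketch of what those posts describe (assuming your OpenCV build has GStreamer support; the host IP, port, camera index, and bitrate are placeholders), BGR frames can be pushed into the hardware H.264 encoder through cv2.VideoWriter():

```python
import cv2

# appsrc pipeline: BGR frames from OpenCV -> nvvidconv -> hardware H.264 encoder -> RTP over UDP
gst_out = (
    "appsrc ! video/x-raw,format=BGR ! queue ! videoconvert ! "
    "video/x-raw,format=BGRx ! nvvidconv ! "
    "nvv4l2h264enc bitrate=4000000 insert-sps-pps=true ! h264parse ! "
    "rtph264pay ! udpsink host=192.168.1.50 port=5000"
)

w, h, fps = 1280, 720, 30
writer = cv2.VideoWriter(gst_out, cv2.CAP_GSTREAMER, 0, float(fps), (w, h))

cap = cv2.VideoCapture(0)  # or frames retrieved through the ZED SDK
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    writer.write(cv2.resize(frame, (w, h)))

writer.release()
cap.release()
```

Swapping the udpsink at the end for a filesink would save the encoded stream to a file instead of streaming it.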

Thank you,
But as I said before, I don't want to use GStreamer. My project is already working perfectly except for the H.264 encoding, so it is difficult to switch to another approach right now. All I need is to encode the stream with the hardware-accelerated H.264 encoder available on the AGX Orin; my concern is how to use it successfully.

Hi,
Besides GStreamer, the other option is jetson_multimedia_api. Please refer to the samples and see how to integrate NvVideoEncoder into your use case:

/usr/src/jetson_multimedia_api/samples/01_video_encode
/usr/src/jetson_multimedia_api/samples/unittest_samples/encoder_unit_sample/
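The samples are C++, but as a quick way to sanity-check the hardware encoder from Python you could dump raw I420 frames to a file and run the built 01_video_encode binary on it (a minimal sketch, assuming the sample has been built in place and follows its usual in-file/width/height/H264/out-file invocation; the camera index and frame count are placeholders):

```python
import subprocess
import cv2

w, h, n_frames = 1280, 720, 90
cap = cv2.VideoCapture(0)  # camera index is a placeholder

# Write raw I420 (planar YUV 4:2:0) frames, the format the encoder sample expects
with open("input.yuv", "wb") as f:
    for _ in range(n_frames):
        ok, frame = cap.read()
        if not ok:
            break
        frame = cv2.resize(frame, (w, h))
        f.write(cv2.cvtColor(frame, cv2.COLOR_BGR2YUV_I420).tobytes())
cap.release()

# Hardware-encode the raw frames into an H.264 elementary stream
subprocess.run(
    ["/usr/src/jetson_multimedia_api/samples/01_video_encode/video_encode",
     "input.yuv", str(w), str(h), "H264", "output.h264"],
    check=True,
)
```

This only demonstrates offline encoding; for low-latency live streaming you would integrate NvVideoEncoder directly in C++, as the samples show.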

Thank you for your reply,
Mr. DaneLLL, I want to check: can I use PyNvCodec to do the encoding on the GPU rather than on the CPU?

Hi,
Does PyNvCodec mean this:
pynvcodec · PyPI

This should be supported on x86 PCs with a dGPU, and it does not work on Jetson platforms. On Jetson platforms, we support GStreamer and jetson_multimedia_api.
