What is the best way to quickly draw information on the original frame and encode it as JPG?

My program is based on the deepstream-app example and runs on a Jetson NX. I now need to add a feature that compresses the original frame to JPG and sends it over the network. Before encoding, I need to draw the object detection boxes, OSD information, region boundary boxes, etc. onto the original frame. In addition, the JPG encoding needs to be as efficient as possible. How should this feature be added, and can you recommend a general implementation?

It can be summed up in three steps:

  1. Retrieve the original frame from a data structure at some stage of the DeepStream pipeline.
    ----> ( But at which stage? In which element? And in what way? )
  2. After getting the original frame, draw the information I need on it.
    ----> ( Using OpenCV, or something else? )
  3. Encode the whole annotated frame to JPG. (The purpose of this step is to reduce the amount of data sent over the network.)
    ----> ( What is the fastest way to encode? Is it possible to retrieve the encoded JPEG stream directly from DeepStream? )

My running environment information:
• Hardware Platform (Jetson NX)
• DeepStream Version : 5.0
• JetPack Version : 4.4.1
• Requirement details : As mentioned above

An existing solution is to encode to an H.264/H.265 stream and send it out over RTSP. You may try the default config file and set type=4 in the sink group:
DeepStream Reference Application - deepstream-app — DeepStream 5.1 Release documentation
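For reference, a sink group for the RTSP-out path (type=4) in a deepstream-app config could look like the following; the bitrate and port numbers are placeholders to adapt:

```ini
[sink1]
enable=1
# type=4 selects RTSP streaming output
type=4
# codec: 1=H264, 2=H265 (hardware encoder)
codec=1
bitrate=4000000
# RTSP server port that clients connect to
rtsp-port=8554
# internal UDP port used by the RTSP server
udp-port=5400
sync=0
```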

To achieve 30 fps, we would suggest using this existing implementation. If a low frame rate is acceptable in your use case, you may try customizing create_udpsink_bin() so the pipeline runs like:

... ! nvdsosd ! nvvideoconvert ! video/x-raw,format=I420 ! jpegenc ! rtpjpegpay ! udpsink
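Before modifying create_udpsink_bin(), the tail of that pipeline can be sanity-checked stand-alone with a synthetic source; this is only a test fragment, and the resolution, quality, host, and port are placeholders:

```
# Feed a test pattern through software JPEG encode + RTP payloading;
# replace videotestsrc with your real upstream and adjust host/port.
gst-launch-1.0 videotestsrc num-buffers=100 ! \
  video/x-raw,format=I420,width=1280,height=720 ! \
  jpegenc quality=80 ! rtpjpegpay ! \
  udpsink host=127.0.0.1 port=5000
```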

Hello @DaneLLL @shenshen927,
Is there another way to quickly draw information on the original frame and encode it as JPG?
My goal is to get a JPEG frame after the nvdsosd step, then copy it to the CPU and send it to our data server.
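One common way to do this from the Python bindings is a buffer probe on the src pad of nvdsosd, so the frame already carries the OSD drawing. The sketch below is not runnable outside a DeepStream installation, and it assumes the stream was converted to RGBA upstream (nvvideoconvert plus a capsfilter with format=RGBA), which pyds.get_nvds_buf_surface() requires:

```python
import numpy as np
import cv2
import pyds
from gi.repository import Gst

def osd_src_pad_buffer_probe(pad, info, u_data):
    gst_buffer = info.get_buffer()
    batch_meta = pyds.gst_buffer_get_nvds_batch_meta(hash(gst_buffer))
    l_frame = batch_meta.frame_meta_list
    while l_frame is not None:
        frame_meta = pyds.NvDsFrameMeta.cast(l_frame.data)
        # Map the RGBA frame into a NumPy array (zero-copy view)
        n_frame = pyds.get_nvds_buf_surface(hash(gst_buffer),
                                            frame_meta.batch_id)
        # Copy to CPU memory, then convert RGBA -> BGR for OpenCV
        frame_copy = np.array(n_frame, copy=True, order='C')
        frame_bgr = cv2.cvtColor(frame_copy, cv2.COLOR_RGBA2BGR)
        ok, jpg = cv2.imencode(".jpg", frame_bgr,
                               [int(cv2.IMWRITE_JPEG_QUALITY), 80])
        if ok:
            send_to_dataserver(jpg.tobytes())  # hypothetical network helper
        l_frame = l_frame.next
    return Gst.PadProbeReturn.OK

# Attached once during pipeline setup, e.g.:
#   osd.get_static_pad("src").add_probe(
#       Gst.PadProbeType.BUFFER, osd_src_pad_buffer_probe, 0)
```

send_to_dataserver() is a placeholder for your own transport code. Since this encode runs on the CPU per frame, it is best suited to low frame rates, matching the advice above.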

Hi tienduchoangvn,

Please open a new topic if this is still an issue. Thanks