**• Hardware Platform (Jetson / GPU)** NVIDIA GeForce RTX 3060
**• DeepStream Version** SDK 7.1
**• TensorRT Version** 10.3
**• NVIDIA GPU Driver Version (valid for GPU only)** 560.35.03
import pyds
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

def osd_sink_pad_buffer_probe(pad, info, u_data):
    # Get the GstBuffer
    gst_buffer = info.get_buffer()
    if not gst_buffer:
        return Gst.PadProbeReturn.OK
    # Get batch metadata from the buffer
    batch_meta = pyds.gst_buffer_get_nvds_batch_meta(hash(gst_buffer))
    if not batch_meta:
        return Gst.PadProbeReturn.OK
    # Iterate through the frames in the batch
    l_frame = batch_meta.frame_meta_list
    while l_frame is not None:
        try:
            frame_meta = pyds.NvDsFrameMeta.cast(l_frame.data)
            image_width = frame_meta.source_frame_width
        except StopIteration:
            break
        # Move to the next frame in the batch
        try:
            l_frame = l_frame.next
        except StopIteration:
            break
    return Gst.PadProbeReturn.OK
Here frame_meta.source_frame_width is returning 0. I have set streammux.set_property('width', 1920) and streammux.set_property('height', 1080), and I am calling this probe function after that. Please let me know why this is happening.
Is osd_sink_pad_buffer_probe attached to the sink pad of nvosd? I suggest reading these values before nvmultistreamtiler, because the tiler composites all sources into a single frame, so the per-source values are meaningless after it.
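For reference, a minimal sketch of attaching the probe upstream of the tiler (this assumes a `tiler` variable holding the nvmultistreamtiler instance and the `osd_sink_pad_buffer_probe` function shown above; names are illustrative, not from your pipeline):

```python
# Hypothetical sketch: attach the probe to the tiler's sink pad so that
# frame_meta still carries per-source values set on nvstreammux.
# Assumes "tiler" is the nvmultistreamtiler element created elsewhere.
tiler_sink_pad = tiler.get_static_pad("sink")
if tiler_sink_pad:
    tiler_sink_pad.add_probe(Gst.PadProbeType.BUFFER,
                             osd_sink_pad_buffer_probe, 0)
```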
Thank you. I have attached the probe function to the sink pad of nvmultistreamtiler and it is working perfectly.
Now I am trying to stitch the multiple streams together, since the cameras are adjacent to each other with minor overlap. However, I have hit an issue when feeding the stitched stream into the pipeline: the maximum resolution accepted by DeepStream is lower than the resolution of the stitched stream. How can I overcome this and use the multiple streams as a single stitched stream?
Why do you want to stitch the multiple streams? Is there any benefit for your application?
What are the resolution and FPS of the multiple streams?
Could you elaborate on "i have faced an issue"? And what do you mean by "the maximum resolution accepted by deepstream is lower than the stitched stream"? Could you share the related doc? When inputting a high-resolution source, we usually use nvdspreprocess to define multiple ROIs, then let nvinfer do inference on those ROIs.
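As a rough illustration of that ROI-based approach (the config file names and surrounding pipeline objects here are placeholders, not your actual configuration; the ROIs themselves are defined inside the nvdspreprocess config file):

```python
# Hypothetical sketch: run inference on ROIs of a high-resolution source by
# placing nvdspreprocess before nvinfer and letting nvinfer consume the
# preprocessed tensors attached as metadata. Assumes "pipeline" and
# "streammux" already exist; config file paths are placeholders.
preprocess = Gst.ElementFactory.make("nvdspreprocess", "preprocess")
preprocess.set_property("config-file", "config_preprocess.txt")  # ROIs defined here

pgie = Gst.ElementFactory.make("nvinfer", "primary-inference")
pgie.set_property("config-file-path", "config_infer_primary.txt")
pgie.set_property("input-tensor-meta", True)  # use tensors prepared by nvdspreprocess

pipeline.add(preprocess)
pipeline.add(pgie)
streammux.link(preprocess)
preprocess.link(pgie)
```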
I have used the uridecodebin element. As you can see in the screengrab I uploaded previously, the resolution is 3744x1088 with the H264 codec. The error occurs because the resolution exceeds the resolution supported by DeepStream.
Testing on my RTX 3060 with DS 7.1, I can't reproduce this hardware decoder issue.
1. In terms of encoding and decoding, is there any difference from your test? Could you use my method to reproduce the issue? Thanks!
encode-decode.txt (6.2 KB)
2. Did you use nvv4l2h264enc to encode the stream? If you encode with H265, does the hardware decoding issue remain? As you know, H265 is more suitable for high-definition video encoding.
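As a rough sketch of how an H265 test file could be produced with the hardware encoder for this comparison (the input and output file names are placeholders, and this assumes the nvv4l2h265enc element is available on your setup):

```python
# Hypothetical sketch: transcode a test clip to H265 with the NVIDIA hardware
# encoder, then try decoding the result in the same pipeline that failed
# with H264. File names are placeholders.
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)
pipeline = Gst.parse_launch(
    "uridecodebin uri=file:///path/to/input.mp4 ! nvvideoconvert ! "
    "nvv4l2h265enc ! h265parse ! qtmux ! filesink location=output_h265.mp4"
)
pipeline.set_state(Gst.State.PLAYING)
bus = pipeline.get_bus()
# Block until the transcode finishes or an error is reported
bus.timed_pop_filtered(Gst.CLOCK_TIME_NONE,
                       Gst.MessageType.ERROR | Gst.MessageType.EOS)
pipeline.set_state(Gst.State.NULL)
```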