• Hardware Platform (Jetson / GPU)
Jetson NX
• DeepStream Version
5.0.0-1
• JetPack Version (valid for Jetson only)
4.4-b144
Hello. I am trying to feed data through appsrc
in the deepstream-test2 example. I built the following pipeline:
appsrc -> nvvideoconvert -> capsfilter -> streammux -> nvinfer (detector) -> tracker -> nvinfer (car color) -> nvinfer (car manufacturer) -> nvinfer (vehicle type) -> fakesink
Here is how I create the source and the capsfilter:
source = Gst.ElementFactory.make("appsrc", "source")
if not source:
    sys.stderr.write(" Unable to create Source \n")
source.set_property('caps',
    Gst.caps_from_string(','.join([
        'video/x-raw',
        'format=RGBA',
        'framerate=1000/1',
        F'width={IM_W}',
        F'height={IM_H}',
    ])))

caps = Gst.Caps.from_string("video/x-raw(memory:NVMM)")
capsfilter = Gst.ElementFactory.make("capsfilter", "filter")
capsfilter.set_property("caps", caps)
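For reference, the caps string that the `','.join(...)` above builds can be checked in plain Python. The 1280x720 dimensions below are hypothetical stand-ins for `IM_W`/`IM_H`:

```python
# Reconstruct the appsrc caps string built by the join above
# (IM_W/IM_H replaced with example values; adjust to your input).
IM_W, IM_H = 1280, 720  # assumed example dimensions

caps_str = ','.join([
    'video/x-raw',
    'format=RGBA',
    'framerate=1000/1',
    f'width={IM_W}',
    f'height={IM_H}',
])
print(caps_str)
# -> video/x-raw,format=RGBA,framerate=1000/1,width=1280,height=720
```

Note that `framerate=1000/1` only declares a nominal rate to downstream elements; it does not by itself make the pipeline consume frames at that speed.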
Here is my simple test:
pipeline.set_state(Gst.State.PLAYING)
try:
    # loop.run()
    th = threading.Thread(target=loop.run, args=())
    th.start()
except:
    pass

# Prefetch a set of images so we don't decode them inside the test loop
cap = cv2.VideoCapture(sys.argv[1])
imgs = []
for i in range(100):
    ret, im_orig = cap.read()
    if not ret:
        break
    imgs.append(im_orig)

start_time = time.time()
count = 0
while True:
    im_orig = imgs[count % len(imgs)]
    im = cv2.cvtColor(im_orig, cv2.COLOR_BGR2RGBA)
    frame = im.tobytes()
    buf = Gst.Buffer.new_wrapped_full(Gst.MemoryFlags.READONLY, frame, len(frame), 0, None)
    res = source.emit('push-buffer', buf)
    # print(res)
    curr_time = time.time()
    count += 1
    if curr_time - start_time > TEST_TIME:
        break
print(F"N pushed frames: {count}. Test time is {TEST_TIME} seconds")
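One thing the loop above never does is set `buf.pts` or `buf.duration` on the pushed buffers, and untimestamped buffers can behave unpredictably downstream. A minimal sketch of the usual timestamp arithmetic (pure Python, no GStreamer import; `GST_SECOND` stands in for `Gst.SECOND`):

```python
# Sketch of the per-frame timestamping appsrc buffers usually need
# (assigned to buf.pts / buf.duration); pure integer arithmetic.
GST_SECOND = 10**9  # Gst.SECOND: one second, in nanoseconds

def frame_timestamps(frame_index, fps_num, fps_den):
    """Return (pts, duration) in nanoseconds for a given frame index."""
    duration = GST_SECOND * fps_den // fps_num
    pts = frame_index * duration
    return pts, duration

# e.g. frame 3 at 30/1 fps starts roughly 100 ms into the stream
pts, dur = frame_timestamps(3, 30, 1)
print(pts, dur)
```

In the push loop this would become `buf.pts = ...; buf.duration = ...` before `source.emit('push-buffer', buf)`.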
This reports ~4000 frames pushed in 10 seconds, i.e. ~400 FPS. But I also put a counter in sink_pad_buffer_probe,
and it shows that only ~900 frames are actually processed.
Am I doing something wrong, or can DeepStream/GStreamer drop buffers?
Is there a way to ensure that a frame was really processed?
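The pushed-vs-processed gap is what happens whenever a non-blocking producer outruns a bounded consumer. This is not DeepStream code, just a pure-Python illustration of the effect with a bounded queue standing in for the pipeline:

```python
# Illustration only: a fast producer pushing into a bounded queue
# while a slow consumer ("inference") drains it.
import queue
import threading
import time

q = queue.Queue(maxsize=4)   # bounded, like a pipeline's internal queueing
processed = 0
dropped = 0

def consumer():
    global processed
    while True:
        item = q.get()
        if item is None:     # sentinel: stop
            break
        time.sleep(0.001)    # simulate inference slower than the push rate
        processed += 1

t = threading.Thread(target=consumer)
t.start()

pushed = 0
for i in range(200):
    try:
        q.put_nowait(i)      # non-blocking push: raises when the queue is full
        pushed += 1
    except queue.Full:
        dropped += 1         # producer outran the consumer
q.put(None)                  # blocking put for the sentinel
t.join()
print(pushed, processed, dropped)
```

In appsrc terms the analogous knobs are the `block` and `max-bytes` properties: with `block=false`, appsrc emits the `enough-data` signal instead of stalling your push loop, and whatever the pipeline cannot keep up with either backs up in queues or gets discarded, depending on element configuration. Handling `enough-data`/`need-data`, or setting `block=true`, is the usual way to make the producer match the real processing rate.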