Hi,
I’m running a pipeline that processes multiple JPEG images through appsrc. A simple example pipeline:
pipeline = "appsrc name=source is-live=True format=GST_FORMAT_TIME caps=image/jpeg,framerate=0/1 ! jpegdec ! jpegenc ! multifilesink location=./img_%d.jpeg"
I read the 10,000 JPEG frames from disk and push them sequentially as buffers (buffered_image) using .emit("push-buffer", buffered_image).
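Here is a simplified version of what I do (the frame file paths are illustrative; the rest matches my setup):

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

pipeline = Gst.parse_launch(
    "appsrc name=source is-live=True format=GST_FORMAT_TIME "
    "caps=image/jpeg,framerate=0/1 "
    "! jpegdec ! jpegenc ! multifilesink location=./img_%d.jpeg"
)
appsrc = pipeline.get_by_name("source")
pipeline.set_state(Gst.State.PLAYING)

for i in range(10000):
    # Read one encoded JPEG from disk and wrap it in a Gst.Buffer
    with open(f"./frames/{i}.jpeg", "rb") as f:
        buffered_image = Gst.Buffer.new_wrapped(f.read())
    ret = appsrc.emit("push-buffer", buffered_image)
    if ret != Gst.FlowReturn.OK:
        print("push-buffer returned", ret)
        break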
However, the pipeline does not start right away; I have to wait for some time before it begins processing.
More importantly, the buffered frames only get processed, all together, when I push a frame from a different scene. Say I push 10 similar frames (‘A1’, ‘A2’, …, ‘A10’) and then one frame (‘B1’) that is totally different from the previous 10; the difference here is the scene of each frame (the ‘A’ scene is a city full of cars, the ‘B’ scene is a mountain with many trees). Nothing happens until ‘B1’ arrives, at which point all 10 ‘A’ frames run through the pipeline at once.
It only works as expected when I emit the end signal, i.e. .emit("end-of-stream"). Otherwise, the pipeline does not process a sequence of similar images until I send a different one.
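For completeness, this is how I end the stream, and only then does everything drain through:

appsrc.emit("end-of-stream")
bus = pipeline.get_bus()
# Block until the EOS (or an error) reaches the bus, then shut down
bus.timed_pop_filtered(Gst.CLOCK_TIME_NONE,
                       Gst.MessageType.EOS | Gst.MessageType.ERROR)
pipeline.set_state(Gst.State.NULL)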
According to the documentation, the pipeline should process each image frame as it is pushed, because I set is-live=True, which I understood to mean real-time processing of the sequential frames.
I’ve tested various settings, for example giving each buffer a distinct timestamp, but nothing worked.
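For example, inside the push loop above I stamped each buffer like this (30 fps is just the value I tried):

FPS = 30
duration = Gst.util_uint64_scale_int(1, Gst.SECOND, FPS)
buffered_image.pts = i * duration   # monotonically increasing PTS
buffered_image.dts = buffered_image.pts
buffered_image.duration = duration
appsrc.emit("push-buffer", buffered_image)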
The pipeline does not seem to start even when I send 20 frames in which the background and objects change slowly. It only starts processing all the buffered frames once it receives a totally different image.
Any ideas, or a solution for this?
Is this a bug in the emit(…) function?
Any comments will be appreciated. Thanks!
– Sean.
My environment:
- Jetson AGX Orin
- JetPack 5.0.2 (L4T R35.1.0)
- DeepStream 6.1
- Python 3.8