Please provide complete information as applicable to your setup.
• Hardware Platform (Jetson / GPU): Jetson AGX Orin
• DeepStream Version: 6.1
• JetPack Version (valid for Jetson only): 5.0.2
• TensorRT Version: 8.4.1
• Issue Type (questions, new requirements, bugs): Question
Is there a way to get the image from the source itself, before nvstreammux? I'm resizing the stream in nvstreammux, and my application needs the original frame.
The other thing you might have to keep in mind is the async parameter in branched pipelines. Here is a toy example:

gst-launch-1.0 -e nvv4l2camerasrc device='/dev/video0' num-buffers=500 ! tee name=t ! queue ! nvvideoconvert ! nvegltransform ! nveglglessink t. ! queue ! fakesink async=false

I put a fakesink there, but you might need a file, preview, or transmit-type sink instead. (Note that each branch of a tee should have its own queue to avoid stalling the pipeline.)
Cheers,
Ganindu.
P.S.
Also, if you are remoting in, the $DISPLAY variable in the environment of the shell you are running should be set to the display that is physically attached to the Orin (otherwise, avoid elements that use libegl). There are some workarounds if you are running headless or your X server is started remotely.
You can refer to our open source example: deepstream_image_meta_test.c. Just add a probe function to the pad of the plugin from which you want to get the image. If your plugin hasn't used NvBufSurface yet, inmap.data is the raw data of the image.