Hi everyone,
I have used the following pipeline in a MediaPlayer object in QML. It works fine and puts the video output in a newly created window.
Rectangle
{
    MediaPlayer
    {
        id: cameraVideo
        autoPlay: true
        source: "gst-pipeline: uridecodebin3 uri=\"rtspt://192.168.1.118:8080/h264_ulaw.sdp\"" +
                " ! queue ! nvstreammux0.sink_0 nvstreammux name=nvstreammux0 batch-size=1" +
                " batched-push-timeout=40000 width=800 height=600 live-source=TRUE" +
                " ! queue ! nvvideoconvert ! queue !" +
                " nvinfer config-file-path=\"/opt/nvidia/deepstream/deepstream/samples/configs/deepstream-app/config_infer_primary_nano.txt\"" +
                " ! queue ! nvmultistreamtiler ! queue !" +
                " nvtracker tracker-width=240 tracker-height=200" +
                " ll-lib-file=/opt/nvidia/deepstream/deepstream/lib/libnvds_mot_iou.so" +
                " ll-config-file=/opt/nvidia/deepstream/deepstream/samples/configs/deepstream-app/iou_config.txt" +
                " ! queue ! nvdsosd process-mode=HW_MODE ! queue" +
                " ! nvegltransform ! nveglglessink"
    }
    VideoOutput
    {
        anchors.fill: parent
        source: cameraVideo
    }
}
But I want to put the video output into a specific QML Rectangle, which is the parent of the MediaPlayer object. It seems that I must use qtvideosink instead of nveglglessink. Which plugins should I use to convert the video stream into something qtvideosink can accept?
By the way, the app is running on an NVIDIA Jetson Nano.
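In case it makes the question clearer, this is roughly the tail of the pipeline I have in mind. The nvvideoconvert step with a system-memory caps filter is only my guess at what qtvideosink needs (as far as I know it consumes CPU-accessible buffers, unlike the NVMM-based EGL sinks); I have not verified the caps format:

```qml
// Sketch only -- the pipeline up to nvdsosd stays exactly as in the
// working version above; only the sink end changes.
// The nvvideoconvert + video/x-raw caps step is my assumption for
// copying NVMM buffers out to system memory for qtvideosink.
source: "gst-pipeline: ... ! nvdsosd process-mode=HW_MODE ! queue" +
        " ! nvvideoconvert ! video/x-raw,format=RGBA" +  // NVMM -> system memory (assumption)
        " ! qtvideosink"
```

Is a plain nvvideoconvert enough here, or is another conversion element needed in between?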