No output in console/display while running sample Python/C example apps

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU) GPU
• DeepStream Version 6.0
• JetPack Version (valid for Jetson only)
• TensorRT Version 8.1
• NVIDIA GPU Driver Version (valid for GPU only) 470.82
• Issue Type (questions, new requirements, bugs) questions
• How to reproduce the issue? (This is for bugs. Include which sample app is used, the configuration file contents, the command line used, and other details for reproducing.) test1-test4

[property]
gpu-id=0
net-scale-factor=0.0039215697906911373
model-file=/opt/nvidia/deepstream/deepstream-6.0/samples/models/Primary_Detector/resnet10.caffemodel
proto-file=/opt/nvidia/deepstream/deepstream-6.0/samples/models/Primary_Detector/resnet10.prototxt
model-engine-file=/opt/nvidia/deepstream/deepstream-6.0/samples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine
labelfile-path=/opt/nvidia/deepstream/deepstream-6.0/samples/models/Primary_Detector/labels.txt
int8-calib-file=/opt/nvidia/deepstream/deepstream-6.0/samples/models/Primary_Detector/cal_trt.bin
force-implicit-batch-dim=1
batch-size=1
network-mode=1
num-detected-classes=4
interval=0
gie-unique-id=1
output-blob-names=conv2d_bbox;conv2d_cov/Sigmoid
#scaling-filter=0
#scaling-compute-hw=0

[streammux]
gpu-id=0
batch-size=1
batched-push-timeout=40000
width=368
height=640

[class-attrs-all]
pre-cluster-threshold=0.2
eps=0.2
group-threshold=1

[sink0]
enable=0
#Type - 1=FakeSink 2=EglSink 3=File
type=1
sync=0
source-id=0
gpu-id=0

[sink1]
enable=1
type=3
#1=mp4 2=mkv
container=1
#1=h264 2=h265 3=mpeg4
codec=1
sync=0
#iframeinterval=10
bitrate=2000000
output-file=out.mp4
source-id=0
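
Note: the [streammux] and [sink*] groups above are read by deepstream-app, not by the nvinfer plugin; the Python test apps build the pipeline in code and only pass the [property]/[class-attrs-*] file to nvinfer. A minimal sketch of roughly how deepstream-test1.py wires the equivalent settings (values are taken from the sample and may differ in your copy):

import gi
gi.require_version('Gst', '1.0')
from gi.repository import Gst

Gst.init(None)

# Equivalent of the [streammux] group, set as element properties in code
streammux = Gst.ElementFactory.make("nvstreammux", "Stream-muxer")
streammux.set_property('width', 1920)
streammux.set_property('height', 1080)
streammux.set_property('batch-size', 1)
streammux.set_property('batched-push-timeout', 4000000)

# Only the [property] / [class-attrs-all] groups are read from this file
pgie = Gst.ElementFactory.make("nvinfer", "primary-inference")
pgie.set_property('config-file-path', "dstest1_pgie_config.txt")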

• Requirement details (This is for new requirements. Include the module name: which plugin or which sample application, and the function description.) Installed the Docker version of DeepStream 6.0; still the same issue. `deepstream-app -c dstest1_pgie_config.txt file_path` runs successfully but produces no output.

Runtime commands:
h: Print this help
q: Quit

p: Pause
r: Resume

** INFO: <bus_callback:194>: Pipeline ready

** INFO: <bus_callback:180>: Pipeline running

This is for saving the output to a file. If you want to see the output on a display, you need to use type 2. Which GPU are you using?

[sink0]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File
type=2

But I have set FakeSink here:

[sink0]
enable=0
#Type - 1=FakeSink 2=EglSink 3=File
type=1
sync=0
source-id=0
gpu-id=0

I am using an RTX 3080. Why am I not able to see output in the console?

I also tested with type=2, but there was no output.

If you use FakeSink, you cannot see output in the console; that is expected. FakeSink throws away the buffers it receives.
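
For what it's worth, the per-frame console prints in deepstream-test1.py come from a pad probe on the nvdsosd sink pad, upstream of the sink element. A trimmed-down sketch of roughly what that probe does (assuming the pyds bindings are installed):

import pyds
from gi.repository import Gst

def osd_sink_pad_buffer_probe(pad, info, u_data):
    gst_buffer = info.get_buffer()
    if not gst_buffer:
        return Gst.PadProbeReturn.OK
    # Batch metadata attached upstream by nvstreammux/nvinfer
    batch_meta = pyds.gst_buffer_get_nvds_batch_meta(hash(gst_buffer))
    l_frame = batch_meta.frame_meta_list
    while l_frame is not None:
        frame_meta = pyds.NvDsFrameMeta.cast(l_frame.data)
        # This print is what appears in the console
        print("Frame Number={} Number of Objects={}".format(
            frame_meta.frame_num, frame_meta.num_obj_meta))
        try:
            l_frame = l_frame.next
        except StopIteration:
            break
    return Gst.PadProbeReturn.OK

# Attached in the app roughly like:
# osdsinkpad = nvosd.get_static_pad("sink")
# osdsinkpad.add_probe(Gst.PadProbeType.BUFFER, osd_sink_pad_buffer_probe, 0)

If nothing is printed at all, no buffers reached the probe, which usually means the pipeline never started decoding.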

Did you get any error when using EglSink?

If I use EglSink I do not get any errors, but also no output/display. Is the test1 Python app supposed to display outputs or inference results?

Yes.
Where did you run it? From a terminal or from the desktop?

I am using a workstation; from a terminal on it I am running `python3.8 deepstream-test1.py file_path`.
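
(As a side note: nveglglessink, i.e. type 2, can only render when an X/EGL display is reachable. From a plain SSH session DISPLAY is usually unset, so EglSink shows nothing. A quick check, assuming a Linux desktop session:)

import os

# nveglglessink needs a reachable X display; over plain SSH this is unset
display = os.environ.get("DISPLAY")
if display is None:
    print("No DISPLAY set - EglSink will not open a window; "
          "consider type 3 (File) or type 4 (RTSPStreaming) instead.")
else:
    print("Rendering to display", display)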

You cannot see the visualized output from a terminal. You can use type 4 (RTSPStreaming) and another tool, like VLC, to view the output.

Can you provide an example for that? I have tried all the test apps, but none gave me an output display screen. If I run `deepstream-app -c config_path`, it does give an output display.

[sink2]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File 4=RTSPStreaming
type=4
#1=h264 2=h265
codec=1
#encoder type 0=Hardware 1=Software
enc-type=0
sync=0
bitrate=4000000
#H264 Profile - 0=Baseline 2=Main 4=High
#H265 Profile - 0=Main 1=Main10
profile=0

Set the properties below in the case of RTSPStreaming:

rtsp-port=8554
udp-port=5400

You need to use another tool, like VLC, to see the output. When you run the app, you will see the RTSP streaming address. Enter it in VLC under "Open Network Stream", and then you can see the output.
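
As an illustration, deepstream-app prints the RTSP address at startup; with the ports above it typically looks like rtsp://<host>:8554/ds-test, though the mount path can differ. Besides VLC, you can sanity-check the stream programmatically; a minimal sketch using OpenCV (opencv-python assumed installed, URL hypothetical):

import cv2

# Replace with the address deepstream-app prints at startup
url = "rtsp://127.0.0.1:8554/ds-test"

cap = cv2.VideoCapture(url)
if not cap.isOpened():
    raise SystemExit("Could not open RTSP stream - check URL, ports, firewall")

ok, frame = cap.read()
print("Got a frame:", ok, frame.shape if ok else None)
cap.release()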

Solved: I was giving an .mp4 file as input; I changed it to the sample H.264 video.
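
For context on why the container mattered: deepstream-test1.py builds its source chain as filesrc -> h264parse -> nvv4l2decoder, which expects a raw H.264 elementary stream, so an .mp4 container never parses. To feed an .mp4 instead, use uridecodebin (as deepstream-test3 does); a minimal sketch, path hypothetical:

import gi
gi.require_version('Gst', '1.0')
from gi.repository import Gst

Gst.init(None)

# uridecodebin picks the demuxer/parser/decoder automatically
source = Gst.ElementFactory.make("uridecodebin", "uri-decode-bin")
source.set_property("uri", "file:///path/to/video.mp4")
# Link its dynamic src pad to nvstreammux in a "pad-added" handler.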
