DeepStream, Python USB cam sample hangs

OK, this does not work either, at least not fully. I can see the inference engine starting. The input is paused, but it never starts playing after that.

But a short-circuited pipeline, which just captures and displays, surprisingly works:

gst-launch-1.0 v4l2src device=/dev/video0 ! "image/jpeg,width=1280,height=720,framerate=30/1" ! jpegdec ! videoconvert ! xvimagesink

Please advise what else I could try to make it run with the inference engine.

Thanks

It works on my board. Please check your own platform.

Hmm. What do you mean?

I cannot debug this for you because I cannot reproduce the failure on my board.

OK, I have something running, great.

gst-launch-1.0 v4l2src device=/dev/video0 ! "image/jpeg,width=640,height=480,framerate=30/1" ! jpegdec ! videoconvert ! nvvideoconvert ! "video/x-raw(memory:NVMM),format=NV12" ! m.sink_0 nvstreammux name=m batch-size=1 width=640 height=480 ! nvinfer config-file-path=/opt/nvidia/deepstream/deepstream-5.0/samples/configs/deepstream-app/config_infer_primary.txt ! nvegltransform bufapi-version=true ! nveglglessink qos=false async=false sync=false

There is a window on screen now and it contains live video. I changed the output stage as you can see (does that make sense?), and I also set network-mode to 2 in the config in order to speed up the start (I suppose the Nano can’t do INT8, right?).

Since there is (understandably) no inference result overlay on the display, is there something (e.g. a debug switch or similar) which would allow me to see whether the inference works?

Otherwise I will try to adapt the sample code to work with this pipeline.

Thanks so much for your help, that looks good.
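
For reference, one way to see whether inference is actually producing results, even without an on-screen overlay, is a buffer probe that prints per-frame object counts, similar to the probe in deepstream_test_1_usb.py. A minimal sketch, assuming the pyds bindings are installed and the probe is attached somewhere downstream of nvinfer (the element/pad names in the trailing comment are only examples):

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst
import pyds

def infer_probe(pad, info, u_data):
    # Print the number of detected objects for every frame that passes by.
    gst_buffer = info.get_buffer()
    if not gst_buffer:
        return Gst.PadProbeReturn.OK
    batch_meta = pyds.gst_buffer_get_nvds_batch_meta(hash(gst_buffer))
    l_frame = batch_meta.frame_meta_list
    while l_frame is not None:
        frame_meta = pyds.NvDsFrameMeta.cast(l_frame.data)
        print("frame", frame_meta.frame_num, "objects:", frame_meta.num_obj_meta)
        try:
            l_frame = l_frame.next
        except StopIteration:
            break
    return Gst.PadProbeReturn.OK

# Attach to a pad downstream of nvinfer, e.g. the sink pad of the video sink:
# sink.get_static_pad("sink").add_probe(Gst.PadProbeType.BUFFER, infer_probe, 0)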

@foreverneilyoung This is just a simplified pipeline for debugging the camera, so nvdsosd is not added. Please add the necessary plugins according to deepstream_test_1_usb.py. Just take some time to compare the pipeline with the code.
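
For comparison, the display tail of deepstream_test_1_usb.py on Jetson is roughly nvinfer -> nvvideoconvert -> nvdsosd -> nvegltransform -> nveglglessink. A rough sketch of that wiring (the variable names here are illustrative, not necessarily the ones used in the sample):

nvvidconv = Gst.ElementFactory.make("nvvideoconvert", "convertor")
nvosd = Gst.ElementFactory.make("nvdsosd", "onscreendisplay")
transform = Gst.ElementFactory.make("nvegltransform", "egl-transform")  # Jetson only
sink = Gst.ElementFactory.make("nveglglessink", "nvvideo-renderer")

for element in (nvvidconv, nvosd, transform, sink):
    pipeline.add(element)

pgie.link(nvvidconv)   # pgie is the nvinfer element
nvvidconv.link(nvosd)
nvosd.link(transform)
transform.link(sink)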

There is no problem with the camera or the DeepStream sample code. Please read the introduction to the DeepStream samples and plugins to understand the DeepStream pipeline: Python Sample Apps and Bindings Source Details — DeepStream 6.1.1 Release documentation and GStreamer Plugin Overview — DeepStream 6.1.1 Release documentation

This is the DeepStream forum, let’s focus on DeepStream here.

Sure, but no input, no DeepStream. One depends on the other.

Thanks

I have it. I took the changed caps_v4l2src, created a jpegdec pipeline element and inserted it between caps_v4l2src and vidconvsrc.

Works. I think this way is even better from the standpoint of USB bus load…
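
Roughly, the change described above amounts to something like this in deepstream_test_1_usb.py (a sketch; the caps and the surrounding element names follow the sample, while the decoder name "jpeg-decoder" is made up):

# Request MJPEG from the camera instead of raw video.
caps_v4l2src.set_property("caps",
    Gst.Caps.from_string("image/jpeg, width=640, height=480, framerate=30/1"))

# New element: decode the MJPEG stream back to raw video.
jpegdec = Gst.ElementFactory.make("jpegdec", "jpeg-decoder")
pipeline.add(jpegdec)

# Link caps_v4l2src -> jpegdec -> vidconvsrc instead of caps_v4l2src -> vidconvsrc.
caps_v4l2src.link(jpegdec)
jpegdec.link(vidconvsrc)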

Thanks for the inspiration and ideas :) Great job, well done!

Have a nice day

@Fiona.Chen Hi, would you mind helping me again? I now have a set of working cameras, but I’m trying to extend my setup.

I have at least two cameras here (Logitech and ELP) which work fine with the Nano and this pipeline:

gst-launch-1.0 v4l2src device=/dev/video0 ! "image/jpeg,width=640,height=480,framerate=30/1" ! jpegdec ! videoconvert ! autovideosink

However, if I change it so that it goes through the NVIDIA components, it mostly fails with a “data stream error”:

gst-launch-1.0 v4l2src device=/dev/video0 ! "image/jpeg,width=640,height=480,framerate=30/1" ! jpegdec ! videoconvert ! nvvideoconvert ! "video/x-raw(memory:NVMM)" ! m.sink_0 nvstreammux name=m batch-size=1 width=640 height=480 ! nvegltransform bufapi-version=true ! nveglglessink qos=false async=false sync=false

What options do I have to investigate this issue?

GST_DEBUG=4 delivers this

Setting pipeline to PAUSED ...

Using winsys: x11 
Pipeline is live and does not need PREROLL ...
0:00:00.176799378 12859   0x559b556190 FIXME           videodecoder gstvideodecoder.c:933:gst_video_decoder_drain_out:<jpegdec0> Sub-class should implement drain()
0:00:00.176945267 12859   0x559ba84c00 WARN               structure gststructure.c:1832:priv_gst_structure_append_to_gstring: No value transform to serialize field 'display' of type 'GstEGLDisplay'
Got context from element 'eglglessink0': gst.egl.EGLDisplay=context, display=(GstEGLDisplay)NULL;
Setting pipeline to PLAYING ...
0:00:00.178079354 12859   0x559ba84c00 WARN                 v4l2src gstv4l2src.c:692:gst_v4l2src_query:<v4l2src0> Can't give latency since framerate isn't fixated !
New clock: GstSystemClock
0:00:00.182249550 12859   0x559b556190 WARN          v4l2bufferpool gstv4l2bufferpool.c:790:gst_v4l2_buffer_pool_start:<v4l2src0:pool:src> Uncertain or not enough buffers, enabling copy threshold
0:00:00.421474475 12859   0x559b556190 FIXME           videodecoder gstvideodecoder.c:933:gst_video_decoder_drain_out:<jpegdec0> Sub-class should implement drain()
0:00:00.425660140 12859   0x559b556140 ERROR            egladaption gstegladaptation.c:659:gst_egl_adaptation_choose_config:<eglglessink0> Could not find matching framebuffer config
0:00:00.425713631 12859   0x559b556140 ERROR            egladaption gstegladaptation.c:672:gst_egl_adaptation_choose_config:<eglglessink0> Couldn't choose an usable config
0:00:00.425732433 12859   0x559b556140 ERROR          nveglglessink gsteglglessink.c:2707:gst_eglglessink_configure_caps:<eglglessink0> Couldn't choose EGL config
0:00:00.425748059 12859   0x559b556140 ERROR          nveglglessink gsteglglessink.c:2767:gst_eglglessink_configure_caps:<eglglessink0> Configuring caps failed
0:00:00.425847853 12859   0x559b556190 ERROR          nveglglessink gsteglglessink.c:2812:gst_eglglessink_setcaps:<eglglessink0> Failed to configure caps
0:00:00.425965512 12859   0x559b556190 ERROR          nveglglessink gsteglglessink.c:2812:gst_eglglessink_setcaps:<eglglessink0> Failed to configure caps
0:00:00.426013273 12859   0x559b556190 WARN                GST_PADS gstpad.c:4226:gst_pad_peer_query:<nvegltransform0:src> could not send sticky events
0:00:00.427376584 12859   0x7f74003850 ERROR          nveglglessink gsteglglessink.c:2812:gst_eglglessink_setcaps:<eglglessink0> Failed to configure caps
0:00:00.431781421 12859   0x7f74003850 ERROR          nveglglessink gsteglglessink.c:2812:gst_eglglessink_setcaps:<eglglessink0> Failed to configure caps
0:00:00.431857361 12859   0x7f74003850 ERROR          nveglglessink gsteglglessink.c:2812:gst_eglglessink_setcaps:<eglglessink0> Failed to configure caps
0:00:00.431901632 12859   0x7f74003850 ERROR          nveglglessink gsteglglessink.c:2812:gst_eglglessink_setcaps:<eglglessink0> Failed to configure caps
0:00:00.460383460 12859   0x7f74003850 ERROR          nveglglessink gsteglglessink.c:2812:gst_eglglessink_setcaps:<eglglessink0> Failed to configure caps
0:00:00.460452264 12859   0x7f74003850 ERROR          nveglglessink gsteglglessink.c:2812:gst_eglglessink_setcaps:<eglglessink0> Failed to configure caps
0:00:00.460490598 12859   0x7f74003850 ERROR          nveglglessink gsteglglessink.c:2812:gst_eglglessink_setcaps:<eglglessink0> Failed to configure caps
0:00:00.460692686 12859   0x7f74003850 ERROR         nvegltransform gstnvegltransform.c:343:gst_nvegl_transform_transform:<nvegltransform0> Something is wrong, EGLImage is expected.
0:00:00.466112649 12859   0x559b556190 WARN                 basesrc gstbasesrc.c:3055:gst_base_src_loop:<v4l2src0> error: Internal data stream error.
0:00:00.466147025 12859   0x559b556190 WARN                 basesrc gstbasesrc.c:3055:gst_base_src_loop:<v4l2src0> error: streaming stopped, reason error (-5)
0:00:00.466248277 12859   0x7f74003850 ERROR         nvegltransform gstnvegltransform.c:343:gst_nvegl_transform_transform:<nvegltransform0> Something is wrong, EGLImage is expected.
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data stream error.
Additional debug info:
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming stopped, reason error (-5)
Execution ended after 0:00:00.288088973
Setting pipeline to PAUSED ...
0:00:00.466460313 12859   0x7f74003850 ERROR          nveglglessink gsteglglessink.c:2812:gst_eglglessink_setcaps:<eglglessink0> Failed to configure caps
0:00:00.466518231 12859   0x7f74003850 ERROR          nveglglessink gsteglglessink.c:2812:gst_eglglessink_setcaps:<eglglessink0> Failed to configure caps
0:00:00.466573232 12859   0x7f74003850 ERROR          nveglglessink gsteglglessink.c:2812:gst_eglglessink_setcaps:<eglglessink0> Failed to configure caps
0:00:00.466637504 12859   0x7f74003850 ERROR          nveglglessink gsteglglessink.c:2812:gst_eglglessink_setcaps:<eglglessink0> Failed to configure caps
Setting pipeline to READY ...
0:00:00.466771726 12859   0x7f74003850 ERROR          nveglglessink gsteglglessink.c:2812:gst_eglglessink_setcaps:<eglglessink0> Failed to configure caps
Setting pipeline to NULL ...
Freeing pipeline ...

I forgot to mention that I also tried with “! video/x-raw(memory:NVMM), format=NV12” after nvvideoconvert, to no avail.

Do you know which color format your camera uses inside MJPEG? MJPEG supports many color formats, but nvvideoconvert may not support some of them. You need to confirm the format with your vendor first.

It is a Logitech C920. Do you think this is such an uncommon device? Is there a way to determine this?

Does this tell you something?

Test results on a Logitech C920

Baseline Test Parameters

  • OBS set to recording mode
  • USB 3.1 Gen 2 via Unitek 4-port USB Hub
  • OBS Base 1920x1080
  • OBS Output 1920x1080
  • C920 Webcam settings
  • Video Format: MJPEG
  • YUV Color Space 709
  • YUV Color Range Full
  • Buffering Auto Detect
  • Audio Output Mode Capture audio only

Any comment?

The video format and color space you listed are not the YUV pixel format. Please refer to your camera vendor for the information.
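
For what it’s worth, the pixel formats a UVC camera advertises can also be listed locally. A small sketch, assuming the v4l-utils package (which provides v4l2-ctl) is installed and the camera is /dev/video0:

import subprocess

# Print every pixel format, resolution and frame rate the camera advertises.
subprocess.run(["v4l2-ctl", "--device=/dev/video0", "--list-formats-ext"], check=True)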

Ah, you can forget about it. Next day, next chance. All of a sudden it works.

Thanks

And btw: I bet the guys at Logitech would LMAO if I explained to them that their cams don’t deliver YUV.