Nano YOLO sourced from MJPEG stream?

Using this guide:

https://pjreddie.com/darknet/yolo/

I have set up detection using a webcam via this command:

./darknet detector demo cfg/coco.data cfg/yolov3.cfg yolov3.weights -c0

And it works great.

What I’d like to do is feed the detector from an MJPEG stream such as:

http://10.6.10.180/mjpeg_stream

I’m mostly a newbie. I found a lot of information on GStreamer, but where I get lost is how to get GStreamer to stream into the detector.
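
(For what it’s worth, the guide also shows that the demo can take a video file in place of the webcam, so in principle something like the command below might work if OpenCV can open an HTTP MJPEG URL directly; I haven’t been able to confirm that, which is why I started looking at GStreamer.)

./darknet detector demo cfg/coco.data cfg/yolov3.cfg yolov3.weights http://10.6.10.180/mjpeg_stream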

Your help is greatly appreciated.

Thanks,

DougM

Hi,

It’s recommended to use our DeepStream SDK.
It supports both YOLOv3 and Motion JPEG streams, which exactly meets your requirement.
https://developer.nvidia.com/deepstream-sdk

Thanks.

Ok, that worked surprisingly well. I got DeepStream installed on the Xavier and was able to pipe the stream into it by mucking about a bit in the source8_1080p_dec_infer-resnet_tracker_tiled_display_fp16_tx1.txt config file and pointing it at an RTSP stream:

[source0]
. . .
type=4
uri=rtsp://10.1.1.69/live_stream
. . .

And it does work - it identifies people in the stream, but it is using its default demo engine:

resnet10.caffemodel_b8_fp16.engine

Can you please point me to a guide or help me re-configure it to use a YOLO model (which consists of a .cfg and a .weights file)?

I found this one:

https://docs.nvidia.com/metropolis/deepstream/4.0/Custom_YOLO_Model_in_the_DeepStream_YOLO_App.pdf

but I’m seriously hoping that YOLO support does not involve recompiling the source :-)

Thank you!!

DougM

Hi,

The only requirement is to update the config file.

Have you checked this YOLO sample?
/opt/nvidia/deepstream/deepstream-4.0/sources/objectDetector_Yolo/

You can create a config .txt file for your customized model and run it with deepstream-app directly.
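
For example, the nvinfer config shipped with that sample (config_infer_primary_yoloV3.txt) selects the network with entries roughly like the ones below; pointing these at your own files should be enough (key names are from DeepStream 4.0 and may differ in other releases):

[property]
custom-network-config=yolov3.cfg
model-file=yolov3.weights
labelfile-path=labels.txt
num-detected-classes=80
custom-lib-path=nvdsinfer_custom_impl_Yolo/libnvdsinfer_custom_impl_Yolo.so
parse-bbox-func-name=NvDsInferParseCustomYoloV3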
Thanks.

I tried running the default app:

dougm@xavier:/opt/nvidia/deepstream/deepstream-4.0/sources/objectDetector_Yolo$ deepstream-app -c deepstream_app_config_yoloV3.txt

But I got the following error:

Using winsys: x11
Creating LL OSD context new
0:00:00.592572504 10671 0x7f48002240 ERROR nvinfer gstnvinfer.cpp:511:gst_nvinfer_logger:<primary_gie_classifier> NvDsInferContext[UID 1]:initialize(): Could not open custom lib: /opt/nvidia/deepstream/deepstream-4.0/sources/objectDetector_Yolo/nvdsinfer_custom_impl_Yolo/libnvdsinfer_custom_impl_Yolo.so: cannot open shared object file: No such file or directory
. . .

I checked and the .so file is indeed missing.

Is this an error in the way I invoked the command, or is the file missing?

I did try running make, since there was a Makefile in the nvdsinfer_custom_impl_Yolo directory, but it failed because CUDA_VER was not set.

UPDATE:
I was able to run make by using the following command line:
CUDA_VER=10.0 make

and I was able to get the demo to work. I was also able to get the model to run against my live feed and identify objects properly, so I’m going to call this a win and mark this as the accepted answer.
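
For anyone who lands here later, the full sequence that worked for me (DeepStream 4.0 on the Xavier with CUDA 10.0; adjust CUDA_VER for your install) was roughly:

cd /opt/nvidia/deepstream/deepstream-4.0/sources/objectDetector_Yolo/nvdsinfer_custom_impl_Yolo
CUDA_VER=10.0 make
cd ..
deepstream-app -c deepstream_app_config_yoloV3.txt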

As a side note, I was not able to load our custom .weights file, but that’s for a different thread. The unusual thing I want to note here is that to load a custom .cfg and .weights file, you have to rename them to the names of the original files - yolov3.cfg and yolov3.weights. Otherwise they will not load.

Presumably these names are hard-coded somewhere. Also, you can’t put a path on them in the .txt file; they have to be in the directory you execute from.
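
In other words, the only way I could get the app to even find custom files was to copy them over the stock names in the objectDetector_Yolo directory before running it, something like this (the /path/to/... names are just placeholders for your own files):

cp /path/to/custom.cfg yolov3.cfg
cp /path/to/custom.weights yolov3.weights
deepstream-app -c deepstream_app_config_yoloV3.txt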

Thank you!

DougM