Please provide complete information as applicable to your setup.
(P.S.: I am using DeepStream in Google Colab, and it's working fine)
• Hardware Platform (Jetson / GPU): GPU
• DeepStream Version: 6.0
• JetPack Version (valid for Jetson only)
• TensorRT Version: 8.0.1
• NVIDIA GPU Driver Version (valid for GPU only): 495.44
I want to ask two things:
I want to output RTSP from DeepStream, so I used type=4 (RTSP) in the sink properties. But instead of a link with an IP address, I am getting a localhost link. I want to play the output over the internet, for which I need an IP address. How do I get the IP address?
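For reference, a minimal RTSP sink section for the deepstream-app config file, along the lines of the sample configs shipped with DeepStream 6.0 (the port numbers and bitrate below are illustrative defaults, not values from this thread):

```ini
[sink0]
enable=1
# type=4 selects the RTSP streaming sink
type=4
# codec: 1 = H.264, 2 = H.265
codec=1
bitrate=4000000
# Clients connect on this port, e.g. rtsp://<host>:8554/ds-test
rtsp-port=8554
# Internal UDP port between the encoder and the RTSP server
udp-port=5400
```

With this sink, deepstream-app prints the rtsp:// URL at startup, as in the log below.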
I am using the IP Webcam app on my Android phone to get a live video feed from the phone camera. It gave me a link like "http://:8080". How can I use it as a source in the config file? (I know that I need type=2 and uri=ip-address in the source properties, but it gives an error.)
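One possible cause of that error: the IP Webcam app typically serves the actual video at an endpoint such as /video (MJPEG), while http://<ip>:8080 on its own is only the app's web UI page, which uridecodebin cannot play. A sketch of the [source0] section under that assumption (the address and endpoint must be filled in from what the app shows):

```ini
[source0]
enable=1
# type=2 is a URI source (handled internally by uridecodebin)
type=2
# Assumed endpoint: the IP Webcam app's MJPEG stream, not the web UI root
uri=http://<phone-ip>:8080/video
num-sources=1
gpu-id=0
```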
After running the deepstream-app, I am getting the following messages:
(gst-plugin-scanner:3575): GStreamer-WARNING **: 05:28:24.056: Failed to load plugin '/usr/lib/x86_64-linux-gnu/gstreamer-1.0/deepstream/libnvdsgst_inferserver.so': libtritonserver.so: cannot open shared object file: No such file or directory
(gst-plugin-scanner:3575): GStreamer-WARNING **: 05:28:24.064: Failed to load plugin '/usr/lib/x86_64-linux-gnu/gstreamer-1.0/deepstream/libnvdsgst_udp.so': librivermax.so.0: cannot open shared object file: No such file or directory
*** DeepStream: Launched RTSP Streaming at rtsp://localhost:8554/ds-test ***
ERROR: …/nvdsinfer/nvdsinfer_model_builder.cpp:1484 Deserialize engine failed because file path: /opt/nvidia/deepstream/deepstream-6.0/samples/models/Primary_Detector/resnet10.caffemodel_b30_gpu0_int8.engine open error
0:00:02.601310866 3574 0x562d383332a0 WARN nvinfer gstnvinfer.cpp:635:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1889> [UID = 1]: deserialize engine from file :/opt/nvidia/deepstream/deepstream-6.0/samples/models/Primary_Detector/resnet10.caffemodel_b30_gpu0_int8.engine failed
0:00:02.606681707 3574 0x562d383332a0 WARN nvinfer gstnvinfer.cpp:635:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Warning from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:1996> [UID = 1]: deserialize backend context from engine from file :/opt/nvidia/deepstream/deepstream-6.0/samples/models/Primary_Detector/resnet10.caffemodel_b30_gpu0_int8.engine failed, try rebuild
0:00:02.606706257 3574 0x562d383332a0 INFO nvinfer gstnvinfer.cpp:638:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1914> [UID = 1]: Trying to create engine from model files
… (and so on)
To clarify: the third message above (the line starting and ending with three asterisks) gives a link to localhost. I want to play the stream over the internet, but that link cannot be used to view the stream from another machine. How can I do that?
If you want to play the stream on a different device, you need to replace "localhost" with the IP of the device that is running the deepstream-app, e.g. rtsp://<device-ip>:8554/ds-test. If you want to play the stream on the same device, "localhost" is OK.
As I wrote above, I am using DeepStream in Google Colab, and it has nothing like network settings; it is just a cloud platform for running code. I got the IP address with the following command:
!curl ipecho.net/plain
It returned an IP address and I used that, but the stream still does not play. Please suggest what to do.
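As an illustrative sketch (not from this thread): the address from ipecho.net is the outbound public IP, but a Colab VM generally does not accept inbound connections on it, so the RTSP port will not be reachable from outside. A plain TCP connect is a quick way to verify that; the host and port in the commented example are assumptions.

```python
import socket

def port_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Try a plain TCP connect; True means something is listening and reachable."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Hypothetical usage with the public IP returned by `curl ipecho.net/plain`
# and the RTSP port from the DeepStream log:
# port_reachable("203.0.113.10", 8554)  # expected False if nothing accepts inbound RTSP
```

If this returns False for the public IP, no RTSP client on the internet will be able to reach the stream either, regardless of the URL used.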