deepstream-server does not run inference after adding a stream via the REST API

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU) Jetson AGX
• DeepStream Version 7.0
• JetPack Version (valid for Jetson only) 6.0 GA
• TensorRT Version 8.6
• NVIDIA GPU Driver Version (valid for GPU only) 540.3.0
• Issue Type( questions, new requirements, bugs) questions
• How to reproduce the issue ? (This is for bugs. Including which sample app is using, the configuration files content, the command line used and other details for reproducing)

After adding a stream to deepstream-server via the REST API, I find that deepstream-server does not run inference.
The screenshot is below:
[screenshot: 批注 2024-07-04 134715]
The config file is below:
multiurisrcbin:
#comma separated uri (no semicolon at the end). For ex- uri1,uri2,uriN
#parsed inside nvmultiurisrcbin
#uri-list: rtsp://admin:a12345678@192.168.31.9/Streaming/Channels/1
uri-list: ""
port: 9000
live-source: 1
width: 1920
height: 1080
batched-push-timeout: 33333
max-batch-size: 1
#To simulate nvvideoconvert specific REST APIs change disable-passthrough: 1
disable-passthrough: 1
drop-pipeline-eos: 1
#enable rtsp reconnections properties for rtsp source(s).
#rtsp-reconnect-interval: 5
#rtsp-reconnect-attempts: 3
#To demonstrate the new nvstreammux config-file usage
config-file-path: config_new_nvstreammux.txt
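(For context, a config like this is typically consumed by the deepstream-server sample app; the command below is only an assumption based on the standard sample layout, and the binary and config file names may differ on your setup.)

cd /opt/nvidia/deepstream/deepstream/sources/apps/sample_apps/deepstream-server
./deepstream-server-app dsserver_config.yml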

But if I configure uri-list with rtsp://admin:a12345678@192.168.31.9/Streaming/Channels/1,
the program works.

I also find that if the uri-list in the config file is not empty (i.e., a video file is preloaded),
then I can add streams via the REST API.

I don't want to preload a video file; I want to add streams via the REST API only.

How can I fix this problem?

Could you try setting async=false on the sink plugin and check that?
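In case it helps: async is a standard property inherited from GstBaseSink, so whichever sink your pipeline uses should expose it. A quick way to confirm, and to see how it is set on a sink (fakesink is only used as an example here; substitute your actual sink element):

gst-inspect-1.0 fakesink | grep -A 3 async
gst-launch-1.0 videotestsrc num-buffers=100 ! fakesink async=false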

Sorry, I don't understand. Can you explain more specifically, please?

Is the change like the one below? It does not work.

Could you try to verify that with gst-launch-1.0 command first?
Terminal 1:

gst-launch-1.0 nvmultiurisrcbin port=9000 ip-address=localhost batched-push-timeout=33333 max-batch-size=10 drop-pipeline-eos=1  rtsp-reconnect-interval=1  live-source=1 width=1920 height=1080 ! nvmultistreamtiler ! fakesink async=false

Terminal 2 (change the rtsp field to a real source):

curl -XPOST 'http://localhost:9000/api/v1/stream/add' -d '{
    "key": "sensor",
    "value": {
        "camera_id": "uniqueSensorID1",
        "camera_name": "front_door",
        "camera_url": "rtsp://xxx",
        "change": "camera_add",
        "metadata": {
            "resolution": "1920 x1080",
            "codec": "h264",
            "framerate": 30
        }
    },
    "headers": {
        "source": "vst",
        "created_at": "2021-06-01T14:34:13.417Z"
    }
  }'

The result is below:
[screenshot]

The format of your rtsp source may not be supported by the hardware decoder, or the I-frame interval may be too large. Could you configure some parameters of your rtsp source to change the I-frame interval?
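(If it helps to narrow this down, the stream's codec and profile can be checked directly on the board before DeepStream touches it. This is just a sketch using the camera URL from your config, assuming gst-discoverer-1.0 or ffprobe is installed.)

gst-discoverer-1.0 -v "rtsp://admin:a12345678@192.168.31.9/Streaming/Channels/1"
ffprobe -v error -select_streams v:0 -show_entries stream=codec_name,profile,level,width,height -of default "rtsp://admin:a12345678@192.168.31.9/Streaming/Channels/1"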

Does this test help with this problem? With ffmpeg, this rtsp url opens normally. And the problem I met
is different from what your test method checks.

I want to solve the problem that, after adding a stream to deepstream-server via the REST API, deepstream-server does not run inference.

I also checked the rtsp source.

Because I might not have the rights to change the camera's properties, can you fix this problem from the DeepStream code or config file?

This is for the issue below.

About this issue, I tried that on my Jetson Orin with our deepstream-server demo. Inference works normally when I add the source. Are you using our demo for this test?

Yes!

OK. Have you changed any code in our demo or set some environment variables on your board? Have you tried to add our source from the README?

  curl -XPOST 'http://localhost:9000/api/v1/stream/add' -d '{
    "key": "sensor",
    "value": {
        "camera_id": "uniqueSensorID1",
        "camera_name": "front_door",
        "camera_url": "file:///opt/nvidia/deepstream/deepstream/samples/streams/sample_1080p_h264.mp4",
        "change": "camera_add",
        "metadata": {
            "resolution": "1920 x1080",
            "codec": "h264",
            "framerate": 30
        }
    },
    "headers": {
        "source": "vst",
        "created_at": "2021-06-01T14:34:13.417Z"
    }
  }'
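(For completeness, the same REST server also exposes a stream remove endpoint; the payload below just mirrors the add request above with change set to camera_remove. The field names are assumed to match the add example and may need adjusting to the README version you are using.)

  curl -XPOST 'http://localhost:9000/api/v1/stream/remove' -d '{
    "key": "sensor",
    "value": {
        "camera_id": "uniqueSensorID1",
        "camera_name": "front_door",
        "camera_url": "file:///opt/nvidia/deepstream/deepstream/samples/streams/sample_1080p_h264.mp4",
        "change": "camera_remove",
        "metadata": {
            "resolution": "1920x1080",
            "codec": "h264",
            "framerate": 30
        }
    },
    "headers": {
        "source": "vst",
        "created_at": "2021-06-01T14:34:13.417Z"
    }
  }'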

Yes, I tried this yesterday. It works, because it is a local file.

Could you refer to this FAQ (Build rtsp server) to build an rtsp source with our sample stream, to rule out a problem with your rtsp source?
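(One common way to do that, sketched here on the assumption that the gst-rtsp-server example binary test-launch is built on the board, is to serve the DeepStream sample clip; test-launch publishes it at rtsp://<board-ip>:8554/test by default.)

./test-launch "( filesrc location=/opt/nvidia/deepstream/deepstream/samples/streams/sample_1080p_h264.mp4 ! qtdemux ! h264parse ! rtph264pay name=pay0 pt=96 )"

That rtsp://<board-ip>:8554/test URL can then be used in the stream/add request above.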

I will try.
Did you see this reply yesterday?

How can I fix this problem?

Did it work properly with the rtsp source you created yourself?

Yes. I have attached the method to verify this before. You can try that with the gst-launch-1.0 command I attached before.

There is no update from you for a period, so we assume this is not an issue anymore. Hence we are closing this topic. If you need further support, please open a new one. Thanks

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.