Thu Jun 1 12:07:22 2023
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 520.56.06    Driver Version: 520.56.06    CUDA Version: 11.8     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|===============================+======================+======================|
|   0  NVIDIA GeForce ...  Off  | 00000000:01:00.0 Off |                  N/A |
| N/A   39C    P8    N/A /  N/A |     73MiB /  6144MiB |      0%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+

+-----------------------------------------------------------------------------+
| Processes:                                                                  |
|  GPU   GI   CI        PID   Type   Process name                  GPU Memory |
|        ID   ID                                                   Usage      |
|=============================================================================|
|    0   N/A  N/A      2550      G   /usr/lib/xorg/Xorg                 44MiB |
|    0   N/A  N/A      9891    C+G   ...926892584207833979,262144       26MiB |
+-----------------------------------------------------------------------------+
DeepStream version: 6.2
We tried deepstream-testsr and were able to save the inference output. However, how can we combine deepstream-test5 and deepstream-testsr?
deepstream-testsr runs with the command line below:
$./deepstream-testsr-app rtsp://127.0.0.1/video1 --enc-type=1 --sink-type=1 --bbox-enable=1 --sr-mode=0
How can we extend this to multiple RTSP URLs? Is it possible to add --enc-type=1 --sink-type=1 --bbox-enable=1 --sr-mode=0 to the test5 config file?
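
What we imagine is defining each RTSP stream and its smart-record options per source group in the test5 config, roughly like the sketch below. This is only our guess based on the [sourceX] keys we see in the test5 sample config (smart-record, smart-rec-container, smart-rec-default-duration, smart-rec-dir-path); the second URL and the directory path are placeholders. Please correct us if these keys or values are not the right way to do it.

# Sketch only: unverified key names/values, second RTSP URL is hypothetical
[source0]
enable=1
type=4                        # 4 = RTSP source in deepstream-app configs
uri=rtsp://127.0.0.1/video1
smart-record=1                # assumed: enable smart record for this source
smart-rec-container=0         # assumed: 0 = mp4
smart-rec-default-duration=10
smart-rec-dir-path=/tmp/sr    # placeholder output directory

[source1]
enable=1
type=4
uri=rtsp://127.0.0.1/video2   # hypothetical second stream
smart-record=1
smart-rec-container=0
smart-rec-default-duration=10
smart-rec-dir-path=/tmp/sr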