"NvMediaParserParse Unsupported Codec" error when use deepstream-app

Dear all,

I am using the Jetson Xavier platform.
Camera: AR0231 (120 degree)
OS: JetPack 4.4
TensorRT 7.1.3
CUDA 10.2

The raw camera data (120 degree FOV) has lens distortion, which needs to be removed.
The data flow is: MP4 file (camera data) → remove distortion → RTSP server → deepstream-app
But I get a “NvMediaParserParse Unsupported Codec” error when using deepstream-app.

I referred to:
https://forums.developer.nvidia.com/t/rtsp-server-with-gpu-accelerating-pushing-opencv-mats/149105

https://forums.developer.nvidia.com/t/how-to-perform-fish-eye-lens-distortion-correction-in-gstreamer-pipeline-hfov-150/82808/24

I wrote two programs; the related files are attached to this post.
Following are my test steps. You can reproduce the error by following them.

Step 1:
build test_v2
g++ test_v2.cpp -o test_v2 $(pkg-config --cflags --libs opencv4 gstreamer-1.0 gstreamer-rtsp-server-1.0)

build rtspServer_v2
gcc rtspServer_v2.c -o rtspServer_v2 $(pkg-config --cflags --libs gstreamer-1.0 gstreamer-rtsp-server-1.0)

Step 2:
run test_v2

2.1 read the MP4 file (fov120_distortion_1min.mp4, the recorded camera data)
2.2 remove the distortion
2.3 send the data to UDP port 5000 (a minimal sketch of this flow is shown below)
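
For reference, a minimal sketch of this flow (not the attached test_v2.cpp itself; the intrinsics K/D and the GStreamer pipeline string below are placeholders, the real ones are in the attached file):

#include <opencv2/opencv.hpp>

int main() {
    // 2.1 read the MP4 file recorded from the camera
    cv::VideoCapture cap("fov120_distortion_1min.mp4");
    if (!cap.isOpened()) return -1;

    double fps = cap.get(cv::CAP_PROP_FPS);
    cv::Size size((int)cap.get(cv::CAP_PROP_FRAME_WIDTH),
                  (int)cap.get(cv::CAP_PROP_FRAME_HEIGHT));

    // placeholder intrinsics; the real values come from camera calibration
    cv::Mat K = (cv::Mat_<double>(3, 3) << 1000, 0, size.width / 2.0,
                                           0, 1000, size.height / 2.0,
                                           0, 0, 1);
    cv::Mat D = (cv::Mat_<double>(1, 5) << -0.3, 0.1, 0, 0, 0);

    // 2.3 encode to H.264 and send RTP packets to UDP port 5000
    // (placeholder pipeline; the actual string is in test_v2.cpp)
    cv::VideoWriter out("appsrc ! videoconvert ! video/x-raw,format=I420 ! "
                        "x264enc ! rtph264pay ! udpsink host=127.0.0.1 port=5000",
                        cv::CAP_GSTREAMER, 0, fps, size, true);

    cv::Mat frame, undistorted;
    while (cap.read(frame)) {
        cv::undistort(frame, undistorted, K, D);   // 2.2 remove the lens distortion
        out.write(undistorted);
    }
    return 0;
}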

Step 3:
run rtspServer_v2

3.1 read the data from UDP port 5000
3.2 serve it out via an RTSP server (see the sketch below)
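
For reference, a minimal sketch of this part, assuming the usual gst-rtsp-server pattern of re-serving RTP/H.264 received on UDP port 5000 (shown here in C++ like test_v2; the attached rtspServer_v2.c is the actual program, and the launch string below is an assumption):

#include <gst/gst.h>
#include <gst/rtsp-server/rtsp-server.h>

int main(int argc, char *argv[]) {
    gst_init(&argc, &argv);
    GMainLoop *loop = g_main_loop_new(NULL, FALSE);

    GstRTSPServer *server = gst_rtsp_server_new();
    GstRTSPMountPoints *mounts = gst_rtsp_server_get_mount_points(server);

    // 3.1 read RTP/H.264 from UDP port 5000, 3.2 re-payload it for RTSP clients
    GstRTSPMediaFactory *factory = gst_rtsp_media_factory_new();
    gst_rtsp_media_factory_set_launch(factory,
        "( udpsrc port=5000 caps=\"application/x-rtp, media=video, "
        "clock-rate=90000, encoding-name=H264, payload=96\" ! "
        "rtph264depay ! h264parse ! rtph264pay name=pay0 pt=96 )");
    gst_rtsp_media_factory_set_shared(factory, TRUE);

    gst_rtsp_mount_points_add_factory(mounts, "/test", factory);
    g_object_unref(mounts);

    gst_rtsp_server_attach(server, NULL);   // serves rtsp://127.0.0.1:8554/test
    g_main_loop_run(loop);
    return 0;
}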

Step 4:
view the RTSP stream with smplayer
smplayer rtsp://127.0.0.1:8554/test

It works when viewed with smplayer.

Step 5:
deepstream-app reads the data from the RTSP server.

./deepstream-app -c deepstream_app_config_yoloV3_camera2_test.txt

It fails to run.
Following is the log.

nvidia@nvidia-desktop:/opt/nvidia/deepstream/deepstream-5.0/sources/objectDetector_Yolo$ ./deepstream-app -c deepstream_app_config_yoloV3_camera2_test.txt
Unknown or legacy key specified ‘is-classifier’ for group [property]
Warn: ‘threshold’ parameter has been deprecated. Use ‘pre-cluster-threshold’ instead.

Using winsys: x11
gstnvtracker: Loading low-level lib at /opt/nvidia/deepstream/deepstream-5.0/lib/libnvds_mot_klt.so
gstnvtracker: Optional NvMOT_RemoveStreams not implemented
gstnvtracker: Batch processing is OFF
gstnvtracker: Past frame output is OFF
Deserialize yoloLayerV3 plugin: yolo_83
Deserialize yoloLayerV3 plugin: yolo_95
Deserialize yoloLayerV3 plugin: yolo_107
0:00:03.330646253 20059 0x5576c3fb50 INFO nvinfer gstnvinfer.cpp:619:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1701> [UID = 1]: deserialized trt engine from :/opt/nvidia/deepstream/deepstream-5.0/sources/objectDetector_Yolo/model_b1_gpu0_int8.engine
INFO: [Implicit Engine Info]: layers num: 4
0 INPUT kFLOAT data 3x608x608
1 OUTPUT kFLOAT yolo_83 255x19x19
2 OUTPUT kFLOAT yolo_95 255x38x38
3 OUTPUT kFLOAT yolo_107 255x76x76

0:00:03.330935547 20059 0x5576c3fb50 INFO nvinfer gstnvinfer.cpp:619:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:1805> [UID = 1]: Use deserialized engine model: /opt/nvidia/deepstream/deepstream-5.0/sources/objectDetector_Yolo/model_b1_gpu0_int8.engine
0:00:03.335012126 20059 0x5576c3fb50 INFO nvinfer gstnvinfer_impl.cpp:313:notifyLoadModelStatus:<primary_gie> [UID 1]: Load new model:/opt/nvidia/deepstream/deepstream-5.0/sources/objectDetector_Yolo/config_infer_primary_yoloV3.txt sucessfully

Runtime commands:
h: Print this help
q: Quit

p: Pause
r: Resume

NOTE: To expand a source in the 2D tiled display and view object details, left-click on the source.
To go back to the tiled display, right-click anywhere on the window.

** INFO: <bus_callback:181>: Pipeline ready

PERF: FPS 0 (Avg)
PERF: 0.00 (0.00)
Opening in BLOCKING MODE
NvMMLiteOpen : Block : BlockType = 261
NVMEDIA: Reading vendor.tegra.display-size : status: 6
NvMMLiteBlockCreate : Block : BlockType = 261

NVMEDIA: Reading vendor.tegra.display-size : status: 6
NvMMLiteBlockCreate : Block : BlockType = 261
NVMEDIA: NVMEDIABufferProcessing: 1504: NvMediaParserParse Unsupported Codec
[NvMediaParserParse:357] Video parser parse failed: 0**PERF: 0.00 (0.00)

Question:
smplayer plays the stream fine, but deepstream-app fails with the “NvMediaParserParse Unsupported Codec” error.

Could you give me a suggestion?

Best regards

-Jason

test_v2.cpp (4.0 KB)
rtspServer_v2.c (1.9 KB)
deepstream_app_config_yoloV3_camera2_test.txt (3.7 KB)

input MP4 file for test : fov120_distortion_1min.mp4


Can you try with:

[source0]
enable=1
#Type - 1=CameraV4L2 2=URI 3=MultiURI
#type=1
type=3
uri=rtsp://127.0.0.1:8554/test
#camera-width=1280
#camera-height=720
#camera-fps-n=30
#camera-fps-n=22
#camera-fps-d=1
#camera-csi-sensor-id=1
#camera-v4l2-dev-node=2
num-sources=1
gpu-id=0
# (0): memtype_device   - Memory type Device
# (1): memtype_pinned   - Memory type Host Pinned
# (2): memtype_unified  - Memory type Unified
cudadec-memtype=0

Hi,
Generally this error is printed when the h264/h265 stream is invalid. You may try a gst-launch-1.0 command:

$ gst-launch-1.0 uridecodebin uri=rtsp://127.0.0.1:8554/test ! nvoverlaysink

Check if the stream is playable.

Hi PhongNT,

Thank you for your reply.
I get the same error after changing the [source0] config.
Following is the log.

nvidia@nvidia-desktop:/opt/nvidia/deepstream/deepstream-5.0/sources/objectDetector_Yolo$ ./deepstream-app -c deepstream_app_config_yoloV3_camera_test.txt
Unknown or legacy key specified ‘is-classifier’ for group [property]
Warn: ‘threshold’ parameter has been deprecated. Use ‘pre-cluster-threshold’ instead.

Using winsys: x11
gstnvtracker: Loading low-level lib at /opt/nvidia/deepstream/deepstream-5.0/lib/libnvds_mot_klt.so
gstnvtracker: Optional NvMOT_RemoveStreams not implemented
gstnvtracker: Batch processing is OFF
gstnvtracker: Past frame output is OFF
Deserialize yoloLayerV3 plugin: yolo_83
Deserialize yoloLayerV3 plugin: yolo_95
Deserialize yoloLayerV3 plugin: yolo_107
0:00:03.134223167 17789 0x5573518040 INFO nvinfer gstnvinfer.cpp:619:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1701> [UID = 1]: deserialized trt engine from :/opt/nvidia/deepstream/deepstream-5.0/sources/objectDetector_Yolo/model_b1_gpu0_int8.engine
INFO: [Implicit Engine Info]: layers num: 4
0 INPUT kFLOAT data 3x608x608
1 OUTPUT kFLOAT yolo_83 255x19x19
2 OUTPUT kFLOAT yolo_95 255x38x38
3 OUTPUT kFLOAT yolo_107 255x76x76

0:00:03.134439564 17789 0x5573518040 INFO nvinfer gstnvinfer.cpp:619:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:1805> [UID = 1]: Use deserialized engine model: /opt/nvidia/deepstream/deepstream-5.0/sources/objectDetector_Yolo/model_b1_gpu0_int8.engine
0:00:03.138640775 17789 0x5573518040 INFO nvinfer gstnvinfer_impl.cpp:313:notifyLoadModelStatus:<primary_gie> [UID 1]: Load new model:/opt/nvidia/deepstream/deepstream-5.0/sources/objectDetector_Yolo/config_infer_primary_yoloV3.txt sucessfully
cb_sourcesetup set 100 latency

Runtime commands:
h: Print this help
q: Quit

p: Pause
r: Resume

NOTE: To expand a source in the 2D tiled display and view object details, left-click on the source.
To go back to the tiled display, right-click anywhere on the window.

** INFO: <bus_callback:181>: Pipeline ready

** INFO: <bus_callback:167>: Pipeline running

PERF: FPS 0 (Avg)
PERF: 0.00 (0.00)
Opening in BLOCKING MODE
NvMMLiteOpen : Block : BlockType = 261
NVMEDIA: Reading vendor.tegra.display-size : status: 6
NvMMLiteBlockCreate : Block : BlockType = 261
NVMEDIA: NVMEDIABufferProcessing: 1504: NvMediaParserParse Unsupported Codec
[NvMediaParserParse:357] Video parser parse failed: 0**PERF: 0.00 (0.00)
**PERF: 0.00 (0.00)

Hi DaneLLL,

Thank you for your reply.
I followed your suggestion and ran this command:
GST_DEBUG=3 gst-launch-1.0 uridecodebin uri=rtsp://127.0.0.1:8554/test ! nvoverlaysink

I still got “NvMediaParserParse Unsupported Codec”.
Do you have any other suggestions?

Following is the log.
nvidia@nvidia-desktop:/opt/nvidia/deepstream/deepstream-5.0/sources/objectDetector_Yolo$ GST_DEBUG=3 gst-launch-1.0 uridecodebin uri=rtsp://127.0.0.1:8554/test ! nvoverlaysink
0:00:00.070528690 17927 0x55697facc0 WARN omx gstomx.c:2826:plugin_init: Failed to load configuration file: Valid key file could not be found in search dirs (searched in: /home/nvidia/.config:/etc/xdg/xdg-unity:/etc/xdg as per GST_OMX_CONFIG_DIR environment variable, the xdg user config directory (or XDG_CONFIG_HOME) and the system config directory (or XDG_CONFIG_DIRS)
Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://127.0.0.1:8554/test
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
Progress: (request) SETUP stream 0
Progress: (open) Opened Stream
Setting pipeline to PLAYING …
New clock: GstSystemClock
Progress: (request) Sending PLAY request
0:00:00.206425838 17927 0x7f88053370 FIXME default gstutils.c:3981:gst_pad_create_stream_id_internal:fakesrc0:src Creating random stream-id, consider implementing a deterministic way of creating a stream-id
Progress: (request) Sending PLAY request
Progress: (request) Sent PLAY request
0:00:00.213555528 17927 0x7f88053140 FIXME rtpjitterbuffer gstrtpjitterbuffer.c:1535:gst_jitter_buffer_sink_parse_caps: Unsupported timestamp reference clock
0:00:00.213608907 17927 0x7f88053140 FIXME rtpjitterbuffer gstrtpjitterbuffer.c:1543:gst_jitter_buffer_sink_parse_caps: Unsupported media clock
Opening in BLOCKING MODE
0:00:06.004437222 17927 0x7f6c0040f0 WARN v4l2 gstv4l2object.c:4435:gst_v4l2_object_probe_caps:nvv4l2decoder0:src Failed to probe pixel aspect ratio with VIDIOC_CROPCAP: Unknown error -1
0:00:06.004493865 17927 0x7f6c0040f0 WARN v4l2 gstv4l2object.c:2372:gst_v4l2_object_add_interlace_mode:0x7f5c0b9440 Failed to determine interlace mode
NvMMLiteOpen : Block : BlockType = 261
NVMEDIA: Reading vendor.tegra.display-size : status: 6
NvMMLiteBlockCreate : Block : BlockType = 261
NVMEDIA: NVMEDIABufferProcessing: 1504: NvMediaParserParse Unsupported Codec
0:00:06.542277918 17927 0x7f88053140 WARN rtpjitterbuffer rtpjitterbuffer.c:570:calculate_skew: delta - skew: 0:00:01.000837710 too big, reset skew
0:00:13.290172807 17927 0x7f88053140 WARN rtpjitterbuffer rtpjitterbuffer.c:570:calculate_skew: delta - skew: 0:00:01.001362676 too big, reset skew
0:00:19.635695525 17927 0x7f88053140 WARN rtpjitterbuffer rtpjitterbuffer.c:570:calculate_skew: delta - skew: 0:00:01.000626242 too big, reset skew
0:00:26.182852333 17927 0x7f88053140 WARN rtpjitterbuffer rtpjitterbuffer.c:570:calculate_skew: delta - skew: 0:00:01.000515418 too big, reset skew
^C[NvMediaParserParse:357] Video parser parse failed: 0handling interrupt.
Interrupt: Stopping pipeline …
Execution ended after 0:00:26.678695699
Setting pipeline to PAUSED …
Setting pipeline to READY …
0:00:32.826468039 17927 0x7f88053140 WARN rtpjitterbuffer rtpjitterbuffer.c:570:calculate_skew: delta - skew: 0:00:01.005488741 too big, reset skew

-Jason


I think the codec of this RTSP stream is not supported. Could you try with an MP4 file, or with another RTSP stream from a camera?
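
For example, a [source0] group that reads the MP4 file directly (reusing the file path mentioned later in this thread) could look something like this:

[source0]
enable=1
#Type - 1=CameraV4L2 2=URI 3=MultiURI
type=3
uri=file:///home/nvidia/deepstream-distortion/fov120_distortion_1min.mp4
num-sources=1
gpu-id=0
cudadec-memtype=0

If the file plays but the RTSP source does not, that points at the encoding/streaming chain rather than at deepstream-app itself.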

Hi PhongNT,

Thank you for your reply.

I ran two test cases with test_v2.cpp.

case 1: data from the MP4 file in test_v2.cpp
cv::VideoCapture cap("/home/nvidia/deepstream-distortion/fov120_distortion_1min.mp4");

deepstream-app fails with the “NvMediaParserParse Unsupported Codec” error.

But smplayer plays the stream OK:
command: smplayer rtsp://127.0.0.1:8554/test

case 2: data from the camera in test_v2.cpp
const char* gst = "v4l2src device=/dev/video2 ! videoconvert ! appsink ";
VideoCapture cap(gst);

deepstream-app fails with the same “NvMediaParserParse Unsupported Codec” error.

But smplayer plays the stream OK:
command: smplayer rtsp://127.0.0.1:8554/test

In both cases deepstream-app fails while smplayer works fine, which is strange.

Do you have any other suggestions for fixing the problem?

Best regards

-Jason

Hi,
There is sample code in

/opt/nvidia/deepstream/deepstream-5.1/sources/apps/apps-common/src/deepstream_sink_bin.c

In test_v2, please try the pipeline:

VideoWriter out("appsrc ! videoconvert ! video/x-raw,format=I420 !x264enc ! h264parse config-interval=1 ! rtph264pay ! udpsink host=224.224.255.255 port=5400 sync=0 async=0 ",CAP_GSTREAMER,0,fps,size,true);

Please try with a gst-launch-1.0 command to make sure the stream is playable, and then try deepstream-app.
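
For example, one way to check whether the UDP stream from the pipeline above is playable (the element chain here is only a suggestion for the Jetson decode path):

$ gst-launch-1.0 udpsrc address=224.224.255.255 port=5400 caps="application/x-rtp, media=video, encoding-name=H264, payload=96" ! rtph264depay ! h264parse ! nvv4l2decoder ! nvoverlaysink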

Hi DaneLLL,

Thank you for your reply.

I tried your suggestion.

In test_v2 I used this pipeline:
VideoWriter out("appsrc ! videoconvert ! video/x-raw,format=I420 ! x264enc tune=zerolatency speed-preset=superfast ! h264parse config-interval=1 ! rtph264pay ! udpsink host=127.0.0.1 port=5000 sync=0 async=0", CAP_GSTREAMER, 0, fps, size, true);

The gst-launch-1.0 command still fails, but the new pipeline works with deepstream-app.

Best regards

-Jason

Hi,
Good to know it works. The x264enc plugin is a software encoder. You may consider using the hardware encoder instead, for example:

appsrc ! videoconvert ! video/x-raw,format=I420 ! nvvideoconvert ! nvv4l2h264enc insert-sps-pps=1 idrinterval=30 ! ...

Hi DaneLLL,

Thank you for your reply.

I tried nvv4l2h264enc in test_v2:

VideoWriter out("appsrc ! videoconvert ! video/x-raw,format=I420 ! nvvideoconvert ! nvv4l2h264enc insert-sps-pps=1 ! h264parse config-interval=1 ! rtph264pay ! udpsink host=127.0.0.1 port=5000 sync=0 async=0", CAP_GSTREAMER, 0, fps, size, true);

It works well with deepstream-app.

Best regards

-Jason
