Why doesn't this pipeline work? NvMMLiteBlockCreate : Block : BlockType = 279

I’m trying to read an RTSP H.265 stream, run inference on it, draw the bounding boxes, and then encode it as H.265 again for sending to the Amazon Kinesis sink instead of the screen…

gst-launch-1.0 rtspsrc location=rtsp://111.111.111.111/Streaming/Channels/101 ! rtph265depay ! queue ! h265parse ! nvv4l2decoder ! "video/x-raw(memory:NVMM), format=RGBA" ! m.sink_0 nvstreammux name=m batch-size=1 width=1280 height=720 ! nvinfer config-file-path=dstest3_pgie_config.txt batch-size=1 unique-id=1 ! nvmultistreamtiler rows=1 columns=1 width=1280 height=720 ! nvvideoconvert ! nvdsosd ! nvvideoconvert ! nvv4l2h265enc ! h265parse ! kvssink stream-name=test3

I see this after it sets the status to play:

NvMMLiteOpen : Block : BlockType = 279
NVMEDIA: Reading vendor.tegra.display-size : status: 6
NvMMLiteBlockCreate : Block : BlockType = 279

Full output is below:

jason@nano:~/Documents/deepstream_sdk_v4.0.1_jetson/sources/apps/sample_apps/deepstream-test3-with-frame-drop$ gst-launch-1.0 rtspsrc location=rtsp://@111.111.111.111/Streaming/Channels/101 ! rtph265depay ! queue ! h265parse ! nvv4l2decoder ! "video/x-raw(memory:NVMM), format=RGBA" ! m.sink_0 nvstreammux name=m batch-size=1 width=1280 height=720 ! nvinfer config-file-path=dstest3_pgie_config.txt batch-size=1 unique-id=1 ! nvmultistreamtiler rows=1 columns=1 width=1280 height=720 ! nvvideoconvert ! nvdsosd ! nvvideoconvert ! nvv4l2h265enc ! h265parse ! kvssink stream-name=test3
Setting pipeline to PAUSED …
log4cplus:ERROR could not open file ./kvs_log_configuration
INFO - createKinesisVideoClient(): Creating Kinesis Video Client
2019-10-20 18:09:03 [547686619728] INFO - heapInitialize(): Initializing native heap with limit size 134217728, spill ratio 0% and flags 0x00000001
2019-10-20 18:09:03 [547686619728] INFO - heapInitialize(): Creating AIV heap.
2019-10-20 18:09:03 [547686619728] INFO - heapInitialize(): Heap is initialized OK
2019-10-20 18:09:03 [547686619728] DEBUG - stepStateMachine(): PIC Client State Machine - Current state: 0x1, Next state: 0x2
2019-10-20 18:09:03 [547686619728] DEBUG - getSecurityTokenHandler invoked
2019-10-20 18:09:03 [547686619728] DEBUG - Refreshing credentials. Force refreshing: 0 Now time is: 1571566143111593368 Expiration: 0
2019-10-20 18:09:03 [547686619728] DEBUG - stepStateMachine(): PIC Client State Machine - Current state: 0x2, Next state: 0x10
2019-10-20 18:09:03 [547686619728] INFO - createDeviceResultEvent(): Create device result event.
2019-10-20 18:09:03 [547686619728] DEBUG - stepStateMachine(): PIC Client State Machine - Current state: 0x10, Next state: 0x40
2019-10-20 18:09:03 [547686619728] DEBUG - clientReadyHandler invoked
2019-10-20 18:09:03 [547686619728] INFO - try creating stream
2019-10-20 18:09:03 [547686619728] INFO - Creating Kinesis Video Stream test3
2019-10-20 18:09:03 [547686619728] INFO - createKinesisVideoStream(): Creating Kinesis Video Stream.
2019-10-20 18:09:03 [547686619728] DEBUG - stepStateMachine(): PIC Stream State Machine - Current state: 0x1, Next state: 0x2
2019-10-20 18:09:03 [547686619728] DEBUG - stepStateMachine(): PIC Client State Machine - Current state: 0x40, Next state: 0x40
2019-10-20 18:09:03 [547686619728] INFO - writeHeaderCallback(): RequestId: 1ddb298f-cb3c-466a-9ff1-e063da765704
2019-10-20 18:09:03 [546538099184] DEBUG - describeStreamCurlHandler(): DescribeStream API response: {"StreamInfo":{"CreationTime":1.571565086153E9,"DataRetentionInHours":2,"DeviceName":"Kinesis_Video_Device","KmsKeyId":"arn:aws:kms:ap-southeast-2:445823006740:alias/aws/kinesisvideo","MediaType":"video/h265","Status":"ACTIVE","StreamARN":"arn:aws:kinesisvideo:ap-southeast-2:445823006740:stream/test3/1571565086153","StreamName":"test3","Version":"9FyysmPRQyvFdWVNyXHj"}}
2019-10-20 18:09:03 [546538099184] INFO - describeStreamResultEvent(): Describe stream result event.
2019-10-20 18:09:03 [546538099184] DEBUG - stepStateMachine(): PIC Stream State Machine - Current state: 0x2, Next state: 0x20
2019-10-20 18:09:03 [546538099184] DEBUG - stepStateMachine(): PIC Client State Machine - Current state: 0x40, Next state: 0x40
2019-10-20 18:09:03 [546538099184] INFO - writeHeaderCallback(): RequestId: 5c1388f1-1bc4-43a3-bad5-d8439ed3b38c
2019-10-20 18:09:03 [546529706480] DEBUG - getStreamingEndpointCurlHandler(): GetStreamingEndpoint API response: {"DataEndpoint":"https://s-ca658586.kinesisvideo.ap-southeast-2.amazonaws.com"}
2019-10-20 18:09:03 [546529706480] INFO - getStreamingEndpointResultEvent(): Get streaming endpoint result event.
2019-10-20 18:09:04 [546529706480] DEBUG - stepStateMachine(): PIC Stream State Machine - Current state: 0x20, Next state: 0x10
2019-10-20 18:09:04 [546529706480] DEBUG - stepStateMachine(): PIC Client State Machine - Current state: 0x40, Next state: 0x40
2019-10-20 18:09:04 [546529706480] DEBUG - getStreamingTokenHandler invoked
2019-10-20 18:09:04 [546529706480] DEBUG - Refreshing credentials. Force refreshing: 1 Now time is: 1571566144060039040 Expiration: 18446744073709551615
2019-10-20 18:09:04 [546529706480] INFO - getStreamingTokenResultEvent(): Get streaming token result event.
2019-10-20 18:09:04 [546529706480] DEBUG - stepStateMachine(): PIC Stream State Machine - Current state: 0x10, Next state: 0x40
2019-10-20 18:09:04 [546529706480] DEBUG - defaultStreamReadyCallback(): Reported streamReady callback for stream handle 367776561065
2019-10-20 18:09:04 [546529706480] DEBUG - streamReadyHandler invoked
2019-10-20 18:09:04 [546529706480] Stream is ready
Opening in BLOCKING MODE
Opening in BLOCKING MODE
Creating LL OSD context new
0:00:01.818657152 27371 0x55a12d9ca0 INFO nvinfer gstnvinfer.cpp:519:gst_nvinfer_logger: NvDsInferContext[UID 1]:initialize(): Trying to create engine from model files
0:00:01.818879034 27371 0x55a12d9ca0 WARN nvinfer gstnvinfer.cpp:515:gst_nvinfer_logger: NvDsInferContext[UID 1]:generateTRTModel(): INT8 not supported by platform. Trying FP16 mode.
0:01:56.931451888 27371 0x55a12d9ca0 INFO nvinfer gstnvinfer.cpp:519:gst_nvinfer_logger: NvDsInferContext[UID 1]:generateTRTModel(): Storing the serialized cuda engine to file at /home/jason/Documents/deepstream_sdk_v4.0.1_jetson/samples/models/Primary_Detector/resnet10.caffemodel_b1_fp16.engine
Pipeline is live and does not need PREROLL …
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://111.111.111.111/Streaming/Channels/101
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
Progress: (request) SETUP stream 0
Progress: (open) Opened Stream
Setting pipeline to PLAYING …
New clock: GstSystemClock
Progress: (request) Sending PLAY request
Progress: (request) Sending PLAY request
Progress: (request) Sent PLAY request
NvMMLiteOpen : Block : BlockType = 279
NVMEDIA: Reading vendor.tegra.display-size : status: 6
NvMMLiteBlockCreate : Block : BlockType = 279

(The output above is from kvssink and is correct.)

I can’t see the problem clearly. The log shows "log4cplus:ERROR could not open file ./kvs_log_configuration",
but you said "The output above is from kvssink and is correct." What issue do you mean? Could you explain in more detail?

Yes that log error is normal - that’s what I was trying to say - sorry for the confusion.

Everything in the log output from:

"log4cplus:ERROR could not open file ./kvs_log_configuration"

down to

"2019-10-20 18:09:04 [546529706480] Stream is ready"

is output from the kvssink element and is normal/correct.

The question is… can you see why this pipeline wouldn’t work:

gst-launch-1.0 rtspsrc location=rtsp://111.111.111.111/Streaming/Channels/101 ! rtph265depay ! queue ! h265parse ! nvv4l2decoder ! "video/x-raw(memory:NVMM), format=RGBA" ! m.sink_0 nvstreammux name=m batch-size=1 width=1280 height=720 ! nvinfer config-file-path=dstest3_pgie_config.txt batch-size=1 unique-id=1 ! nvmultistreamtiler rows=1 columns=1 width=1280 height=720 ! nvvideoconvert ! nvdsosd ! nvvideoconvert ! nvv4l2h265enc ! h265parse ! kvssink stream-name=test3

After the output saying "Setting pipeline to PLAYING" it hangs on this error:

NvMMLiteOpen : Block : BlockType = 279
NVMEDIA: Reading vendor.tegra.display-size : status: 6
NvMMLiteBlockCreate : Block : BlockType = 279

I also tried to use nvstreamdemux instead of nvmultistreamtiler but that didn’t help.

Hello jasonpgf2a,

Could you please flash the latest BSP?

The low-level multimedia (MM) stack has a structure change for which recompilation is needed.
Flashing the latest available BSP package will resolve your issue.

Thanks.

Hi Viranjan

Please excuse my ignorance as I’m pretty new to the jetson platform - but what is a BSP?

Is it just a matter of redoing the getting started guide: https://developer.nvidia.com/embedded/learn/get-started-jetson-nano-devkit ?

Or, the deepstream quick start guide: https://docs.nvidia.com/metropolis/deepstream/dev-guide/index.html ?

Also - are you saying my chain of pipeline commands is correct and should work?

The nvv4l2decoder plugin calls some low-level libraries that are included in the Linux system image (BSP), so you may need to update your system image. You can follow https://docs.nvidia.com/metropolis/deepstream/dev-guide/index.html to reflash your device.

You didn’t answer the bit about the pipeline being accurate?

But it sounds like I need to reflash and start all over. ;-)

Have you updated the binary - is there a specific version I need to download for flashing onto the card?

The pipeline should work; please use the latest BSP image.

Well, I’ve followed your instructions and reflashed my Jetson Nano with the latest image, installed DeepStream etc., and tested the above pipeline - and I get exactly the same error. Output below.

Command:

gst-launch-1.0 rtspsrc location=rtsp://xxx:yyy@111.111.111.111/Streaming/Channels/101 ! rtph265depay ! queue ! h265parse ! nvv4l2decoder ! "video/x-raw(memory:NVMM), format=RGBA" ! m.sink_0 nvstreammux name=m batch-size=1 width=1280 height=720 ! nvinfer config-file-path=dstest3_pgie_config.txt batch-size=1 unique-id=1 ! nvmultistreamtiler rows=1 columns=1 width=1280 height=720 ! nvvideoconvert ! nvdsosd ! nvvideoconvert ! nvv4l2h265enc ! h265parse ! kvssink stream-name=test3

Note that kvssink works perfectly well when I just use standard GStreamer elements and nothing NVIDIA-specific.

*Please don’t ask me to reflash again ;-)

Log output - note that I have struck through the text output from kvssink:

jason@nano:~/Development/deepstream_sdk_v4.0.1_jetson/sources/apps/sample_apps/deepstream-test3$ gst-launch-1.0 rtspsrc location=rtsp://xxx:yyy@111.111.111.111/Streaming/Channels/101 ! rtph265depay ! queue ! h265parse ! nvv4l2decoder ! "video/x-raw(memory:NVMM), format=RGBA" ! m.sink_0 nvstreammux name=m batch-size=1 width=1280 height=720 ! nvinfer config-file-path=dstest3_pgie_config.txt batch-size=1 unique-id=1 ! nvmultistreamtiler rows=1 columns=1 width=1280 height=720 ! nvvideoconvert ! nvdsosd ! nvvideoconvert ! nvv4l2h265enc ! h265parse ! kvssink stream-name=test3
Setting pipeline to PAUSED …
log4cplus:ERROR could not open file ./kvs_log_configuration
INFO - createKinesisVideoClient(): Creating Kinesis Video Client
2019-10-23 19:52:40 [548163783248] INFO - heapInitialize(): Initializing native heap with limit size 134217728, spill ratio 0% and flags 0x00000001
2019-10-23 19:52:40 [548163783248] INFO - heapInitialize(): Creating AIV heap.
2019-10-23 19:52:40 [548163783248] INFO - heapInitialize(): Heap is initialized OK
2019-10-23 19:52:40 [548163783248] DEBUG - stepStateMachine(): PIC Client State Machine - Current state: 0x1, Next state: 0x2
2019-10-23 19:52:40 [548163783248] DEBUG - getSecurityTokenHandler invoked
2019-10-23 19:52:40 [548163783248] DEBUG - Refreshing credentials. Force refreshing: 0 Now time is: 1571831560552916996 Expiration: 0
2019-10-23 19:52:40 [548163783248] DEBUG - stepStateMachine(): PIC Client State Machine - Current state: 0x2, Next state: 0x10
2019-10-23 19:52:40 [548163783248] INFO - createDeviceResultEvent(): Create device result event.
2019-10-23 19:52:40 [548163783248] DEBUG - stepStateMachine(): PIC Client State Machine - Current state: 0x10, Next state: 0x40
2019-10-23 19:52:40 [548163783248] DEBUG - clientReadyHandler invoked
2019-10-23 19:52:40 [548163783248] INFO - try creating stream
2019-10-23 19:52:40 [548163783248] INFO - Creating Kinesis Video Stream test3
2019-10-23 19:52:40 [548163783248] INFO - createKinesisVideoStream(): Creating Kinesis Video Stream.
2019-10-23 19:52:40 [548163783248] DEBUG - stepStateMachine(): PIC Stream State Machine - Current state: 0x1, Next state: 0x2
2019-10-23 19:52:40 [548163783248] DEBUG - stepStateMachine(): PIC Client State Machine - Current state: 0x40, Next state: 0x40
2019-10-23 19:52:40 [548163783248] INFO - writeHeaderCallback(): RequestId: 9e89b64e-ed31-454e-a446-66b7f0ae4fa9
2019-10-23 19:52:40 [547895980528] DEBUG - describeStreamCurlHandler(): DescribeStream API response: {"StreamInfo":{"CreationTime":1.571565086153E9,"DataRetentionInHours":2,"DeviceName":"Kinesis_Video_Device","KmsKeyId":"arn:aws:kms:ap-southeast-2:445823006740:alias/aws/kinesisvideo","MediaType":"video/h265","Status":"ACTIVE","StreamARN":"arn:aws:kinesisvideo:ap-southeast-2:445823006740:stream/test3/1571565086153","StreamName":"test3","Version":"9FyysmPRQyvFdWVNyXHj"}}
2019-10-23 19:52:40 [547895980528] INFO - describeStreamResultEvent(): Describe stream result event.
2019-10-23 19:52:41 [547895980528] DEBUG - stepStateMachine(): PIC Stream State Machine - Current state: 0x2, Next state: 0x20
2019-10-23 19:52:41 [547895980528] DEBUG - stepStateMachine(): PIC Client State Machine - Current state: 0x40, Next state: 0x40
2019-10-23 19:52:41 [547895980528] INFO - writeHeaderCallback(): RequestId: 6158a0b9-1260-400f-aa4b-0542cfdf1395
2019-10-23 19:52:41 [547887587824] DEBUG - getStreamingEndpointCurlHandler(): GetStreamingEndpoint API response: {"DataEndpoint":"https://s-ca658586.kinesisvideo.ap-southeast-2.amazonaws.com"}
2019-10-23 19:52:41 [547887587824] INFO - getStreamingEndpointResultEvent(): Get streaming endpoint result event.
2019-10-23 19:52:41 [547887587824] DEBUG - stepStateMachine(): PIC Stream State Machine - Current state: 0x20, Next state: 0x10
2019-10-23 19:52:41 [547887587824] DEBUG - stepStateMachine(): PIC Client State Machine - Current state: 0x40, Next state: 0x40
2019-10-23 19:52:41 [547887587824] DEBUG - getStreamingTokenHandler invoked
2019-10-23 19:52:41 [547887587824] DEBUG - Refreshing credentials. Force refreshing: 1 Now time is: 1571831561440291236 Expiration: 18446744073709551615
2019-10-23 19:52:41 [547887587824] INFO - getStreamingTokenResultEvent(): Get streaming token result event.
2019-10-23 19:52:41 [547887587824] DEBUG - stepStateMachine(): PIC Stream State Machine - Current state: 0x10, Next state: 0x40
2019-10-23 19:52:41 [547887587824] DEBUG - defaultStreamReadyCallback(): Reported streamReady callback for stream handle 368124956121
2019-10-23 19:52:41 [547887587824] DEBUG - streamReadyHandler invoked
2019-10-23 19:52:41 [547887587824] Stream is ready

Opening in BLOCKING MODE
Opening in BLOCKING MODE
Creating LL OSD context new
0:00:03.592677835 28521 0x55b5f1b4a0 INFO nvinfer gstnvinfer.cpp:519:gst_nvinfer_logger: NvDsInferContext[UID 1]:initialize(): Trying to create engine from model files
0:00:03.592993416 28521 0x55b5f1b4a0 WARN nvinfer gstnvinfer.cpp:515:gst_nvinfer_logger: NvDsInferContext[UID 1]:generateTRTModel(): INT8 not supported by platform. Trying FP16 mode.
0:01:56.345540196 28521 0x55b5f1b4a0 INFO nvinfer gstnvinfer.cpp:519:gst_nvinfer_logger: NvDsInferContext[UID 1]:generateTRTModel(): Storing the serialized cuda engine to file at /home/jason/Development/deepstream_sdk_v4.0.1_jetson/samples/models/Primary_Detector/resnet10.caffemodel_b1_fp16.engine
Pipeline is live and does not need PREROLL …
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://xxx:yyy@111.111.111.111/Streaming/Channels/101
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
Progress: (request) SETUP stream 0
Progress: (open) Opened Stream
Setting pipeline to PLAYING …
New clock: GstSystemClock
Progress: (request) Sending PLAY request
Progress: (request) Sending PLAY request
Progress: (request) Sent PLAY request
NvMMLiteOpen : Block : BlockType = 279
NVMEDIA: Reading vendor.tegra.display-size : status: 6
NvMMLiteBlockCreate : Block : BlockType = 279

Please ignore this log output line, as it is from the kvssink element and is harmless:
"log4cplus:ERROR could not open file ./kvs_log_configuration"

I can switch out kvssink for fakesink and get the exact same error as well, which shows that it’s nothing to do with kvssink…

Following pipeline tested OK on Xavier using latest BSP image.

gst-launch-1.0 rtspsrc location=rtsp://10.24.217.30:8554/ ! rtph265depay ! h265parse ! nvv4l2decoder ! m.sink_0 nvstreammux name=m batch-size=1 width=1280 height=720 ! nvinfer config-file-path=config_infer_primary.txt ! nvmultistreamtiler rows=1 columns=1 width=1280 height=720 ! nvvideoconvert ! nvdsosd ! nvvideoconvert ! nvv4l2h265enc ! h265parse ! filesink location=file.h265

The above pipeline is similar to yours, except that the decoder’s output should not be forced to "video/x-raw(memory:NVMM), format=RGBA".
The decoder always outputs NV12.
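For reference, here is a sketch of the poster’s original command with that caps filter dropped; the URL, config path, and stream name are unchanged from the original post, and this is untested on the poster’s setup:

```shell
# Same pipeline as the original post, minus the RGBA caps filter after the
# decoder, so nvstreammux receives the decoder's native NV12 NVMM output.
gst-launch-1.0 rtspsrc location=rtsp://111.111.111.111/Streaming/Channels/101 ! \
  rtph265depay ! queue ! h265parse ! nvv4l2decoder ! \
  m.sink_0 nvstreammux name=m batch-size=1 width=1280 height=720 ! \
  nvinfer config-file-path=dstest3_pgie_config.txt batch-size=1 unique-id=1 ! \
  nvmultistreamtiler rows=1 columns=1 width=1280 height=720 ! \
  nvvideoconvert ! nvdsosd ! nvvideoconvert ! \
  nvv4l2h265enc ! h265parse ! kvssink stream-name=test3
```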

How can I tell whether I am on the latest BSP…? I just go to the Jetson Nano getting started document and follow the instructions. Download from:

Your pipeline that works on the Xavier does not work on the Nano. Same error:

Creating LL OSD context new
Redistribute latency…
NvMMLiteOpen : Block : BlockType = 8
===== NVMEDIA: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 8

See previous question - how can I tell what the latest BSP is (and what does that stand for)? The download site only has one image available for the Nano, so I assume that is the latest one.

I think there is a low-level error in the encoding library on the Nano. I’ve tried it with H.264 as well and get the same issue.

Really frustrating.

Some system info:

$ cat /proc/version
Linux version 4.9.140-tegra (buildbrain@mobile-u64-911) (gcc version 7.3.1 20180425 [linaro-7.3-2018.05 revision d29120a424ecfbc167ef90065c0eeb7f91977701] (Linaro GCC 7.3-2018.05) ) #1 SMP PREEMPT Tue Jul 16 17:04:49 PDT 2019

Is there anything else I can provide - specific debug logs maybe?

Have you tried this pipeline on Xavier? BTW, is it possible to share more logs? Set GST_DEBUG=3.
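For example, something along these lines (the stderr redirect is an assumption about how to capture the log; gst-launch-1.0 writes its debug output to stderr):

```shell
# GST_DEBUG=3 enables ERROR, WARNING and FIXME messages for all categories.
# Replace <pipeline> with the full pipeline from the posts above.
GST_DEBUG=3 gst-launch-1.0 <pipeline> 2> gst_log.txt
```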

For the BSP version, you can use the command below:
cat /etc/nv_tegra_release
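A slightly more defensive version of that check (a sketch; the missing-file branch is there because, as reported below, the file can be absent on some images):

```shell
# Print the L4T/BSP release string; on JetPack images the first line looks
# like "# R32 (release), REVISION: 2.1, ...". Fall back to a message if
# the file does not exist on this image.
if [ -f /etc/nv_tegra_release ]; then
  head -n 1 /etc/nv_tegra_release
else
  echo "/etc/nv_tegra_release not found"
fi
```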

BTW, what DeepStream version are you running?

If DS is 4.0.1, then use this BSP:
sdkmanager --server https://sdkm-a.nvidia.com/sdkm/server/builds/JetPack/326J/sdkm-config/main/sdkml1_repo.json
or
JetPack 4.2.2 Final Build (b21):

Latest deepstream: 4.0.1

cat /etc/nv_tegra_release : /etc/nv_tegra_release does not exist on the nano?!?

I don’t have a Xavier to test on, unfortunately. For my project I really need this working on the Nano.

With GST_DEBUG=3 logs: see attached, as the output is too big to paste (and I only captured for about 3 seconds).

gst_log.txt (233 KB)

How can I tell what JetPack version I currently have? I downloaded the image from https://www.developer.nvidia.com/jetson-nano-sd-card-image-r3221

That’s the current link from the Jetson Nano Getting Started website.

Are you saying that if I reflash my nano’s SD card with this one it will fix the issue:
https://sdkm-a.nvidia.com/builds/SDKManager/JetPack_SDKs/4.2.2/L4T/21_19235_27098813/JETPACK_422_b21/

?

Also - I cannot access the links you’ve posted - are they on a public server? I can’t seem to reach sdkm-a.nvidia.com.

P.S. I’ve noticed that if I swap out nvv4l2h265enc for x265enc, I don’t get the strange block error; instead it says "Redistribute latency" and then hangs…

The link you provided doesn’t work externally. From the Jetson downloads website I can see this as the latest version:

https://developer.nvidia.com/jetson-nano-sd-card-image-r3221

How can I tell if that contains JetPack 4.2.2 b21?

Would be good to know if anyone is looking into this issue or whether I should change to some other way of doing things?

I managed to get your sample pipeline (the one that saves to a file) working on the Nano, BUT only after I applied the library fixes provided here:

https://devtalk.nvidia.com/default/topic/1058086/deepstream-sdk/how-to-run-rtp-camera-in-deepstream-on-nano/3

And this is with the latest BSP available from the Jetson downloads site.

So now I can save to a file but am still unable to use kvssink. And of course the AWS support people say it’s an NVIDIA issue. ;-)

What I have also tried is:

Run deepstream-app and set the sink to RTSP.
Then, in a second terminal, run another pipeline that reads the RTSP stream from deepstream-app: rtph264depay ! video/x-h264,format=avc,alignment=au ! h264parse ! kvssink stream-name=test

This kinesis pipeline is a sample from the aws kinesis website.
So with this double pipeline running it works.
Why doesn’t my original pipeline work?
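For clarity, that working two-terminal setup looks roughly like this (the port, mount point, and config file name here are illustrative placeholders, not the exact values I used):

```shell
# Terminal 1: deepstream-app with its sink set to RTSP in the config file,
# serving the inferred/encoded stream over RTSP.
deepstream-app -c deepstream_app_config.txt

# Terminal 2: re-ingest that RTSP stream and hand it to kvssink.
gst-launch-1.0 rtspsrc location=rtsp://127.0.0.1:8554/ds-test ! \
  rtph264depay ! video/x-h264,format=avc,alignment=au ! \
  h264parse ! kvssink stream-name=test
```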