AV1/RTSP Streaming Not Working

Hi,
Please clean the cache and see if it helps:

$ rm .cache/gstreamer-1.0/registry.aarch64.bin

And please try the command:

$ gst-launch-1.0 videotestsrc is-live=1 ! video/x-raw,width=1280,height=720 ! timeoverlay valignment=4 halignment=1 ! nvvidconv ! 'video/x-raw(memory:NVMM),width=1280,height=720' ! nvv4l2av1enc ! av1parse ! rtpav1pay pt=96 ! udpsink host=127.0.0.1 port=5003 sync=0

Hi,

After clearing the cache and running the command I still get the same error.

$ gst-launch-1.0 videotestsrc is-live=1 ! video/x-raw,width=1280,height=720 ! timeoverlay valignment=4 halignment=1 ! nvvidconv ! 'video/x-raw(memory:NVMM),width=1280,height=720' ! nvv4l2av1enc ! av1parse ! rtpav1pay pt=96 ! udpsink host=127.0.0.1 port=5003 sync=0
libEGL warning: egl: failed to create dri2 screen
libEGL warning: DRI2: failed to get drm magic

(gst-plugin-scanner:5085): GStreamer-WARNING **: 09:51:17.137: adding type GstEvent multiple times
libEGL warning: egl: failed to create dri2 screen
libEGL warning: DRI2: failed to get drm magic
Setting pipeline to PAUSED …
libv4l2: error getting capabilities: Inappropriate ioctl for device
ERROR: from element /GstPipeline:pipeline0/nvv4l2av1enc:nvv4l2av1enc0: Error getting capabilities for device ‘/dev/v4l2-nvenc’: It isn’t a v4l2 driver. Check if it is a v4l1 driver.
Additional debug info:
/dvs/git/dirty/git-master_linux/3rdparty/gst/gst-v4l2/gst-v4l2/v4l2_calls.c(107): gst_v4l2_get_capabilities (): /GstPipeline:pipeline0/nvv4l2av1enc:nvv4l2av1enc0:
system error: Inappropriate ioctl for device
ERROR: pipeline doesn’t want to preroll.
ERROR: from element /GstPipeline:pipeline0/nvv4l2av1enc:nvv4l2av1enc0: Could not initialize supporting library.
Additional debug info:
…/gst-libs/gst/video/gstvideoencoder.c(1797): gst_video_encoder_change_state (): /GstPipeline:pipeline0/nvv4l2av1enc:nvv4l2av1enc0:
Failed to open encoder
ERROR: pipeline doesn’t want to preroll.
Failed to set pipeline to PAUSED.
Setting pipeline to NULL …
Freeing pipeline …

Thank you

Hello. The library works now; it was a read-permission problem, my fault, sorry.
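
For anyone hitting the same error: in my case the replacement library simply was not readable after copying it over. A check along these lines is enough; the path below is only an example, it depends on where the lib was copied and on the JetPack release:

$ ls -l /usr/lib/aarch64-linux-gnu/nvidia/libtegrav4l2.so
$ sudo chmod a+r /usr/lib/aarch64-linux-gnu/nvidia/libtegrav4l2.so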

But I still haven't solved the initial problem.

When the streaming is started with the client already connected, everything works fine with nvv4l2av1enc.

But when the client disconnects and reconnects, or connects after the server has already started streaming, it cannot recover the video: it receives video data but cannot reconstruct it for display.

I thought the previous library fixed this problem. Is that not the case?
Do I need to add any parameters to nvv4l2av1enc?
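
For reference, I have been listing the properties that nvv4l2av1enc exposes with gst-inspect-1.0, but I am not sure which of them, if any, is relevant here:

$ gst-inspect-1.0 nvv4l2av1enc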

Thanks again for the support.

Hi,
Please make sure you use the lib:

$ md5sum libtegrav4l2.so
6dca2f9a0e3c8c8d28854472dbf5fffa  libtegrav4l2.so

If you use this lib in UDP streaming, the client should be able to connect/disconnect anytime.

Hello

Indeed, with the new library, when the server and client are run as follows:

Server:
gst-launch-1.0 videotestsrc is-live=1 ! video/x-raw,width=1280,height=720 ! timeoverlay valignment=4 halignment=1 ! nvvidconv ! 'video/x-raw(memory:NVMM),width=1280,height=720' ! nvv4l2av1enc idrinterval=15 ! av1parse ! rtpav1pay ! udpsink host= port=5000 sync=0
Client:
gst-launch-1.0 udpsrc port=5000 ! 'application/x-rtp,encoding-name=AV1,payload=96' ! rtpav1depay ! decodebin ! autovideoconvert ! queue max-size-bytes=2048 ! autovideosink

It works as described above: if the client disconnects, the video is recovered on the next reconnection.

Our infrastructure needs one source serving multiple clients, so we use janus-webrtc.
The chain would therefore be:

Streaming server → janus-webrtc → web client

Process

  1. We send Janus the source:
    gst-launch-1.0 videotestsrc is-live=1 ! video/x-raw,width=1280,height=720 ! timeoverlay valignment=4 halignment=1 ! nvvidconv ! 'video/x-raw(memory:NVMM),width=1280,height=720' ! nvv4l2av1enc idrinterval=15 ! av1parse ! rtpav1pay ! udpsink host= port= sync=0

  2. Janus routes it to connected clients.

  3. Clients receive the video.

The problem we have (also with the new library) is that if the client connects to a session where streaming has already started, it cannot see the video. However, if the client connects to the session before the streaming starts, once it starts it sees the video without problems.

It seems that the encode/decode chain set up through GStreamer with the NVIDIA libraries works well, but when it is connected to a browser the configuration information (I think the SPS, or its AV1 equivalent) is only sent in the first frames.

This same process works perfectly with NVIDIA Xavier AGX and VP9, and with H264 too. But with Orin and AV1 it only works if the client is connected from the start of the streaming.

This architecture works with:

NVIDIA XAVIER AGX H264 → OK
NVIDIA XAVIER AGX VP9 → OK
NVIDIA ORIN NANO H264 → OK
NVIDIA ORIN AGX H264 → OK
NVIDIA ORIN AGX AV1 → Only if the client connects before the streaming starts.

These are client web negotiations:

v=0
o=- 6398413203353271427 2 IN IP4 127.0.0.1
s=-
t=0 0
a=group:BUNDLE cam00DD
a=extmap-allow-mixed
a=msid-semantic: WMS
m=video 9 UDP/TLS/RTP/SAVPF 96 97
c=IN IP4 0.0.0.0
a=rtcp:9 IN IP4 0.0.0.0
a=ice-ufrag:s+jY
a=ice-pwd:vAAYN3ohcYvPt3p9XE3T9upa
a=ice-options:trickle
a=fingerprint:sha-256 C3:3C:9C:95:01:9B:1C:AE:FA:3F:FA:58:BF:1C:51:74:AA:70:77:3D:53:4D:80:00:63:9E:A6:A2:BA:E1:5D:92
a=setup:active
a=mid:cam00DD
a=extmap:2 http://www.webrtc.org/experiments/rtp-hdrext/abs-send-time
a=extmap:4 urn:ietf:params:rtp-hdrext:sdes:mid
a=recvonly
a=rtcp-mux
a=rtpmap:96 AV1/90000
a=rtcp-fb:96 goog-remb
a=rtcp-fb:96 transport-cc
a=rtcp-fb:96 ccm fir
a=rtcp-fb:96 nack
a=rtcp-fb:96 nack pli
a=fmtp:96 level-idx=5;profile=0;tier=0
a=rtpmap:97 rtx/90000
a=fmtp:97 apt=96

v=0
o=- 1718184265790030 1 IN IP4 2.139.234.175
s=Mountpoint 2
t=0 0
a=group:BUNDLE cam00DD
a=ice-options:trickle
a=fingerprint:sha-256 31:26:BD:51:4E:E1:19:36:2A:A4:B3:77:F4:D1:58:B5:F1:0B:39:66:32:80:84:14:95:93:57:07:6F:31:02:88
a=extmap-allow-mixed
a=msid-semantic: WMS *
m=video 9 UDP/TLS/RTP/SAVPF 96 97
c=IN IP4 2.139.234.175
a=sendonly
a=mid:cam00DD
a=rtcp-mux
a=ice-ufrag:/kdE
a=ice-pwd:LelXSQ0u+XImX+nVzexhw5
a=ice-options:trickle
a=setup:actpass
a=rtpmap:96 AV1/90000
a=rtcp-fb:96 ccm fir
a=rtcp-fb:96 nack
a=rtcp-fb:96 nack pli
a=rtcp-fb:96 goog-remb
a=rtcp-fb:96 transport-cc
a=extmap:2 http://www.webrtc.org/experiments/rtp-hdrext/abs-send-time
a=extmap:4 urn:ietf:params:rtp-hdrext:sdes:mid
a=rtpmap:97 rtx/90000
a=fmtp:97 apt=96
a=ssrc-group:FID 1326238975 2304564628
a=msid:janus januscam00DD
a=ssrc:1326238975 cname:janus
a=ssrc:2304564628 cname:janus
a=candidate:1 1 udp 2015363327 2.139.234.175 24303 typ host
a=candidate:2 1 udp 2015363583 2.139.234.175 28889 typ host
a=end-of-candidates

Any recommendation?

Thanks

Hi,
We don't have experience with janus-webrtc. It looks like certain information has to be added to the AV1 stream for these frameworks. Please check if you can identify what information is required so that we can add it to the AV1 stream.

Hi.

I have carried out tests with the H264 codec, and the insert-sps-pps parameter is what controls the same problem with that codec.
The following pipeline works correctly: if the web client disconnects and reconnects, the video is shown without problems.

gst-launch-1.0 videotestsrc is-live=1 ! video/x-raw,width=1280,height=720 ! timeoverlay valignment=4 halignment=1 ! nvvidconv ! 'video/x-raw(memory:NVMM),width=1280,height=720' ! nvv4l2h264enc maxperf-enable=1 disable-cabac=true idrinterval=5 insert-sps-pps=true ! video/x-h264 ! rtph264pay ! udpsink host=<IP JANUS> port=<PORT JANUS>

If we remove the parameter, or set insert-sps-pps=false, the same problems appear as with AV1: if the web client disconnects and reconnects, it receives a stream but the browser does not display the video.
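
For completeness, the variant that reproduces the problem is the same pipeline with the parameter disabled:

gst-launch-1.0 videotestsrc is-live=1 ! video/x-raw,width=1280,height=720 ! timeoverlay valignment=4 halignment=1 ! nvvidconv ! 'video/x-raw(memory:NVMM),width=1280,height=720' ! nvv4l2h264enc maxperf-enable=1 disable-cabac=true idrinterval=5 insert-sps-pps=false ! video/x-h264 ! rtph264pay ! udpsink host=<IP JANUS> port=<PORT JANUS>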

From what I have been able to test (statistics by type), the "codec" information node is only generated when we connect the client first and then start the source on the server.

NOTE: Information obtained from the edge browser debugger for webrtc (edge://webrtc-internals)

codec (mimeType=video/AV1, payloadType=96, id=CITcam00DD1_96_level-idx=5;profile=0;tier=0)
Statistics CITcam00DD1_96_level-idx=5;profile=0;tier=0
timestamp	13/6/2024, 15:24:16
transportId	Tcam00DD1
payloadType	96
mimeType	video/AV1
clockRate	90000
sdpFmtpLine	level-idx=5;profile=0;tier=0

If the source on the server is already broadcasting and we then connect the client, the "codec" node never appears in the webrtc debug trace.

Browser RTP statistics when the client connects after the server source has started (no video):

inbound-rtp (kind=video, mid=cam00DD, ssrc=1055871779, rtxSsrc=3497071643, id=ITcam00DD1V1055871779)
Statistics ITcam00DD1V1055871779
timestamp	14/6/2024, 14:15:04
ssrc	1055871779
kind	video
transportId	Tcam00DD1
jitter	0.004
packetsLost	0
trackIdentifier	januscam00DD
mid	cam00DD
packetsReceived	21178
[packetsReceived/s]	396.897612758397
bytesReceived	26350674
[bytesReceived_in_bits/s]	3950740.8317232314
headerBytesReceived	592984
[headerBytesReceived_in_bits/s]	88905.06525788092
rtxSsrc	3497071643
lastPacketReceivedTimestamp	1718367304593.996
[lastPacketReceivedTimestamp]	14/6/2024, 14:15:04
jitterBufferDelay	0
[jitterBufferDelay/jitterBufferEmittedCount_in_ms]	0
jitterBufferTargetDelay	0
[jitterBufferTargetDelay/jitterBufferEmittedCount_in_ms]	0
jitterBufferMinimumDelay	0
[jitterBufferMinimumDelay/jitterBufferEmittedCount_in_ms]	0
jitterBufferEmittedCount	0
framesReceived	0
[framesReceived/s]	0
[framesReceived-framesDecoded-framesDropped]	0
framesDecoded	0
[framesDecoded/s]	0
keyFramesDecoded	0
[keyFramesDecoded/s]	0
framesDropped	0
totalDecodeTime	0
[totalDecodeTime/framesDecoded_in_ms]	0
totalProcessingDelay	0
[totalProcessingDelay/framesDecoded_in_ms]	0
totalAssemblyTime	0
[totalAssemblyTime/framesAssembledFromMultiplePackets_in_ms]	0
framesAssembledFromMultiplePackets	0
totalInterFrameDelay	0
[totalInterFrameDelay/framesDecoded_in_ms]	0
totalSquaredInterFrameDelay	0
[interFrameDelayStDev_in_ms]	0
pauseCount	0
totalPausesDuration	0
freezeCount	0
totalFreezesDuration	0
firCount	0
pliCount	241
nackCount	0
minPlayoutDelay	0
remoteId	ROA1055871779

Browser RTP statistics when the client connects before the server source is activated (video OK):

inbound-rtp (kind=video, mid=cam00DD, ssrc=1650625963, rtxSsrc=1117715192, decoderImplementation=ExternalDecoder (D3D11VideoDecoder), powerEfficientDecoder=true, [codec]=AV1 (96, level-idx=5;profile=0;tier=0), id=ITcam00DD1V1650625963)
Statistics ITcam00DD1V1650625963
timestamp	14/6/2024, 14:21:39
ssrc	1650625963
kind	video
transportId	Tcam00DD1
jitter	0.005
packetsLost	0
trackIdentifier	januscam00DD
mid	cam00DD
packetsReceived	27717
[packetsReceived/s]	392.5681934550618
bytesReceived	34507337
[bytesReceived_in_bits/s]	3919432.9717985354
headerBytesReceived	776076
[headerBytesReceived_in_bits/s]	87935.27533393384
rtxSsrc	1117715192
jitterBufferDelay	34.660939
[jitterBufferDelay/jitterBufferEmittedCount_in_ms]	16.6955333333334
jitterBufferTargetDelay	23.860934999999998
[jitterBufferTargetDelay/jitterBufferEmittedCount_in_ms]	11.452666666666644
jitterBufferMinimumDelay	23.860934999999998
[jitterBufferMinimumDelay/jitterBufferEmittedCount_in_ms]	11.452666666666644
jitterBufferEmittedCount	2059
framesReceived	2059
[framesReceived/s]	30.120321748470214
[framesReceived-framesDecoded-framesDropped]	0
framesDecoded	2059
[framesDecoded/s]	30.120321748470214
keyFramesDecoded	1
[keyFramesDecoded/s]	0
framesDropped	0
totalDecodeTime	7.364809999999999
[totalDecodeTime/framesDecoded_in_ms]	4.729299999999981
totalProcessingDelay	42.593056
[totalProcessingDelay/framesDecoded_in_ms]	21.609166666666606
totalAssemblyTime	4.664156999999999
[totalAssemblyTime/framesAssembledFromMultiplePackets_in_ms]	2.0894666666666653
framesAssembledFromMultiplePackets	2059
totalInterFrameDelay	68.57
[totalInterFrameDelay/framesDecoded_in_ms]	33.39999999999985
totalSquaredInterFrameDelay	2.5700879999999944
[interFrameDelayStDev_in_ms]	15.370100845474685
pauseCount	0
totalPausesDuration	0
freezeCount	0
totalFreezesDuration	0
firCount	0
pliCount	0
nackCount	0
minPlayoutDelay	0
codecId	CITcam00DD1_96_level-idx=5;profile=0;tier=0
[codec]	AV1 (96, level-idx=5;profile=0;tier=0)
lastPacketReceivedTimestamp	1718367699387.669
[lastPacketReceivedTimestamp]	14/6/2024, 14:21:39
frameWidth	1280
frameHeight	720
framesPerSecond	30
decoderImplementation	ExternalDecoder (D3D11VideoDecoder)
powerEfficientDecoder	true
remoteId	ROA1650625963

It seems that keyframes are only sent once, at the start of the server stream.

How can I check if the encoder is generating keyframes regularly?
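
One idea I have for checking this, as a sketch: insert an identity element after the encoder and run gst-launch-1.0 with -v. With silent=false, identity prints the flags of each buffer in its last-message output, and the buffers without the delta-unit flag should be the keyframes:

$ gst-launch-1.0 -v videotestsrc is-live=1 num-buffers=150 ! video/x-raw,width=1280,height=720 ! nvvidconv ! 'video/x-raw(memory:NVMM),width=1280,height=720' ! nvv4l2av1enc idrinterval=15 ! identity silent=false ! fakesink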

Thanks

Hi,
I have already found the problem.
The rtpav1pay plugin was not sending the information correctly for WebRTC. It worked fine for streaming outside WebRTC, but not in this case.

Thank you very much

Hi,
Thanks for sharing the information. It would be great if you could report it to the GStreamer community and see if the rtpav1pay plugin can be updated to support this use case in the future.

Hi,

I use the rtp plugin from gst-plugins-rs, which includes "rtpav1pay":

Compile and install the latest version and you’re done.
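
In case it is useful, this is roughly how I build it, assuming the usual gst-plugins-rs workflow with cargo-c (the crate name and target directory below are from memory and may differ on your setup):

$ git clone https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs.git
$ cd gst-plugins-rs
$ cargo install cargo-c
$ cargo cbuild -p gst-plugin-rtp --release
$ export GST_PLUGIN_PATH=$(pwd)/target/aarch64-unknown-linux-gnu/release:$GST_PLUGIN_PATH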

