How to view Jetson Nano live video on an Android OS device

Hi,
I am using the following command to stream live video from a camera to an Android device over a Wi-Fi connection:

gst-launch-1.0 -v v4l2src device=/dev/video0 \
! "video/x-raw, format=(string)UYVY, width=(int)2592, height=(int)1944,framerate=24/1" \
! nvvidconv \
! "video/x-raw(memory:NVMM),format=(string)I420" \
! nvv4l2h264enc maxperf-enable=true insert-vui=true insert-sps-pps=1 bitrate=10000000 \
! h264parse \
! rtph264pay \
! udpsink clients=192.168.18.18:38298 sync=false

The Jetson Nano's Wi-Fi is set up as an access point, and the Android device connects to the Jetson Nano with a static IP.
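For reference, one way to set up such a hotspot (assuming NetworkManager is used on the Nano; the interface name, SSID, and password below are placeholders):

# Create a Wi-Fi access point on the Jetson Nano (hypothetical SSID/password)
nmcli device wifi hotspot ifname wlan0 ssid JetsonAP password "12345678"

The static IP on the Android side is then configured in the phone's Wi-Fi settings.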

I installed RaspberryPi Camera Viewer on my Android device as a test viewer and entered the following GStreamer pipeline in the app:

gst-launch-1.0 udpsrc port=38298 ! application/x-rtp, media=video, clock-rate=90000, encoding-name=(string)H264, payload=96 ! rtph264depay ! queue ! avdec_h264 ! fpsdisplaysink sync=false

I can view the video in the app, but the frame rate is only 7 fps on average. When an object moves in front of the camera, the video quality is very bad.

With the same live stream from the Jetson Nano and the same display pipeline on a Windows 10 PC, I can achieve 24 fps with good video quality.

The Android device (connected to a router over Wi-Fi) can play 4K YouTube video smoothly, but it cannot handle the live video streamed directly from the Jetson Nano over Wi-Fi.
What is wrong, and how can I resolve this issue?
Any suggestions?
Thank you very much.

CX

Hi,
In UDP streaming, we can achieve the target frame rate and negligible latency when running this test case:
Gstreamer TCPserversink 2-3 seconds latency - #5 by DaneLLL
Please give it a try and check if you can achieve 30 fps.

In your GStreamer command, if you have confirmed that the source can steadily output 24 fps, the bottleneck may be the copy from the CPU buffer to the NVMM buffer in the nvvidconv plugin. You may use the nvv4l2camerasrc plugin, which captures frame data into NVMM buffers directly and eliminates the buffer copy.
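For example, your pipeline could be adapted to nvv4l2camerasrc along these lines (same device, resolution, bitrate, and client address as above; adjust the resolution and bitrate as needed for your setup):

gst-launch-1.0 -v nvv4l2camerasrc device=/dev/video0 \
! "video/x-raw(memory:NVMM), format=(string)UYVY, width=(int)2592, height=(int)1944, framerate=24/1" \
! nvvidconv \
! "video/x-raw(memory:NVMM), format=(string)I420" \
! nvv4l2h264enc maxperf-enable=true insert-vui=true insert-sps-pps=1 bitrate=10000000 \
! h264parse \
! rtph264pay \
! udpsink clients=192.168.18.18:38298 sync=false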

Thank you for your kind reply.
I tested the linked test case with a minor modification, as shown below:

gst-launch-1.0 videotestsrc is-live=1 ! video/x-raw,width=1280,height=720 ! timeoverlay valignment=4 halignment=1 ! nvvidconv ! 'video/x-raw(memory:NVMM),width=1280,height=720' ! tee name=t ! nvv4l2h264enc insert-sps-pps=1 ! h264parse ! rtph264pay ! udpsink clients=192.168.18.18:38298 sync=0 t. ! queue ! nvegltransform ! nveglglessink sync=0

It achieved 30 fps on average, but the current frame rate varied from 25 to 35 fps on my Android phone.

Then I changed the frame size from 1280 x 720 to 2592 x 1944, the resolution I need. The command is below:

gst-launch-1.0 videotestsrc is-live=1 ! video/x-raw,width=2592,height=1944 ! timeoverlay valignment=4 halignment=1 ! nvvidconv ! 'video/x-raw(memory:NVMM),width=2592,height=1944' ! tee name=t ! nvv4l2h264enc insert-sps-pps=1 ! h264parse ! rtph264pay ! udpsink clients=192.168.18.18:38298 sync=0 t. ! queue ! nvegltransform ! nveglglessink sync=0

It can only achieve 13.5 fps on average and varies from 10 to 15 fps on my Android phone.

Hi DaneLLL,
I tried to use nvv4l2camerasrc, but it cannot produce a proper video image at a frame size of 2592 x 1944.
This is my test command:

gst-launch-1.0 nvv4l2camerasrc device=/dev/video0   ! 'video/x-raw(memory:NVMM), format=(string)UYVY, width=(int)2592, height=(int)1944, framerate=(fraction)24/1' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)NV12' ! nv3dsink -e

This is the incorrect video frame

When I change to a smaller frame size, nvv4l2camerasrc works fine.
This is the command:

gst-launch-1.0 nvv4l2camerasrc device=/dev/video0   ! 'video/x-raw(memory:NVMM), format=(string)UYVY, width=(int)2560, height=(int)1440, framerate=(fraction)32/1' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)NV12' ! nv3dsink -e

This is the captured correct frame:

My camera is an e-con Systems See3CAM_CU55 USB 3.0 camera.
Is this an issue with nvv4l2camerasrc?
Thanks

Hi,
If you use nvv4l2camerasrc and run at 2560x1440, can you achieve 30 fps on the Android phone?

NvBuffer is a hardware DMA buffer and has alignment constraints at some resolutions. It looks like 2592x1944 is not a working resolution, so you can only use v4l2src for that resolution.

Jetson Nano has limited CPU capability. If the phone can decode the H264 stream at 2560x1440p30, the bottleneck is probably the CPU on the Nano.
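To check for a CPU bottleneck, you can run tegrastats in a second terminal while the streaming pipeline is active and watch the per-core CPU load:

sudo tegrastats

If the CPU cores stay near 100% while streaming at 2560x1440, the pipeline is CPU-bound.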

Hi,
This is the command I tested:

gst-launch-1.0 -v nvv4l2camerasrc device=/dev/video0 \
        ! "video/x-raw(memory:NVMM), format=(string)UYVY, width=(int)2560, height=(int)1440,framerate=32/1" \
! nvvidconv \
! "video/x-raw(memory:NVMM),format=(string)I420" \
! nvv4l2h264enc maxperf-enable=true insert-vui=true insert-sps-pps=1 bitrate=10000000 \
! h264parse \
! rtph264pay \
! udpsink clients=192.168.18.18:38298 sync=false

It can only achieve 14 fps on my Android device.

Hi,
You may try a lower bitrate, such as 4 Mbps, and see if it can achieve 30 fps. If it can, the bottleneck is probably the network bandwidth.

If it is still 14 fps, it is very likely that the phone cannot decode at 2560x1440p30.
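For example, the same sender pipeline with only the bitrate lowered to 4 Mbps:

gst-launch-1.0 -v nvv4l2camerasrc device=/dev/video0 \
        ! "video/x-raw(memory:NVMM), format=(string)UYVY, width=(int)2560, height=(int)1440,framerate=32/1" \
! nvvidconv \
! "video/x-raw(memory:NVMM),format=(string)I420" \
! nvv4l2h264enc maxperf-enable=true insert-vui=true insert-sps-pps=1 bitrate=4000000 \
! h264parse \
! rtph264pay \
! udpsink clients=192.168.18.18:38298 sync=false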

Hi
I tried a bitrate of 1 Mbps. My smartphone can only achieve 17 fps. I also tried a Samsung Galaxy Tab A (SM-T580) tablet, and it can only achieve 7 fps (even worse).
If it were an issue with the Android devices' Wi-Fi adapters, why do both devices have no problem playing 4K YouTube video?
If it were an issue with the Jetson Nano's Wi-Fi adapter (Intel Dual Band Wireless-AC 8265), why can my laptop achieve the frame rate I expect?
Only one Android device is connected to the Jetson Nano over Wi-Fi at a time, and the distance is less than 1 m, so it does not look like a network bandwidth issue either.
Any suggestion? Thanks

CX

Hi,
You can achieve 720p30 decoding on the phone when running a 720p30 source on the Jetson Nano.

For 2560x1440 it is only 17 fps. Could you check whether the source achieves 32 fps:

gst-launch-1.0 -v nvv4l2camerasrc device=/dev/video0 \
        ! "video/x-raw(memory:NVMM), format=(string)UYVY, width=(int)2560, height=(int)1440,framerate=32/1" \
! nvvidconv \
! "video/x-raw(memory:NVMM),format=(string)NV12" \
! fpsdisplaysink text-overlay=0 video-sink=fakesink sync=0

And whether the encoder can achieve 32 fps:

gst-launch-1.0 -v nvv4l2camerasrc device=/dev/video0 \
        ! "video/x-raw(memory:NVMM), format=(string)UYVY, width=(int)2560, height=(int)1440,framerate=32/1" \
! nvvidconv \
! "video/x-raw(memory:NVMM),format=(string)NV12" \
! nvv4l2h264enc maxperf-enable=true insert-vui=true insert-sps-pps=1 bitrate=10000000 \
! fpsdisplaysink text-overlay=0 video-sink=fakesink sync=0

This is my test output for the source-check pipeline:

Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0: sync = false
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstNvV4l2CameraSrc:nvv4l2camerasrc0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)2560, height=(int)1440, format=(string)UYVY, interlace-mode=(string)progressive, framerate=(fraction)32/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)2560, height=(int)1440, format=(string)UYVY, interlace-mode=(string)progressive, framerate=(fraction)32/1
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)2560, height=(int)1440, interlace-mode=(string)progressive, framerate=(fraction)32/1, format=(string)NV12
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)2560, height=(int)1440, interlace-mode=(string)progressive, framerate=(fraction)32/1, format=(string)NV12
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0.GstGhostPad:sink.GstProxyPad:proxypad0: caps = video/x-raw(memory:NVMM), width=(int)2560, height=(int)1440, interlace-mode=(string)progressive, framerate=(fraction)32/1, format=(string)NV12
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)2560, height=(int)1440, interlace-mode=(string)progressive, framerate=(fraction)32/1, format=(string)NV12
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0.GstGhostPad:sink: caps = video/x-raw(memory:NVMM), width=(int)2560, height=(int)1440, interlace-mode=(string)progressive, framerate=(fraction)32/1, format=(string)NV12
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)2560, height=(int)1440, interlace-mode=(string)progressive, framerate=(fraction)32/1, format=(string)NV12
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)2560, height=(int)1440, format=(string)UYVY, interlace-mode=(string)progressive, framerate=(fraction)32/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)2560, height=(int)1440, format=(string)UYVY, interlace-mode=(string)progressive, framerate=(fraction)32/1
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0: sync = false
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 18, dropped: 0, current: 35.17, average: 35.17
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 34, dropped: 0, current: 31.97, average: 33.59
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 51, dropped: 0, current: 32.38, average: 33.17
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 67, dropped: 0, current: 31.99, average: 32.88
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 84, dropped: 0, current: 31.49, average: 32.59
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 100, dropped: 0, current: 31.99, average: 32.49
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 117, dropped: 0, current: 32.00, average: 32.42
^Chandling interrupt.
Interrupt: Stopping pipeline ...
Execution ended after 0:00:04.302056297
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

This is my test output for the encoder-check pipeline:

ion@ion-desktop:~$ gst-launch-1.0 -v nvv4l2camerasrc device=/dev/video0 \
>         ! "video/x-raw(memory:NVMM), format=(string)UYVY, width=(int)2560, height=(int)1440,framerate=32/1" \
> ! nvvidconv \
> ! "video/x-raw(memory:NVMM),format=(string)NV12" \
> ! nvv4l2h264enc maxperf-enable=true insert-vui=true insert-sps-pps=1 bitrate=10000000 \
> ! fpsdisplaysink text-overlay=0 video-sink=fakesink sync=0
Setting pipeline to PAUSED ...
Opening in BLOCKING MODE 
Pipeline is live and does not need PREROLL ...
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0: sync = false
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstNvV4l2CameraSrc:nvv4l2camerasrc0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)2560, height=(int)1440, format=(string)UYVY, interlace-mode=(string)progressive, framerate=(fraction)32/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)2560, height=(int)1440, format=(string)UYVY, interlace-mode=(string)progressive, framerate=(fraction)32/1
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)2560, height=(int)1440, interlace-mode=(string)progressive, framerate=(fraction)32/1, format=(string)NV12
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)2560, height=(int)1440, interlace-mode=(string)progressive, framerate=(fraction)32/1, format=(string)NV12
/GstPipeline:pipeline0/nvv4l2h264enc:nvv4l2h264enc0.GstPad:src: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, profile=(string)NULL, level=(string)NULL, width=(int)2560, height=(int)1440, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)32/1, interlace-mode=(string)progressive, colorimetry=(string)bt709, chroma-site=(string)mpeg2
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0.GstGhostPad:sink.GstProxyPad:proxypad0: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, profile=(string)NULL, level=(string)NULL, width=(int)2560, height=(int)1440, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)32/1, interlace-mode=(string)progressive, colorimetry=(string)bt709, chroma-site=(string)mpeg2
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0.GstPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, profile=(string)NULL, level=(string)NULL, width=(int)2560, height=(int)1440, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)32/1, interlace-mode=(string)progressive, colorimetry=(string)bt709, chroma-site=(string)mpeg2
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0.GstGhostPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, profile=(string)NULL, level=(string)NULL, width=(int)2560, height=(int)1440, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)32/1, interlace-mode=(string)progressive, colorimetry=(string)bt709, chroma-site=(string)mpeg2
Redistribute latency...
NvMMLiteOpen : Block : BlockType = 4 
===== NVMEDIA: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 4 
/GstPipeline:pipeline0/nvv4l2h264enc:nvv4l2h264enc0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)2560, height=(int)1440, interlace-mode=(string)progressive, framerate=(fraction)32/1, format=(string)NV12
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)2560, height=(int)1440, interlace-mode=(string)progressive, framerate=(fraction)32/1, format=(string)NV12
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)2560, height=(int)1440, format=(string)UYVY, interlace-mode=(string)progressive, framerate=(fraction)32/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)2560, height=(int)1440, format=(string)UYVY, interlace-mode=(string)progressive, framerate=(fraction)32/1
H264: Profile = 66, Level = 0 
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstFakeSink:fakesink0: sync = false
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 20, dropped: 0, current: 39.53, average: 39.53
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 37, dropped: 0, current: 32.00, average: 35.67
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 54, dropped: 0, current: 32.23, average: 34.51
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 71, dropped: 0, current: 32.27, average: 33.95
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 88, dropped: 0, current: 31.50, average: 33.44
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 105, dropped: 0, current: 32.03, average: 33.21
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0: last-message = rendered: 121, dropped: 0, current: 31.84, average: 33.02
^Chandling interrupt.
Interrupt: Stopping pipeline ...
Execution ended after 0:00:04.602392529
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

Both pipelines achieved 32 fps on average.
What else can I do?

You may try increasing the buffer-size property of udpsink:

... ! udpsink clients=192.168.18.18:38298 buffer-size=33554432 sync=false

Thank you for your suggestion. I tested it, but nothing improved.
Could the issue be the Android app, RaspberryPi Camera Viewer?

Hi,
If you have one more Jetson Nano (or any other Jetson platform), you may, as a comparison, stream to that Jetson and decode there by running the commands:

$ export DISPLAY=:0
$ gst-launch-1.0 udpsrc port=5000 ! 'application/x-rtp,encoding-name=H264,payload=96' ! rtph264depay ! h264parse ! nvv4l2decoder ! fpsdisplaysink text-overlay=0 video-sink=nv3dsink sync=0 -v

See if you can achieve the target fps on the Jetson Nano.

I will test it when another Jetson Nano is available.

I got another Jetson Nano developer kit, flashed the SD card image jetson-nano-jp46-sd-card-image (sd-blob-b01.img), and tried your pipeline, but it failed.
This is the GStreamer output:

ion@ion-desktop:~$ gst-launch-1.0 udpsrc port=38298 ! "application/x-rtp, encode-name=H264,payload=96" ! rtph264depay ! h264parse ! nvv4l2decoder  fpsdisplaysink text-overlay=0 video-sink=nv3dsink sync=0 -v
Setting pipeline to PAUSED ...
Opening in BLOCKING MODE 
Pipeline is live and does not need PREROLL ...
/GstPipeline:pipeline0/GstFPSDisplaySink:fpsdisplaysink0/GstNv3dSink:nv3dsink0: sync = false
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = application/x-rtp, encode-name=(string)H264, payload=(int)96, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264
/GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0.GstPad:src: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:src: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, parsed=(boolean)true
NvMMLiteOpen : Block : BlockType = 261 
NVMEDIA: Reading vendor.tegra.display-size : status: 6 
NvMMLiteBlockCreate : Block : BlockType = 261 
/GstPipeline:pipeline0/nvv4l2decoder:nvv4l2decoder0.GstPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, parsed=(boolean)true
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au
/GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0.GstPad:sink: caps = application/x-rtp, encode-name=(string)H264, payload=(int)96, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:src: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, width=(int)2592, height=(int)1944, framerate=(fraction)24/1, interlace-mode=(string)progressive, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, parsed=(boolean)true, profile=(string)constrained-baseline, level=(string)5

(gst-launch-1.0:13705): GStreamer-CRITICAL **: 10:15:28.162: gst_mini_object_unref: assertion 'mini_object != NULL' failed
NvMMLiteOpen : Block : BlockType = 261 
NVMEDIA: Reading vendor.tegra.display-size : status: 6 
NvMMLiteBlockCreate : Block : BlockType = 261 
/GstPipeline:pipeline0/nvv4l2decoder:nvv4l2decoder0.GstPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, width=(int)2592, height=(int)1944, framerate=(fraction)24/1, interlace-mode=(string)progressive, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, parsed=(boolean)true, profile=(string)constrained-baseline, level=(string)5
/GstPipeline:pipeline0/nvv4l2decoder:nvv4l2decoder0.GstPad:src: caps = video/x-raw(memory:NVMM), format=(string)NV12, width=(int)2592, height=(int)1944, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, colorimetry=(string)bt709, framerate=(fraction)24/1
^Chandling interrupt.
Interrupt: Stopping pipeline ...
Execution ended after 0:01:30.854476081
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
ion@ion-desktop:~$ 

What can I do about it?
Thanks

Hi,
Did you run

... ! nvv4l2decoder  fpsdisplaysink text-overlay=0 video-sink=nv3dsink sync=0 -v

or

... ! nvv4l2decoder ! fpsdisplaysink text-overlay=0 video-sink=nv3dsink sync=0 -v

Not sure why, but your command is missing one ! before fpsdisplaysink.

Also, change encode-name to encoding-name.
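With those two fixes applied, the receiver command on your port should look like this:

gst-launch-1.0 udpsrc port=38298 ! "application/x-rtp, encoding-name=H264, payload=96" ! rtph264depay ! h264parse ! nvv4l2decoder ! fpsdisplaysink text-overlay=0 video-sink=nv3dsink sync=0 -v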

Thanks.
It can now display the video on the other Jetson Nano. The result is better than on my Android smartphone, but still not as good as on my laptop: frames are dropped and frame data gets corrupted more frequently.
Could the cause be that the receiving Jetson Nano is powered through the micro-USB port (2 A)?
I am going to test again once I get another power supply for that Jetson Nano via J25.
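As a rough check in the meantime (assuming the standard JetPack tools are installed), you can confirm the receiving Nano is not power-limited or throttling while the decode pipeline runs:

# Query the current power mode (MAXN vs. 5W)
sudo nvpmodel -q
# Lock clocks to maximum for the test
sudo jetson_clocks
# Watch CPU/GPU load and throttling while the receiver pipeline is running
sudo tegrastats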