"-bash: video-viewer: command not found"

hey everyone,

I’m trying to use RTP to stream video from a CSI camera on my Nano 2GB to my host PC, but I’m getting this error:

“-bash: video-viewer: command not found”

Do I need to install something on my host first? The tutorial worked fine on the Nano itself, but I want to make sure I can work headlessly if I want to. Thanks :)


Hi @nbarrow85, are you running that command on your PC or on the Jetson? It should be run on the Jetson, either inside the jetson-inference container or after you have built/installed the jetson-inference repo from source.

The command you run on your PC to view the RTP stream is different, see here for those:

Also, if you are streaming your camera out over RTP, I currently recommend doing this outside of the container (i.e. by building from source), because there is a video quality issue with RTP from inside the container.
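In case that page isn’t handy, the receiving pipeline on the PC looks roughly like this (a sketch based on the jetson-inference streaming docs - adjust the port to match whatever you put in the rtp:// output URI):

# on the host PC - depayload, decode, and display the incoming H.264/RTP stream
$ gst-launch-1.0 -v udpsrc port=1234 \
 caps = "application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! \
 rtph264depay ! decodebin ! videoconvert ! autovideosink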

Hey man,

Sorry, I’m very new still to some of the workflow…

So, should I have already started the stream on the Nano before trying to view the RTP feed from my host laptop?

No worries, that’s correct - on the Nano, first run video-viewer (with your camera and the RTP output arguments). Then on your PC, run GStreamer or VLC to view the stream - I recommend GStreamer, because it works more reliably with RTP and has lower latency.
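For example, something along these lines (the IP address is a placeholder for your PC’s address, and csi://0 assumes the camera is on the first CSI port):

# on the Nano - capture from the CSI camera and send H.264/RTP to your PC
$ video-viewer csi://0 rtp://<your-pc-ip>:1234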

okay. Totally awesome. I’ll give this a whirl. Thanks!

Hey Dusty,

So I launched the container on the Nano and entered:

‘video-viewer csi://0 rtp://<MY HOST PC’S IP>:1234’

from within the container. The terminal began “capturing” frames and a video window popped up. I then unplugged the HDMI cord from my Nano and plugged it into my PC. I opened a terminal window and SSH’d into my Nano (no problem there). I then entered:

" video-viewer csi://0 rtp://:1234"

and I got this error:

“-bash: video-viewer: command not found”

Should I have used the Nano’s IP address? If the stream is running on the Nano and all I did was swap out the cable, I’m confused why I wouldn’t be able to see the same frames being captured now over SSH on my host machine. Obviously I’m still doing something wrong?

I also tried entering the GStreamer launch command from the documentation in a different terminal window that I hadn’t SSH’d into. I got this output:

"gst-launch-1.0 -v udpsrc port=1234 \
 caps = "application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! \
 rtph264depay ! decodebin ! videoconvert ! autovideosink
Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
/GstPipeline:pipeline0/GstUDPSrc:udpsrc0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96
/GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96
Setting pipeline to PLAYING …
New clock: GstSystemClock
/GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0.GstPad:src: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, codec_data=(buffer)01424028ffe1000a6742402895a014016e4001000468ce3c80, level=(string)4, profile=(string)constrained-baseline
/GstPipeline:pipeline0/GstDecodeBin:decodebin0.GstGhostPad:sink.GstProxyPad:proxypad0: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, codec_data=(buffer)01424028ffe1000a6742402895a014016e4001000468ce3c80, level=(string)4, profile=(string)constrained-baseline
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstTypeFindElement:typefind.GstPad:src: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, codec_data=(buffer)01424028ffe1000a6742402895a014016e4001000468ce3c80, level=(string)4, profile=(string)constrained-baseline
Missing element: H.264 (Constrained Baseline Profile) decoder
ERROR: from element /GstPipeline:pipeline0/GstDecodeBin:decodebin0: Your GStreamer installation is missing a plug-in.
Additional debug info:
gstdecodebin2.c(4679): gst_decode_bin_expose (): /GstPipeline:pipeline0/GstDecodeBin:decodebin0:
no suitable plugins found:
Missing decoder: H.264 (Constrained Baseline Profile) (video/x-h264, stream-format=(string)avc, alignment=(string)au, codec_data=(buffer)01424028ffe1000a6742402895a014016e4001000468ce3c80, level=(string)4, profile=(string)constrained-baseline)

Execution ended after 0:00:01.074374442
Setting pipeline to PAUSED …
Setting pipeline to READY …
Setting pipeline to NULL …
Freeing pipeline …
"
Are the “missing decoder” lines relevant? Are these two separate problems or are they linked? Thanks again for your patience…

You had already run the video-viewer while your HDMI cable was still connected, so you needn’t run it again via SSH. But typically I would run it via SSH (and not via the Nano’s desktop/HDMI) so you can still keep an eye on it from your PC. It sounds like you may not have actually been SSH’d in to your Nano if it couldn’t find video-viewer, since it did in fact find it when you ran it from the Nano’s desktop.
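If it helps, the workflow I have in mind is roughly this (the username, IP addresses, and paths are placeholders - substitute your own):

# from your PC, SSH into the Nano
$ ssh <user>@<nano-ip>

# start the jetson-inference container (assuming the repo is cloned to ~/jetson-inference)
$ cd ~/jetson-inference
$ docker/run.sh

# then, inside the container, start the stream pointed at your PC
$ video-viewer csi://0 rtp://<your-pc-ip>:1234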

It looks like you need to install the libav (ffmpeg) plugin for GStreamer on your PC - can you try this:

$ sudo apt-get install gstreamer1.0-libav

hey Dusty,

Thanks for getting back to me. I’ll try launching video-viewer from my host PC through SSH. Quick question about that, though: after I’ve SSH’d into my Nano, do I then need to run video-viewer from the Docker container, or can I just plug that command into the terminal?

Normally you would run it from the Docker container, yes. However, there is currently an issue with the RTP encoding quality inside the container, so I recommend that you eventually build the jetson-inference project from source outside the container. To get it working initially, though, you can just use the container.
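When you do get around to building from source, the steps are roughly these (from the repo’s build instructions - double-check them against the current docs for your JetPack version):

# install build dependencies on the Nano
$ sudo apt-get update
$ sudo apt-get install git cmake libpython3-dev python3-numpy

# clone and build jetson-inference outside of the container
$ git clone --recursive https://github.com/dusty-nv/jetson-inference
$ cd jetson-inference
$ mkdir build && cd build
$ cmake ../
$ make -j$(nproc)
$ sudo make install
$ sudo ldconfig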


okay. Awesome. I’m going to try this all now…

Beautiful…

Installed that GStreamer plugin you pointed me to, SSH’d into my Nano, ran the Docker container, and did everything by the book. Up and running!

Feels great. Thanks again!