Hi, thanks.
I’ve tried to use h264parse but unfortunately it is even worse - I get a lot of “lost frames detected: count = 1” messages.
Furthermore, when trying to read the RTSP stream with FFmpeg and VLC I get:
errors in P frame
MV errors in I frame
error while decoding
Hi sorry.
I thought I had solved it, but I only managed to read the stream correctly with ffmpeg; with VLC or GStreamer it still fails.
Unfortunately I don’t have a screen connected, it is a server. Is there anything else I can check?
Hi, I couldn’t manage to read the yuv files.
But when I used this pipe:
gst-launch-1.0 v4l2src device=/dev/video1 ! video/x-h264, width=1920, height=1080, framerate=15/1 ! filesink location=output1.mp4
Again I manage to see very good video on my computer using ffmpeg, but VLC and GStreamer still don’t work.
I wonder what it can be?
Looking at the link you provided in your first post, it seems the native mode for 1080p is 30 fps.
You may further check with v4l2-ctl (provided by apt package v4l-utils):
v4l2-ctl -d1 --list-formats-ext
I’d suggest to first try with 30 fps.
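As a sketch, a 30 fps capture test could look like this (the device path and caps are assumptions based on your earlier pipeline; fakesink just checks that frames flow):

```shell
# Try the camera's native 1080p30 H264 mode and discard the output,
# just to confirm the capture side works at this rate:
gst-launch-1.0 v4l2src device=/dev/video1 ! \
  video/x-h264,width=1920,height=1080,framerate=30/1 ! \
  h264parse ! fakesink
```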
You may also embed your H264 stream into a container, such as MP4 for a file, or RTP (rtph264pay) for streaming.
Saving to file :
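A minimal sketch, assuming the same device and caps as your pipeline above. Note the `-e` flag: it makes gst-launch send EOS on Ctrl-C so qtmux can finalize the MP4; without it the file has no moov atom and players report “no playable streams”:

```shell
# Mux the camera's H264 into an MP4 container instead of writing raw bytes:
gst-launch-1.0 -e v4l2src device=/dev/video1 ! \
  video/x-h264,width=1920,height=1080,framerate=30/1 ! \
  h264parse ! qtmux ! filesink location=output1.mp4
```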
Hi! thanks for the answer!
I’ve tried the recording, and when I try to play it I get this error:
ERROR: from element /GstPipeline:pipeline0/GstQTDemux:qtdemux0: This file contains no playable streams.
Additional debug info:
qtdemux.c(701): gst_qtdemux_post_no_playable_stream_error (): /GstPipeline:pipeline0/GstQTDemux:qtdemux0:
no known streams found
When I try to stream as suggested I get a stream only with ffplay - but with a lot more smudging and green screens. On the streaming side I get:
lost frames detected: count = 1
When I remove the h264parse I get the same output as before - ffplay works well, VLC works but is very smudgy with a grey screen, and publishing from GStreamer to KVS doesn’t work either.
I use the same script and the same camera on an UpBoard and everything works well. I was wondering whether the cameras need some kind of init params to be set before use, but I can’t tell the difference.
Hi,
Please set up a RTSP server with videotestsrc on Jetson Nano and check if this works. You may refer to Jetson Nano FAQ Q: Is there any example of running RTSP streaming?
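As a sketch, using the `test-launch` example from gst-rtsp-server (the exact element names are assumptions; on Jetson R32.x the thread later uses omxh264enc):

```shell
# Serve a synthetic test pattern over RTSP to rule the camera out:
./test-launch "videotestsrc is-live=1 ! omxh264enc ! h264parse ! \
  rtph264pay name=pay0 pt=96"

# Then, from a client:
# ffplay rtsp://127.0.0.1:8554/test
```

If this stream plays fine in VLC and GStreamer, the problem is on the camera/encoding side rather than in the RTSP setup.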
Sorry, I missed the EOS flag that may be required for this case. Edited my previous post for correcting that.
Another possible cause might be that test-launch mainly supports baseline profile for H264. Does it work if you decode and re-encode with baseline profile?
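A hypothetical sketch of such a decode/re-encode pipeline, based on the caps from your earlier posts (device path, caps, and the omx element names are assumptions for Jetson R32.x):

```shell
# Decode the camera's H264 and re-encode it at baseline profile
# before handing it to the RTSP payloader:
./test-launch "v4l2src device=/dev/video1 ! \
  video/x-h264,width=1920,height=1080,framerate=30/1 ! h264parse ! \
  omxh264dec ! omxh264enc profile=baseline ! h264parse ! \
  rtph264pay name=pay0 pt=96"
```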
Thanks a lot!
The re-encoding actually works!
Even if I change the profile to “main” it works.
What does it mean - that the encoding on the camera isn’t working properly?
I was not correct. This was omxh264enc related.
I think that for each profile, not every level may be supported, and maybe your camera uses an unexpected level… can you configure the camera’s H264 encoding?
I have no such camera and no Nano, but I fail to reproduce your case on an NX running R32.5.1, simulating the H264 camera with v4l2loopback. In my case I can stream over RTSP without decoding/encoding, omxh264enc can also achieve a high framerate, and in each case reading the stream (through localhost), pausing, stopping and restarting with VLC all work (I only see increasing latency, which might be VLC related). So I just wonder if you have the correct codec data for your cam.
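One way to check what the camera actually emits is to dump a short raw capture and inspect its profile/level with ffprobe (file name and device path are assumptions):

```shell
# Dump a few seconds of raw H264 from the camera:
gst-launch-1.0 -e v4l2src device=/dev/video1 num-buffers=300 ! \
  video/x-h264,width=1920,height=1080,framerate=30/1 ! h264parse ! \
  filesink location=probe.h264

# Inspect the profile and level the camera encoded with:
ffprobe -v error -select_streams v:0 \
  -show_entries stream=codec_name,profile,level probe.h264
```

Comparing this against what the working UpBoard setup reports might show whether the camera is announcing a profile/level that the Jetson pipeline doesn’t expect.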
You may contact the camera vendor for further support on this.