When I run the pipeline, I get an error about unsupported colorimetry.
Setting pipeline to PAUSED ...
Opening in BLOCKING MODE
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Redistribute latency...
NvMMLiteOpen : Block : BlockType = 4
===== NVMEDIA: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 4
ERROR: from element /GstPipeline:pipeline0/nvv4l2h264enc:nvv4l2h264enc0: Device '/dev/nvhost-msenc' does not support 1:4:0:0 colorimetry
Additional debug info:
/dvs/git/dirty/git-master_linux/3rdparty/gst/gst-v4l2/gst-v4l2/gstv4l2object.c(4108): gst_v4l2_object_set_format_full (): /GstPipeline:pipeline0/nvv4l2h264enc:nvv4l2h264enc0:
Device wants 1:4:5:1 colorimetry
EOS on shutdown enabled -- waiting for EOS after Error
Waiting for EOS...
Redistribute latency...
If I specify the colorimetry, I get a negotiation error:
gst-launch-1.0 v4l2src device=/dev/video2 io-mode=2 ! "image/jpeg, width=1920, height=1080, framerate=60/1" ! nvjpegdec ! "video/x-raw(memory:NVMM), format=I420, colorimetry=1:4:5:1" ! nvv4l2h264enc ! h264parse ! mpegtsmux ! filesink location=a.ts sync=false -e
Setting pipeline to PAUSED ...
Opening in BLOCKING MODE
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data stream error.
Additional debug info:
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming stopped, reason not-negotiated (-4)
EOS on shutdown enabled -- waiting for EOS after Error
Waiting for EOS...
ERROR: from element /GstPipeline:pipeline0/GstNvJpegDec:nvjpegdec0: No valid frames decoded before end of stream
Additional debug info:
gstvideodecoder.c(1161): gst_video_decoder_sink_event_default (): /GstPipeline:pipeline0/GstNvJpegDec:nvjpegdec0:
no valid frames found
ERROR: from element /GstPipeline:pipeline0/MpegTsMux:mpegtsmux0: Could not create handler for stream
Additional debug info:
mpegtsmux.c(996): mpegtsmux_create_streams (): /GstPipeline:pipeline0/MpegTsMux:mpegtsmux0
^Chandling interrupt.
Interrupt: Stopping pipeline ...
Interrupt while waiting for EOS - stopping pipeline...
Execution ended after 0:00:15.311068402
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
It seems to me that I need a mechanism to change the colorimetry, but I am not sure where to look to find out how to do this. Any help would be appreciated.
The command that you provided works - even without running jetson_clocks.
I am running jetson_stats [1] at the same time and notice that the hardware decoder is not running and CPU utilisation is "higher". I was under the impression that nvjpegdec would utilise the hardware decoder resulting in very low CPU utilisation. Am I mistaken? Is there a better pipeline that I could be using to take full advantage of the hardware capabilities of the Jetson Nano?
Thank you for all your help so far. I greatly appreciate it.
Hi,
The optimal pipeline is to pass video/x-raw(memory:NVMM) buffers from head to tail in the pipeline. However, nvjpegdec is not designed for continuous decoding, so it is run as "nvjpegdec ! video/x-raw". An optimization is to run "nvjpegdec ! video/x-raw(memory:NVMM)"; you may apply the patch and rebuild/replace libgstnvjpeg.so.
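For illustration, without the patch the decoded frames come out in system memory, so an extra copy through nvvidconv is needed before the hardware encoder. This is only a sketch based on the caps in your command (device, resolution and framerate are just the placeholders from above):

gst-launch-1.0 v4l2src device=/dev/video2 io-mode=2 ! "image/jpeg, width=1920, height=1080, framerate=60/1" ! nvjpegdec ! "video/x-raw" ! nvvidconv ! "video/x-raw(memory:NVMM), format=I420" ! nvv4l2h264enc ! h264parse ! mpegtsmux ! filesink location=a.ts sync=false -e

With the patched libgstnvjpeg.so, the nvvidconv copy could in principle be dropped and nvjpegdec linked with "video/x-raw(memory:NVMM)" caps directly.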
Is it fair to say that, without applying the patch, there is no way to get this optimal pipeline while also using hardware JPEG decoding with GStreamer on the Nano? Given that MJPEG is the norm for most USB cameras at meaningful resolutions, that makes it very hard to say that USB cameras are usable at all in practical terms.
As to the patch, I just went through the whole process with a fresh install of JetPack 4.3 on a Nano dev kit (A02). I used the L4T 32.3.1 public sources and the oft-mentioned patch (nvjpegdec slower then jpegdec in gstreamer - #25 by DaneLLL). The patch broke the build, so I had to diagnose and get the nvbuf_utils header and lib included. When I then deployed as instructed, the pipeline that had worked previously gives "Bus error (core dumped)".
I'm not sure why, if this can be patched, NVIDIA cannot just provide it as a working binary. I really didn't need to waste hours finding and building all of that, just so I could get basic usability out of a USB camera, and STILL not actually get it working. Short of providing a binary, can you at least make it easy to find the code and provide working build instructions?
Hi,
I am the author of the patch and proposed it in internal review. The teams had other opinions and the patch was rejected. Since gst-jpeg is open source, it should be fine for users to download the source code and follow the build instructions.
Besides, users can consider using tegra_multimedia_api. The 12_camera_v4l2_cuda sample demonstrates MJPEG decoding.
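As a rough sketch of how to try it (the path below assumes a default JetPack 4.3 install, and the flags are my reading of the sample's usage, so please check its README for the exact options):

cd /usr/src/tegra_multimedia_api/samples/12_camera_v4l2_cuda
make
./camera_v4l2_cuda -d /dev/video2 -s 1920x1080 -f MJPEG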
Thanks @DaneLLL, understood. Unfortunately, as I said above, I tried following the build instructions, found that I had to improvise some changes that were not documented, and when I ran it I got a core dump. Can you help me understand where I went wrong? Is there a revised build configuration to go along with the patch?
For the record, I'm working with gstjpegdec primarily because our product is based on DeepStream, and therefore GStreamer. So 12_camera_v4l2_cuda doesn't help us directly. I'm sorry they kept your patch out of the release - add my voice to wanting this. Otherwise, NVIDIA honestly should say that DeepStream is only for use with MIPI cameras and not USB cameras.
Is there any disadvantage to using "enable-max-performance=1" on the nvv4l2decoder, or any circumstance where you specifically would not use it?
Using "jtop" from the "jetson-stats" package, I don't see the hardware decoder marked as "RUNNING", although performance seems like it must be. Is that detected differently than for h.264/265/etc? Just curious...
Also, please pass along to the doc team that there should be a reference to MJPEG decoding with "nvv4l2decoder" in the "Accelerated GStreamer User Guide". Right now it only mentions other video formats there, and unless you run gst-inspect on the element you would have no idea that it handles MJPEG at all (and even with gst-inspect it's not clear how to construct a command for a real case). If I had found MJPEG or MJPG when I searched through the Accelerated GStreamer docs, it would have been very helpful.
Hi,
MJPEG decoding in nvv4l2decoder is specific to the DeepStream SDK. It is not fully compatible with the plugins listed in the Accelerated GStreamer User Guide. We are working on unifying the interfaces and will update the documents accordingly once it is fully implemented and verified.
For a single decoding process, you can keep it disabled for efficient power consumption. If you run simultaneous decoding processes, we suggest enabling it for maximum decoding performance.
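For illustration, the property is set directly on the decoder element. The rest of this pipeline is only a sketch reusing the caps from the earlier commands (device, resolution and framerate are placeholders):

gst-launch-1.0 v4l2src device=/dev/video2 io-mode=2 ! "image/jpeg, width=1920, height=1080, framerate=60/1" ! nvv4l2decoder mjpeg=1 enable-max-performance=1 ! nvvidconv ! "video/x-raw(memory:NVMM), format=I420" ! nvv4l2h264enc ! h264parse ! mpegtsmux ! filesink location=a.ts sync=false -e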
It is a separate hardware engine called NVJPG. sudo tegrastats shows the clock of NVDEC, and does not show NVJPG.
It looks ok to me, although if it is off (clk_enable_count is 0), then the frequency has no meaning, and in that case tegrastats doesn't show the frequency for NVDEC or NVENC.
If it is 1, the frequency read from clk_rate is in Hz, as you seem to have converted.
Note that you can do the same for NVENC and NVDEC and check that you get the same (well... say similar) results as tegrastats.
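For reference, this is the kind of check meant above; the debugfs clock names and paths are assumptions and can differ between L4T releases (they need root):

sudo cat /sys/kernel/debug/clk/nvjpg/clk_enable_count
sudo cat /sys/kernel/debug/clk/nvjpg/clk_rate
sudo cat /sys/kernel/debug/clk/nvdec/clk_rate
sudo cat /sys/kernel/debug/clk/nvenc/clk_rate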