We have tried to disable auto exposure and manually specify an exposure time using nvcamerasrc, but without any apparent effect. These properties are documented and should really work, so I am wondering why we cannot make them work.
Is this mode of operation sensor specific? Can someone share a minimal working pipeline using nvcamerasrc?
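For reference, this is the kind of pipeline we have been trying. The `auto-exposure` and `exposure-time` property names are the documented nvcamerasrc properties; the specific values and the assumption that `auto-exposure=1` means "off" are taken from the docs and may need adjusting for your sensor, so treat this as a sketch:

```shell
# Attempt to lock exposure manually with nvcamerasrc (no apparent effect for us).
# auto-exposure=1 is assumed to mean "off"; exposure-time is in seconds per the docs.
gst-launch-1.0 nvcamerasrc auto-exposure=1 exposure-time=0.005 \
    ! 'video/x-raw(memory:NVMM), width=1920, height=1080, framerate=30/1' \
    ! nvvidconv ! nvoverlaysink
```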
I also ran into this problem with nvcamerasrc and experimented a bit with Argus. In those experiments I found that calls of the form “iNativeBuffer->copyToNvBuffer(buffer_fd);” are rather slow on our TX1 device, but this seems to be the only way to get workable pixel buffers.
Also: we would like to work with gstreamer and perhaps the NVIDIA DeepStream SDK so it would be convenient to have a working gstreamer source.
Currently we can launch a GStreamer pipeline from nvcamerasrc and override whatever settings it makes for exposure-time and gain using direct v4l2 commands, but of course this is rather hacky and we would prefer a clean solution.
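Concretely, the workaround looks like the following. `v4l2-ctl` is the standard v4l-utils tool, but the control names (`exposure`, `gain`) and their units are whatever the particular sensor driver exposes and will differ per sensor, so this is only a sketch of the idea:

```shell
# First check which controls the sensor driver actually exposes:
v4l2-ctl -d /dev/video0 --list-ctrls

# Then, while the nvcamerasrc pipeline is running, override its settings.
# Control names and units are sensor-specific (these are examples):
v4l2-ctl -d /dev/video0 --set-ctrl=exposure=2000
v4l2-ctl -d /dev/video0 --set-ctrl=gain=16
```

This works because the v4l2 controls talk to the sensor driver underneath nvcamerasrc, but nothing stops nvcamerasrc's own logic from writing the registers back, which is why it feels fragile.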
Do you think that you can either implement auto-exposure, exposure-time, gain, etc. in nvcamerasrc, or perhaps release the source code of this component so that we can implement it ourselves?
So this application just secretly uses nvcamerasrc and will also not have working exposure-time control… It seems like a waste of time to implement our own v4l2-based GStreamer source and try to get it working nicely with NVMM zero-copy memory, when NVIDIA already has nvcamerasrc, which works well but is simply not finished.
OK, it works in Argus, and in some NVIDIA repositories I found mention of something called nvargussrc, but this one also does not seem to support exposure control and appears to be a closed-source black box.
Again, in the context of the DeepStream SDK that NVIDIA is promoting: control over gain and exposure time is important in many real-time video processing applications, so it would be nice to have better support for this in the provided SDKs.
I had already successfully used the Argus functions:
virtual Status Argus::ISourceSettings::setExposureTimeRange(const Range<uint64_t>& exposureTimeRange)
virtual Status Argus::ISourceSettings::setGainRange(const Range<float>& gainRange)
So these work, but they do not solve the problem for the topic starter or myself, because we need DeepStream/GStreamer pipelines. I guess I can try to figure out how to turn this Argus application into a GStreamer source, but you at NVIDIA already have this and have probably implemented a lot of nice optimizations using zero-copy memory, buffer re-use, etc.
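For completeness, this is roughly how I use those two calls. The surrounding boilerplate (request creation, interface casts) follows the pattern in the Argus samples, exposure times are in nanoseconds, and all error handling is omitted, so this is a sketch rather than a complete program:

```cpp
#include <Argus/Argus.h>

using namespace Argus;

// Lock exposure and gain by setting degenerate (min == max) ranges on the
// capture request, which prevents the auto-exposure loop from sweeping them.
// Error checks omitted for brevity; iSession comes from the usual
// CameraProvider/createCaptureSession setup shown in the Argus samples.
void lockExposure(ICaptureSession *iSession)
{
    UniqueObj<Request> request(iSession->createRequest());
    IRequest *iRequest = interface_cast<IRequest>(request);

    ISourceSettings *iSourceSettings =
        interface_cast<ISourceSettings>(iRequest->getSourceSettings());

    // 5 ms exposure (Argus exposure times are in nanoseconds), fixed gain of 4x.
    iSourceSettings->setExposureTimeRange(Range<uint64_t>(5000000, 5000000));
    iSourceSettings->setGainRange(Range<float>(4.0f, 4.0f));

    // Start repeating captures with these locked settings.
    iSession->repeat(request.get());
}
```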
Also: I asked this same question of some people from the DeepStream team at GTC Europe this year, and they seemed to have never even heard of Argus.