TX1 ISP Queries

Hi All,

I have a few queries on TX1 ISP usage.

  1. We will be using one YUV sensor and three Bayer sensors, so we thought of using the V4L2 framework for the YUV sensor and the ISP framework for the Bayer sensors. However, it seems the selection of the V4L2 or ISP framework is a compile-time configuration. Please let us know how to select the configuration for our use case.

  2. We came to know that NVIDIA currently has software support for only one camera interface. Does this limitation apply only to the ISP framework, or to the V4L2 framework as well?

  3. Can NVIDIA expose ISP input/output APIs?

Thanks
Palani

Hi,

Can someone reply to Q1 and Q2? This is very important from a design perspective; without these answers, we cannot proceed.

Thanks
Palani

  1. In the coming BSP release, we will support both the NVIDIA ISP framework and the standard V4L2 framework in parallel, so there will be no need for compile-time configuration.
  2. We will update you on multi-camera support later.
  3. We will provide low-level APIs based on the NVIDIA camera framework to control the camera in the new release.

Hi,

Thanks for Your response.

<> In the coming BSP release, we will support both the NVIDIA ISP framework and the standard V4L2 framework in parallel, so there will be no need for compile-time configuration.

Do you mean release 24.2? When can we expect it to be released, approximately?

<> We will provide low-level APIs based on the NVIDIA camera framework to control the camera in the new release.

Could you please explain exactly what kind of control this provides?

Thanks
Palani

It should be coming in one or two weeks, if everything goes smoothly. As for the API, please excuse us: we can't share more details before we publish it.

Okay, thanks for the update.

Hi Conan,

<> In the coming BSP release, we will support both the NVIDIA ISP framework and the standard V4L2 framework in parallel, so there will be no need for compile-time configuration.

Please confirm whether the above support is available in R24.2.

Thanks
Palani

It should be supported by the latest release R24.2.

Thanks for the update.

Could you also please let us know whether R24.2 supports connecting more than one Bayer sensor?

Thanks
Palani

Hi,

As you mentioned, R24.2 supports both the NVIDIA ISP framework and the standard V4L2 framework in parallel.

Could you kindly help with the two questions below:

  1. Which framework is used for the TX1 EVB board (with an OV5693 camera) on R24.2? (image updated via JetPack 2.3)
  2. Can I enable ISP support for the OV5693 on TX1 with R24.2? Is there any document or sample?

Thanks

Hi
On TX1 with R24.2, can the OV5693 be tested with GStreamer's v4l2src?
The TX1 failed to preview with the following command when I tested it on a TX1 EVB board. Did I miss anything?

gst-launch-1.0 v4l2src device="/dev/video0" ! 'video/x-raw, width=640, height=480, format=(string)I420' ! xvimagesink -e

(PS: it does not work with format=NV12 either)

Below is the error message.

ubuntu@tegra-ubuntu:~$ gst-launch-1.0 v4l2src device="/dev/video0" ! 'video/x-raw, width=640, height=480, format=(string)I420' ! xvimagesink -e
Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
Setting pipeline to PLAYING …
New clock: GstSystemClock
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2948): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming task paused, reason not-negotiated (-4)
EOS on shutdown enabled -- waiting for EOS after Error
Waiting for EOS…
^Chandling interrupt.
Interrupt: Stopping pipeline …
Interrupt while waiting for EOS - stopping pipeline…
Execution ended after 0:00:10.235591933
Setting pipeline to PAUSED …
Setting pipeline to READY …
Setting pipeline to NULL …
Freeing pipeline …

It is supported, as long as you have enough bandwidth for them.

v4l2src doesn’t have integrated ISP support to convert raw data to I420 stream, so you can use nvcamerasrc instead.
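For illustration, the failing pipeline could be reworked around nvcamerasrc along these lines (a sketch only; nvcamerasrc outputs ISP-processed I420 in NVMM memory, so an NVMM-aware sink such as nvoverlaysink is assumed in place of xvimagesink):

```shell
# Sketch: nvcamerasrc routes the Bayer sensor through the Tegra ISP,
# so the I420 caps can negotiate. NVMM buffers need an NVMM-capable sink.
gst-launch-1.0 nvcamerasrc ! \
    'video/x-raw(memory:NVMM), width=(int)640, height=(int)480, format=(string)I420' ! \
    nvoverlaysink -e
```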

Hi Conan,

Thank you for your help.
Yes, nvcamerasrc works.

Thanks

Hi

How can I use ISP properties (such as white balance) on the TX1?
Using the gst-launch-1.0 nvcamerasrc command with wbmode= did not work when I tested it on a TX1 with R24.2.

Below is the command I tested.
gst-launch-1.0 nvcamerasrc fpsRange="30.0 30.0" ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)I420, framerate=(fraction)30/1' ! nvtee ! nvvidconv flip-method=2 ! 'video/x-raw(memory:NVMM), format=(string)I420' ! wbmode=1 ! nvoverlaysink -e

from gst-inspect-1.0 nvcamerasrc:
wbmode : White balance affects the color temperature of the photo
flags: readable, writable
Enum “GstNvCamWBMode” Default: 1, “auto”
(0): off - GST_NVCAM_WB_MODE_OFF
(1): auto - GST_NVCAM_WB_MODE_AUTO
(2): incandescent - GST_NVCAM_WB_MODE_INCANDESCENT
(3): fluorescent - GST_NVCAM_WB_MODE_FLUORESCENT
(4): warm-fluorescent - GST_NVCAM_WB_MODE_WARM_FLUORESCENT
(5): daylight - GST_NVCAM_WB_MODE_DAYLIGHT
(6): cloudy-daylight - GST_NVCAM_WB_MODE_CLOUDY_DAYLIGHT
(7): twilight - GST_NVCAM_WB_MODE_TWILIGHT
(8): shade - GST_NVCAM_WB_MODE_SHADE
(9): manual - GST_NVCAM_WB_MODE_MANUAL

Thanks

wbmode is a property of nvcamerasrc, so you can set it on the element directly, like this:

gst-launch-1.0 nvcamerasrc fpsRange="30.0 30.0" wbmode=1 ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)I420, framerate=(fraction)30/1' ! nvtee ! nvvidconv flip-method=2 ! 'video/x-raw(memory:NVMM), format=(string)I420' ! nvoverlaysink -e

Hi Conan,

Both nvcamerasrc and v4l2src are supported in R24.2 with the OV5693.
nvcamerasrc can process a Bayer sensor with the Tegra ISP, and v4l2src bypasses the Tegra ISP,
but I want to get the Bayer sensor's raw data from v4l2src with GStreamer so that I can test my own ISP.
I have tried this command but get nothing:
gst-launch-1.0 -v v4l2src device=/dev/video0 num-buffers=1 ! 'video/x-bayer,format=rggb,width=1280,height=720' ! identity silent=false ! filesink location=test.bayer
Is there something wrong? I can display with nvcamerasrc, though.
I guess that nvcamerasrc and v4l2src cannot work at the same time, because nvcamerasrc uses NVMM for mapping, which is not the same as v4l2src, so v4l2src gets nothing.

It should have nothing to do with NVMM. You can try one of the other io-modes.
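For reference, v4l2src exposes an io-mode property (0=auto, 1=rw, 2=mmap, 3=userptr, 4=dmabuf). A sketch of retrying the raw Bayer capture while forcing memory-mapped I/O, assuming the same device and caps as in the command above:

```shell
# Sketch: retry the raw Bayer capture with memory-mapped I/O forced.
# io-mode values for v4l2src: 0=auto, 1=rw, 2=mmap, 3=userptr, 4=dmabuf.
gst-launch-1.0 -v v4l2src device=/dev/video0 io-mode=2 num-buffers=1 ! \
    'video/x-bayer,format=rggb,width=1280,height=720' ! \
    filesink location=test.bayer
```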

io-mode? What should I do? I don't know anything about io-mode.

Hi,

I am using a Leopard Imaging IMX185 camera connected to a Jetson TX1 development kit via a Leopard Imaging adapter. My JetPack version is JetPack-L4T-2.3.1 and my L4T release is 24.2.1.

I am able to capture from the camera using nvgstcapture-1.0 and also using the nvcamerasrc element in a gstreamer pipeline.

However, our requirement is to capture RAW data from the sensor (we will eventually be moving to a monochrome sensor). When I try to capture using yavta, I get all-black frames (the content is all 0x00). I changed yavta to fill a 0xab pattern into the frames before capture, and I see that the frames have the same content after capture, i.e. no data was written to the frames.

Could you please tell me:

  1. What could be causing the V4L2 direct-capture failure?
  2. Is it possible to use the nvcamerasrc element to capture raw Bayer data (i.e. with the ISP doing nothing)?


Here is more information about my setup:

Jetson TX1 Development Kit
Jetpack 2.3.1, Linux kernel tegra-24.2.1

IMX 185 Leopard Imaging camera connected to J21 connector of TX1 via Leopard imaging MIPI connector

  • nvcamerasrc works:

gst-launch-1.0 nvcamerasrc fpsRange="30 30" ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)I420, framerate=(fraction)30/1' ! filesink location=file.raw

OR

nvgstcapture

--> This works
--> I put logs inside drivers/media/platform/tegra/camera/channel.c; this goes via the bypass path, so VI bypass works

  • v4l2 capture does not work

v4l2-ctl --set-fmt-video=width=1920,height=1080,pixelformat=RG12 --stream-mmap --stream-count=1 -d /dev/video0 --stream-to=test.raw

OR

./yavta /dev/video0 -c3 -s1920x1080 -fSRGGB10 -I -Fov.raw

--> This does not work. It goes via the tegra-video V4L2 path. I put logs in the kernel driver and found that queue frame -> soc_channel_capture_frame -> updating V4L2 timestamps all happen at the expected frame rate, but the received frames contain no data.
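When a V4L2 capture returns empty frames, two quick checks (a debugging sketch, assuming the standard v4l2-ctl from v4l-utils and root access for dmesg) are to confirm which formats the driver actually advertises and to watch the kernel log for CSI/VI errors while streaming:

```shell
# List the pixel formats and frame sizes the tegra-video node actually reports,
# to confirm RG12/SRGGB10 is really offered at 1920x1080.
v4l2-ctl -d /dev/video0 --list-formats-ext

# Clear the kernel log (root), stream a few frames, then look for CSI/VI errors.
dmesg -C
v4l2-ctl -d /dev/video0 --set-fmt-video=width=1920,height=1080,pixelformat=RG12 \
    --stream-mmap --stream-count=3 --stream-to=test.raw
dmesg | tail -n 20
```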