TX2 video input that is not an i2c device.

We have a video input board where the ‘cameras’, i.e. the video input streams,
are not controlled directly by a single i2c controller (there are several
spi and i2c devices and intermediate circuitry). The video eventually
ends up at the CSI input ports on the TX2 module. I am looking at the
“Sensor Driver Programming Guide” and the example ov5693.c driver, and it
all seems very tightly tied to i2c devices. Many of the calls (e.g.
v4l2_i2c_subdev_init() and camera_common_parse_ports()) expect a
struct i2c_client, but since our board is not a single i2c device this
doesn’t really exist. Even struct camera_common_data (camera_common.h) expects
a struct i2c_client.

I’m not sure whether I need to search for other registration calls, or provide
a fake i2c client structure. If I have to create a dummy i2c client structure,
it is unclear how many fields need to be filled in.

Ideas?

Thanks.

Hi cobrien,

Is your video input source a CSI device? If so, I2C (called CCI in the CSI spec) is normally combined with CSI, so the CSI device should have it. If not, how does the video end up at the CSI ports?

@cobrien
You can use the reference driver ov5693 as a template, but replace the i2c access with SPI as shown below.
Of course, you still need to add the initial spidev setup in your sensor driver.

	struct regmap			*regmap;
	struct regmap			*spi_regmap;

	err = regmap_write(priv->spi_regmap, addr32, val);
static int imx172_spidev_init(struct imx172 *priv)
{
	struct spi_master *spi_master;
	struct spi_device *spi_device;
	struct device *pdev;
	char buff[64];
	int status = 0;
	int busnum = 1;
	int csnum = 1;
	char drv_name[] = "IMX172";
	static struct regmap_config spi_regmap_config = {
		.reg_bits = 24,
		.val_bits = 8,
	};

	priv->spi_regmap = NULL;

	spi_master = spi_busnum_to_master(busnum);
	if (!spi_master) {
		printk(KERN_ALERT "spi_busnum_to_master(%d) returned NULL\n",
		       busnum);
		printk(KERN_ALERT "Missing modprobe omap2_mcspi?\n");
		return -ENODEV;
	}

	spi_device = spi_alloc_device(spi_master);
	if (!spi_device) {
		put_device(&spi_master->dev);
		printk(KERN_ALERT "spi_alloc_device() failed\n");
		return -ENOMEM;
	}

	/* specify a chip select line */
	spi_device->chip_select = csnum;

	/* Check whether this SPI bus.cs is already claimed */
	snprintf(buff, sizeof(buff), "%s.%u",
		 dev_name(&spi_device->master->dev),
		 spi_device->chip_select);

	pdev = bus_find_device_by_name(spi_device->dev.bus, NULL, buff);
	if (pdev) {
		/* We are not going to use the spi_device we allocated */
		spi_dev_put(spi_device);

		/*
		 * There is already a device configured for this bus.cs
		 * combination. It's okay if it's us; that happens if we
		 * previously loaded then unloaded our driver. If it is
		 * not us, we complain and fail.
		 */
		if (pdev->driver && pdev->driver->name &&
		    strcmp(drv_name, pdev->driver->name)) {
			printk(KERN_ALERT
			       "Driver [%s] already registered for %s\n",
			       pdev->driver->name, buff);
			put_device(pdev);
			put_device(&spi_master->dev);
			return -EBUSY;
		}
		/* Reuse the already-registered device for the regmap below */
		spi_device = to_spi_device(pdev);
	} else {
		static struct tegra_spi_device_controller_data cdata;

		cdata.is_hw_based_cs = 1;
		cdata.variable_length_transfer = 0;
		cdata.cs_setup_clk_count = 0;
		cdata.cs_hold_clk_count = 0;
		cdata.rx_clk_tap_delay = 0;
		cdata.tx_clk_tap_delay = 0;
		cdata.cs_inactive_cycles = 0;
		cdata.clk_delay_between_packets = 0;
		cdata.cs_gpio = -1;

		spi_device->max_speed_hz = 1000000;
		spi_device->mode = (SPI_MODE_0 | SPI_LSB_FIRST |
				    SPI_CPOL | SPI_CPHA);
		spi_device->bits_per_word = 8;
		spi_device->irq = -1;
		spi_device->controller_state = NULL;
		spi_device->controller_data = &cdata;
		strlcpy(spi_device->modalias, drv_name, SPI_NAME_SIZE);

		status = spi_add_device(spi_device);
		if (status < 0) {
			spi_dev_put(spi_device);
			printk(KERN_ALERT "spi_add_device() failed: %d\n",
			       status);
			put_device(&spi_master->dev);
			return status;
		}
		printk(KERN_ALERT "spi_add_device(%s)=%s pass\n",
		       buff, drv_name);
	}

	put_device(&spi_master->dev);

	priv->spi_regmap = devm_regmap_init_spi(spi_device,
						&spi_regmap_config);
	if (IS_ERR(priv->spi_regmap)) {
		dev_err(&spi_device->dev, "regmap init failed: %ld\n",
			PTR_ERR(priv->spi_regmap));
		return -ENODEV;
	}

	return status;
}

Again, the problem is that the frame capture board is not controlled by a single SPI device either.
The problem I am facing is that I see only v4l2_i2c_subdev_init().

Right now I create a non-existent i2c device on an existing bus and use that, which gets
me through the _probe function, so I can call v4l2_i2c_subdev_init() and v4l2_async_register_subdev(),
but I’m not seeing the video device yet. I need to look at the device tree entries next.
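For anyone trying the same thing, the non-existent device can be created roughly like this. This is only a sketch: the bus number and address are placeholders, and you must pick an adapter and a 7-bit address that nothing on your board actually responds to.

```c
#include <linux/i2c.h>

/*
 * Sketch: bind a dummy i2c client to an unused address on an existing
 * adapter. Bus number 0 and address 0x10 are placeholders.
 */
static struct i2c_client *make_dummy_client(void)
{
	struct i2c_adapter *adap;
	struct i2c_client *client;

	adap = i2c_get_adapter(0);		/* existing i2c bus */
	if (!adap)
		return NULL;

	client = i2c_new_dummy(adap, 0x10);	/* unused 7-bit address */
	if (!client) {
		i2c_put_adapter(adap);
		return NULL;
	}

	/*
	 * client can now be handed to v4l2_i2c_subdev_init(); on teardown,
	 * call i2c_unregister_device(client) and i2c_put_adapter(adap).
	 */
	return client;
}
```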

I’d really like to use the camera framework because it seems to handle the csi frame grabbing and
v4l2 integration automatically provided it’s configured properly in the device tree.

We faced the same problem. It seems the i2c device is only needed for the debugging output inside the NVIDIA driver; the debug functions used within it need a device. We created a dummy i2c structure containing the necessary fields.

Hi, Trumany.

I’m facing this problem too. I have a pre-configured video stream connected to a TX2 CSI port without i2c. How can we grab the frame data?

The media framework seems to expect an i2c device, so I created one that references
an unused address on one of the i2c ports. The actual video capture and conversion
to CSI is done by several chips and an FPGA controlled by several i2c and spi bus
connections. There are actually four streams captured. This is controlled outside
of the i2c device that is actually connected to the CSI input and VI processing
unit in the device tree.

In any case, I have /dev/video0 to /dev/video3, and media-ctl -p shows the proper
linkage with the CSI ports. Unfortunately, trying to capture data with v4l2-ctl

v4l2-ctl -d /dev/video0 --stream-mmap --stream-count 10 --stream-to frames.raw

does configure the CSI inputs, but no data comes in. Usually I get

[ 2047.241801] tegra-vi4 15700000.vi: PXL_SOF syncpt timeout! err = -11

About 1/3 of the time I also get

[ 2046.235864] cti_scout_sd_in 2-0001: sim_sd_in_s_stream++ enable:1
[ 2046.245225] tegra-vi4 15700000.vi: Status: 4 channel:00 frame:0000
[ 2046.251493] tegra-vi4 15700000.vi: timestamp sof 2055892785024 eof 2055892793984 data 0x00000200
[ 2046.261053] tegra-vi4 15700000.vi: capture_id 28 stream 0 vchan 0
VIDIOC_DQBUF: failed: Input/output error

We’re looking carefully at the CSI output stream. What has to be configured differently
in the TX2 CSI (or VI) subsystem, if anything, is unclear.

Any ideas would be appreciated.

Thanks,

Cary

@cobrien

As I understand it, in order to get /dev/video*, you made four virtual i2c devices to register the video sources as /dev/video*?

Exactly. Then I had to create the device tree entries per the camera developers guide in order
to link these into the camera/media framework.

It is still not working, however; I almost always get

tegra-vi4 15700000.vi: PXL_SOF syncpt timeout! err = -11

Hope this topic can help you:

https://devtalk.nvidia.com/default/topic/1007058/?comment=5206727

I think I’m in the same situation, with a generated CSI video stream input, but I still cannot get /dev/video* for the video source.

Could you share more details about how to get the /dev/video* nodes, e.g. how to add video configurations to the existing drivers and
device tree entries to make /dev/video* available?

Thanks a lot!

As for the ‘tegra-vi4 15700000.vi: PXL_SOF syncpt timeout! err = -11’: that could be the MIPI signal, or the output size not being as expected.

Hello, did you solve this?

I think I’m facing the same problem as you now.

We are still struggling with this. The CSI converter we use can generate a test pattern,
which we can capture, but actual video frames fail to be captured by the CSI
subsystem. Our understanding is that the CSI input section needs to have the exact frame
size (width and height) or it will fail to capture. The relationship between the active
video and any extra blanking lines/columns is what we are investigating now.

Another problem was adding the correct frame size entries so that v4l2-ctl --all showed
the proper resolution and video format. Without this, several things failed: the v4l2
device registration failed (so /dev/video* wasn’t created), or the resolution wasn’t
shown in v4l2-ctl --all. We had to change sensor_common.c:extract_pixel_format() and
add an entry in camera_common.c:camera_common_color_fmts. But again, if you are getting
a /dev/video* and v4l2-ctl --all shows the right formats, you are beyond this.
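For illustration, the two changes were along these lines. This is a sketch for a YUV422 stream packed as UYVY; the exact struct layout, table contents, and pixel_t string depend on your L4T release, so check camera_common.h and sensor_common.c in your own source tree before copying anything.

```c
/* camera_common.c: add a media-bus/V4L2 format pair to the table
 * (sketch; field order per struct camera_common_colorfmt in our
 * camera_common.h -- only the added entry is new).
 */
static const struct camera_common_colorfmt camera_common_color_fmts[] = {
	{ MEDIA_BUS_FMT_SRGGB10_1X10, V4L2_COLORSPACE_SRGB,
	  V4L2_PIX_FMT_SRGGB10 },
	/* added for YUV422 8-bit packed as UYVY */
	{ MEDIA_BUS_FMT_UYVY8_1X16, V4L2_COLORSPACE_SRGB,
	  V4L2_PIX_FMT_UYVY },
};

/* sensor_common.c: teach extract_pixel_format() about the device tree
 * pixel_t string (the string name here is our choice).
 */
	else if (strncmp(pixel_t, "yuv_uyvy16", size) == 0)
		*format = V4L2_PIX_FMT_UYVY;
```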

Another thing to check is that media-ctl -p shows the proper connections between camera,
csi, and vi subdevices.

Cary

Hi, Cary.
Thanks for your tips; we can now get the /dev/video0 node. But I am not sure the device tree node parameters are correct.

The MIPI signal is a 1080p, 60 fps, YUV422 8-bit stream. How should I set these parameters in the device tree node?
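Our current attempt at the sensor mode node looks roughly like this. The values are our guesses from the stream timing (148.5 MHz pixel clock for 1080p60), and the property set varies between L4T releases, so please correct anything that is wrong:

```dts
mode0 {
	num_lanes = "4";
	tegra_sinterface = "serial_c";
	discontinuous_clk = "no";
	pixel_t = "yuv_uyvy16";
	active_w = "1920";
	active_h = "1080";
	line_length = "2200";
	mclk_khz = "24000";
	pix_clk_hz = "148500000";
	max_framerate = "60";
	embedded_metadata_height = "0";
};
```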
Here are the results of media-ctl -p and v4l2-ctl --all

nvidia@tegra-ubuntu:~$ sudo media-ctl -p
Media controller API version 0.1.0

Media device information
------------------------
driver          tegra-vi4
model           NVIDIA Tegra Video Input Device
serial          
bus info        
hw revision     0x3
driver version  0.0.0

Device topology
- entity 1: 150c0000.nvcsi-2 (2 pads, 2 links)
            type V4L2 subdev subtype Unknown flags 0
            device node name /dev/v4l-subdev0
	pad0: Sink
		<- "ov5693 2-0036":0 [ENABLED]
	pad1: Source
		-> "vi-output, ov5693 2-0036":0 [ENABLED]

- entity 2: ov5693 2-0036 (1 pad, 1 link)
            type V4L2 subdev subtype Sensor flags 0
            device node name /dev/v4l-subdev1
	pad0: Source
		[fmt:SRGGB10/1920x1080 field:none]
		-> "150c0000.nvcsi-2":0 [ENABLED]

- entity 3: vi-output, ov5693 2-0036 (1 pad, 1 link)
            type Node subtype V4L flags 0
            device node name /dev/video0
	pad0: Sink
		<- "150c0000.nvcsi-2":1 [ENABLED]


nvidia@tegra-ubuntu:~$ v4l2-ctl --all
Driver Info (not using libv4l2):
	Driver name   : tegra-video
	Card type     : vi-output, ov5693 2-0036
	Bus info      : platform:15700000.vi:2
	Driver version: 4.4.38
	Capabilities  : 0x84200001
		Video Capture
		Streaming
		Extended Pix Format
		Device Capabilities
	Device Caps   : 0x04200001
		Video Capture
		Streaming
		Extended Pix Format
Priority: 2
Video input : 0 (Camera 2: no power)
Format Video Capture:
	Width/Height      : 1920/1080
	Pixel Format      : 'RG10'
	Field             : None
	Bytes per Line    : 3840
	Size Image        : 4147200
	Colorspace        : sRGB
	Transfer Function : Default
	YCbCr Encoding    : Default
	Quantization      : Default
	Flags             : 

Camera Controls

                   frame_length (int)    : min=0 max=32767 step=1 default=1984 value=1984 flags=slider
                    coarse_time (int)    : min=2 max=32761 step=1 default=1978 value=1978 flags=slider
              coarse_time_short (int)    : min=2 max=32761 step=1 default=1978 value=1978 flags=slider
                     group_hold (intmenu): min=0 max=1 default=0 value=0
                     hdr_enable (intmenu): min=0 max=1 default=0 value=0
                       otp_data (str)    : min=0 max=1024 step=2 value='a0b9e7ecc1ffffffcc2a0f00c0ffffff20c22201c0ffffff0000000000000000c0b9e7ecc1ffffffac480f00c0ffffff1800000000000000000000000000000060bae7ecc1ffffff504c0f00c0ffffff184c0f00c0ffffff40effd00c0ffffff00000000000000000077b2eac1ffffff880803e6c1ffffff580803e6c1ffffffc80e03e6c1ffffffeaffffff000000002024b4ebc1ffffff18a202e6c1ffffffd8d73e01c0ffffff000000000000000070bae7ec00000000400000000000000030bbe7ecc1ffffff30bbe7ecc1fffffff0bae7ecc1ffffffc8ffffff0000000090bae7ecc1ffffff28121700c0ffffff30bbe7ecc1ffffff30bbe7ecc1fffffff0bae7ecc1ffffffc8ffffff0000000030bbe7ecc1ffffffc4fd7600c0ffffff01000000000000000077b2eac1ffffff30bbe7ecc1ffffff30bbe7ecc1fffffff0bae7ecc1ffffffc8ffffff0000000030bbe7ecc1ffffff30bbe7ecc1fffffff0bae7ecc1ffffffc8ffffff0000000060bbe7ecc1ffffffb0fd7600c0ffffff0100000000000000000000000000000000000000000000001500000000000000bc77b2eac1ffffff000000000000000040bbe7ecc1ffffff38097800c0ffffff80bbe7ecc1ffffff78087700c0ffffff2024b4ebc1ffffff180803e6c1ffffff580803e6c1ffffff0020c100c0ffffff88c92c01c0ffffff0024b4ebc1ffffff' flags=read-only, has-payload
                        fuse_id (str)    : min=0 max=16 step=2 value='4cdc7500c0ffffff' flags=read-only, has-payload
                           gain (int)    : min=256 max=4096 step=1 default=256 value=256 flags=slider
                    bypass_mode (intmenu): min=0 max=1 default=0 value=0
                override_enable (intmenu): min=0 max=1 default=0 value=0
                   height_align (int)    : min=1 max=16 step=1 default=1 value=1
                     size_align (intmenu): min=0 max=2 default=0 value=0
               write_isp_format (int)    : min=1 max=1 step=1 default=1 value=1

Hi Cary,

I am following up on this thread for my own purposes. Did you get this working with 2 different CSI streams (i.e. mapping to 2 different device nodes)? I have a similar situation where we have a pre-configured data stream coming from an FPGA, and I simply want to be able to capture data over CSI using V4L2. Is there any way you could share the snippets of your DTSI files that made this work? Did you use the OV5693 DTSI file (e3326 board), and were you able to extend this to support multiple input streams or “cameras”? And what changes do you need to make in the device driver to make sure that they don’t attempt to interface over i2c?

Any help would be greatly appreciated!

aruby