Jetson Nano 640 x 512 Thermal Camera

I have a FLIR CMOS thermal camera (640x512). I convert the pixels to CSI-2 format with a Lattice FPGA and then want to transmit them over the D-PHY layer to a Jetson Nano. I don't want to write code for the Jetson Nano; I can only configure it. But I have some questions that remain unclear.

  1. Does Jetson Nano have 640x512 pixel camera support?
  2. My camera works over UART, but I can convert it to I2C. How does the Jetson know a camera is connected? Does it look at the I2C lines?

hello stark456,

no, at the least you'll need a device tree implementation to describe the port bindings.

please consult the Jetson Nano Product Design Guide; it covers the camera connector, and you should send the MIPI signaling to the CSI interface.
I2C is used for device registration and communication; see also the reference camera drivers for details.
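To see what the Jetson actually finds on the camera I2C bus, you can probe it from a terminal. A sketch, assuming the i2c-tools package is installed; the bus number (6 here) and the 0x10 address are assumptions that vary by carrier board and sensor:

```shell
# Scan I2C bus 6 for responding devices; an IMX219 normally answers at 0x10.
# Bus number and address are assumptions -- check your carrier board documentation.
sudo i2cdetect -y -r 6

# Check whether a video node was registered for the sensor.
ls /dev/video*
```

If the sensor driver probed successfully, a `/dev/video0` node should exist.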

640x512 is a supported resolution, but what are the pixel formats?

  1. Hi again @JerryChang, thank you for your quick reply. My camera outputs in YUV 4:2:2 and RAW 8 formats.
    I couldn't see 640x512 in the nvgstcapture-1.0 options on the Jetson TX2, so I'm confused. I am sharing the chart below.
    Supported resolutions in case of CSI Camera
    (2) : 640x480
    (3) : 1280x720
    (4) : 1920x1080
    (5) : 2104x1560
    (6) : 2592x1944
    (7) : 2616x1472
    (8) : 3840x2160
    (9) : 3896x2192
    (10): 4208x3120
    (11): 5632x3168
    (12): 5632x4224
    I don't know the Jetson Nano and my time is limited, so I just want to bring the system up with a few commands. I don't have the experience or knowledge to write a driver on the Jetson side.
  2. Is there a simple command just to query whether the camera is present? I'm guessing the Jetson only checks for the camera's presence over its I2C connection.

hello stark456,

instead of nvgstcapture-1.0, you may use GStreamer with nvarguscamerasrc; the gst pipeline lets you set the image format from the command line.
for example,
$ gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM),width=1920, height=1080, framerate=30/1, format=NV12' ! nvoverlaysink -ev

that's incorrect. The Nano series uses the Raspberry Pi IMX219 as its default camera sensor.
you're able to [Configure Jetson Nano CSI Connector] through Jetson-IO to select different camera modules; however, that requires the matching device tree overlay (*.dtbo) to be present,
e.g. tegra210-p3448-all-p3449-0000-camera-imx219-dual.dtbo, tegra210-p3448-all-p3449-0000-camera-imx477-dual.dtbo, or tegra210-p3448-all-p3449-0000-camera-imx477-imx219.dtbo.
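As a sketch of the Jetson-IO flow mentioned above (the tool path is the usual L4T install location; verify it on your release):

```shell
# Launch the interactive Jetson-IO tool to pick a camera overlay;
# it applies the selected *.dtbo to the boot configuration.
sudo /opt/nvidia/jetson-io/jetson-io.py

# Reboot afterwards so the new device tree overlay takes effect.
sudo reboot
```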

I am getting MIPI CSI output from a TC358746 bridge IC and I need to connect this output to the Jetson Nano. I just want to watch the image as video; I don't need to do any processing.

  1. Is a special driver required for the TC358746? Can I use the drivers for IMX cameras?
  2. RidgeRun's driver for the TC358746 appears to be available, but there is no download link. Do you know whether this driver is paid? (V4L2 Drivers for NVIDIA Jetson | Linux V4L2 Driver | Ridgerun)
  3. You said that a device tree overlay (*.dtbo) must be present. I think it is difficult to write or configure this file for someone who does not know Linux at all, but if it's easy and can be handled just by changing some parameters, I can try to do it.

I have no experience with Linux or Jetson, so I would appreciate it if you could guide me to the easiest way to get the image.

hello stark456,

why don't you use a USB camera? It's plug-and-play.

please contact the camera partners; see also Jetson Partner Supported Cameras for cameras supported by Jetson Camera Partners on the Jetson platform.

Unfortunately I can’t change the camera.

hello stark456,

it'll need a driver implementation.
please refer to the tutorials page; you may check the V4L2 Sensor Driver Development Tutorial for reference.

@JerryChang Well, if I implement an I2C slave on the FPGA side without using a bridge and set its address like an IMX sensor's, can we fool the Jetson? Because I don't have the knowledge or time to write a driver myself.

hello unknown987,

it should work in theory.
please also check the reference camera driver device tree, tegra210-camera-rbpcv2-imx219.dtsi; you must use the same port binding and the same I2C address.
however, as you can see, there's a compatible property in the device tree.

                i2c@546c0000 {
                        imx219_single_cam0: rbpcv2_imx219_a@10 {
                                compatible = "nvidia,imx219";
                                /* … */
                        };
                };

on the low-level kernel side, this property is used to map the device to its sensor driver:

static const struct of_device_id imx219_of_match[] = {
        { .compatible = "nvidia,imx219", },
        { },
};

in other words, you should at least have device tree changes.
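If you end up editing the device tree yourself, a compiled overlay can be produced with the standard device tree compiler. A minimal sketch, where my-camera-overlay.dts is a hypothetical source file containing your changed port bindings:

```shell
# Compile a device tree source into an applicable overlay blob.
# The -@ flag keeps symbol information so it can be applied as an overlay.
dtc -@ -O dtb -o my-camera-overlay.dtbo my-camera-overlay.dts
```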

Can you briefly take a look at this link? It simulates an IMX219 on the FPGA side without any sensor. There is actually no camera, and it worked.
The configuration is as follows. Where exactly does this configuration come from? Which command or tool generates it?

mode0 { /* IMX219_MODE_3264x2464_21FPS */
        mclk_khz = "24000";
        num_lanes = "2";
        discontinuous_clk = "yes";
        active_w = "3264";
        active_h = "2464";
        pixel_t = "bayer_rggb";
        line_length = "3448";
        pix_clk_hz = "182400000";
        min_framerate = "2000000"; /* 2.0 fps */
        max_framerate = "21000000"; /* 21.0 fps */
        min_exp_time = "13"; /* us */
        max_exp_time = "683709"; /* us */
        embedded_metadata_height = "2";
};

hello unknown987,

they are sensor hardware settings; you may check the sensor spec to understand them.
the driver side uses the device tree to retrieve everything: hardware-specific settings, sensor-specific settings, V4L2 media controller settings, and camera platform settings.
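Several of these mode values are tied together by simple timing arithmetic, which is worth checking whenever you edit them: the line rate is pix_clk_hz / line_length, and dividing that by the active frame height gives the achievable frame rate. Using the mode0 numbers above:

```shell
# pix_clk_hz / line_length = lines per second; divide by frame height for fps.
pix_clk_hz=182400000
line_length=3448
frame_height=2464
fps=$(( pix_clk_hz / line_length / frame_height ))
echo "max framerate ~ ${fps} fps"   # consistent with max_framerate = "21000000" (21.0 fps)
```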

@JerryChang Link : CSI-2 Image Simulator Lattice CrosslinkFPGA to Jetson Nano -

hello unknown987,

what’s the main question here?

Hello,
In the link I shared above (CSI-2 Image Simulator Lattice CrosslinkFPGA to Jetson Nano), someone sent an image to the Nano without any image sensor. For this, he says:

"Before you boot up the Jetson Nano, the Lattice CrossLink board NEEDS to be connected and programmed. At bootup the Nano searches the I2C bus for connected image sensors from its list of image sensors. If it doesn't find anything at bootup you won't be able to launch the sensor (real or simulated).

If these imx219 messages occur on the screen when you boot up, then you have issues. Once the Nano has booted, open a terminal and run

 DISPLAY=:0.0 gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), width=3280, height=2464, format=(string)NV12, framerate=(fraction)20/1' ! nvoverlaysink -e

This will cause the Nano to write to the lattice to start streaming and hopefully images should appear on the screen."

My camera has a resolution of 640x512 and a frame rate of 50 Hz. If I change these values in the command above, can I send my image too?

hello unknown987,

please make sure you have the MIPI signaling going to the CSI brick, and please go ahead and test the stream.
you'll need to tune the settings to enable the stream; this is life.
please see also the reference HDMI-to-CSI chip driver, tc358840.
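A low-level way to test the stream before involving nvarguscamerasrc is the V4L2 capture path. A sketch, assuming the sensor registered as /dev/video0 and advertises the RG10 format at the IMX219 mode resolution; all of those are assumptions to verify on your setup:

```shell
# Capture 100 frames straight from the V4L2 node, bypassing the ISP.
# Device node, pixel format, and resolution are assumptions for this setup.
v4l2-ctl -d /dev/video0 \
  --set-fmt-video=width=3280,height=2464,pixelformat=RG10 \
  --set-ctrl bypass_mode=0 \
  --stream-mmap --stream-count=100
```

If this reports a steady fps, the CSI capture path works and any remaining problem is on the ISP/argus side.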


Hi, last time we talked I was going to try to use a different camera sensor as if it were the IMX219. For this, you suggested nvarguscamerasrc. But my camera outputs 8- or 14-bit CMOS data and the YUV 4:2:2 pixel format. Does this plugin support those formats? Or is there another one that does?

hello unknown987,

for a YUV camera sensor, you may enable applications using GStreamer with the V4L2 Source Plugin.
for example,
$ gst-launch-1.0 -v v4l2src device=/dev/video1 ! video/x-raw,framerate=30/1,width=640,height=480 ! xvimagesink

please see also the developer guide section Approaches for Validating and Testing the V4L2 Driver.
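Adapted to the 640x512 / 50 Hz thermal camera discussed earlier, the v4l2src pipeline might look like this; the device node and the UYVY fourcc are assumptions, so check what formats the node actually advertises before relying on them:

```shell
# Display a 640x512 YUV 4:2:2 stream at 50 fps from a V4L2 node.
# /dev/video0 and the UYVY format are assumptions for this setup.
gst-launch-1.0 -v v4l2src device=/dev/video0 ! \
  'video/x-raw,format=UYVY,width=640,height=512,framerate=50/1' ! \
  videoconvert ! xvimagesink
```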

What about RAW 8? I thought the IMX219 gives RAW 8 output.

hello unknown987,

no, that's incorrect; the Raspberry Pi v2 camera, the IMX219, outputs RAW10, with the RG10 pixel format.
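You can confirm the exact pixel formats a registered sensor exposes, rather than guessing, by querying the node with v4l2-ctl; for example:

```shell
# List every pixel format and frame size the sensor driver advertises.
v4l2-ctl -d /dev/video0 --list-formats-ext
```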