Low-latency camera for Jetson TX2

Hi dear friends,

If you have any measurements or information regarding real-time video input with Jetson cameras, would you share your results?

  • Input latency: ideally the camera should hold at most 1 frame internally, but the driver or camera SDK can also delay the input. If you have measured input lag for some camera, would you share the results?
  • ARM CPU load: ideally we receive frames via DMA and capture does not interfere with CPU or GPU processing (OK, apart from some DRAM bus load), but in reality the camera SDK or driver could be doing processing on the CPU. If you have any measurements, could you post them?
  • Frame rate: the higher, the better.
  • Latency jitter: ideally the input lag should be constant, but what happens in the real world?
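On the jitter point, here is a minimal sketch of how per-frame interval jitter could be summarized, assuming you can record an arrival timestamp for each frame (e.g. in a capture callback). The helper name and the example timestamps are hypothetical, not tied to any specific camera SDK:

```python
def jitter_stats(timestamps):
    """Summarize frame-arrival jitter from a list of per-frame timestamps (seconds).

    Returns (mean_interval, min_interval, max_interval, peak_to_peak_jitter).
    """
    # Inter-frame intervals between consecutive arrivals
    intervals = [b - a for a, b in zip(timestamps, timestamps[1:])]
    mean = sum(intervals) / len(intervals)
    return mean, min(intervals), max(intervals), max(intervals) - min(intervals)

# Example: a nominally 60 fps stream (16.7 ms period) where one frame
# arrives 5 ms late and the next one correspondingly early.
ts = [0.0, 0.0167, 0.0334, 0.0551, 0.0668]
mean, lo, hi, jitter = jitter_stats(ts)  # peak-to-peak jitter here is ~10 ms
```

With real hardware you would feed this the driver's capture timestamps rather than wall-clock times, so that display and USB scheduling do not pollute the numbers.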

I would appreciate any information about, or experience with, high-fps, low-latency video processing on Jetson.

Here is a post about the glass-to-glass latency of the onboard OV5693.

DaneLLL, thank you for the link!

But… such huge latency: 50-70 ms, several frames at 60 fps. Is that really true?

Maybe we can improve it somehow? Otherwise it undermines the Jetson's primary use case of real-time vision and control.

Hi @Hexagonal

You may find the following link interesting:

This wiki page is intended to be used as a reference for Tegra X1 (TX1) capture-to-display glass-to-glass latency using the simplest GStreamer pipeline. The tests were executed with the IMX274 camera sensor, for the 1080p and 4K 60 fps modes. The tests were done using a modified nvcamerasrc binary provided by Nvidia that reduces the minimum allowed value of the queue-size property from 10 to 2 buffers. This binary was built for JetPack 3.0 / L4T 24.2.1. Similar tests will be run on the TX2.
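For reference, the kind of capture-to-display pipeline such tests use can be sketched as below. This is only a sketch: the exact element names and caps depend on the L4T/JetPack release, and queue-size=2 requires the modified nvcamerasrc binary mentioned above.

```python
def capture_cmd(width=1920, height=1080, fps=60, queue_size=2):
    """Build a hypothetical gst-launch command string for a minimal
    camera-to-display pipeline on a Jetson (sketch, not a tested pipeline)."""
    return (
        "gst-launch-1.0 nvcamerasrc queue-size={q} ! "
        "'video/x-raw(memory:NVMM), width={w}, height={h}, framerate={f}/1' ! "
        "nvoverlaysink sync=false"
    ).format(q=queue_size, w=width, h=height, f=fps)

cmd = capture_cmd()
# On an actual Jetson you would run this string in a shell; sync=false on the
# sink avoids adding clock-synchronization delay on top of the capture latency.
```

The point of the smaller queue-size is simply fewer buffers between sensor and display, i.e. fewer frame periods of built-in latency.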

Also, it describes some reliable methods for measuring glass-to-glass latency.

I hope this information helps you!

Best regards

Thank you.

Wouldn’t pointing the camera at a screen that displays its own captured picture be sufficient to calculate latency?
No need for two displays or a separate PC.
(This assumes latency is a fixed multiple of the display refresh rate, which for the 60 Hz display/camera case probably is a reasonable assumption, especially when you care about “glass to glass” rather than “glass to model.”)
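The single-display method reduces to simple arithmetic: show a frame counter on the screen, film that screen, and read off the (older) counter value visible in the captured image. A sketch, with hypothetical counter values:

```python
def glass_to_glass_ms(displayed_counter, counter_seen_in_capture, fps=60.0):
    """Estimate glass-to-glass latency from the frame-counter difference
    between what the display currently shows and what the camera captured."""
    frames_behind = displayed_counter - counter_seen_in_capture
    return frames_behind * 1000.0 / fps

# e.g. the display is at frame 1204 while the captured image still shows 1200:
# 4 frames behind at 60 fps, i.e. roughly 67 ms of glass-to-glass latency.
latency_ms = glass_to_glass_ms(1204, 1200)
```

As noted above, this quantizes latency to whole frame periods, which is fine for glass-to-glass comparisons at matched 60 Hz display and camera rates.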

So currently all we know is:

RidgeRun can develop a driver for any sensor, but there is no information about which sensors are good; we can’t order drivers for all possible sensors and test them all, as that would be too expensive.

It’s a pity that the Jetson ecosystem is in such a miserable state, with no proven inexpensive camera solutions (of course the Ximea PCIe camera works well, for 1800 euros).
Compare this to the Texas Instruments days 20 years ago: nobody expected latency of more than 1 frame, yet now in 2018 it’s 6-8 frames on the best hardware from the best hardware company (NVIDIA). What happened?

Does anybody have any information about e-Con sensors? What latency do they have?

As I understand it, Leopard and e-Con cameras have working drivers for the TX2.
Interestingly, these drivers should work on the devkit, but will they work on an Auvidea board?

Not to blame anyone, but I think it was a big mistake on Nvidia's part not to support Raspberry Pi cameras on the Jetson DevKit from its introduction. Everybody already has Pi cameras; they are inexpensive and plentiful. IMHO, for many applications 50-60 ms latency is perfectly acceptable. Many people use UVC cameras on Jetson, where latency is 100+ ms.


You should assume that NOTHING will work on an Auvidea board. According to their official support response (as of 4/3/2018), CSI/MIPI cameras are not yet supported on the TX2 using Auvidea carriers, and no ETA is provided.

(I may have a slight chip on my shoulder when it comes to Auvidea, but the fact remains…)


RidgeRun has developed the following TX2 sensor drivers for MIPI devices compatible with specific Auvidea boards:

-) Toshiba TC358840 Linux driver for Tegra X1 and X2 (compatible with the Auvidea J130 board):

-) Sony IMX219 Linux driver for Tegra X1 and X2 (compatible with the Auvidea J100 and J20 boards):

@Hexagonal, we have been working with several MIPI sensors on both the TX1 and TX2 platforms; maybe we can help you find a camera sensor that meets your requirements and develop a driver for it.

I hope this information may be helpful for you.

It’s quite ironic, since Auvidea sold more than 100 J120 boards in just the last three months in the USA alone (via Mouser Elec). And based on the increased demand they raised their prices too ;-) You would think that some of that money could be put towards software/support for their products? I bet if they got a driver for the B102/B110 HDMI-to-CSI converters working with the TX2 and the latest L4T release, they would sell many of these converter boards to the same customers ;-)


100 J120 boards sold, minus manufacturing costs and distributor discount, doesn’t even pay for a single senior engineer for a month (at SF Bay Area rates, and I can’t imagine Germany is 10x cheaper…).

These providers aren’t targeting mass production with their up-front prices.

Their up-front prices are supposed to cover some amount of help from their application engineers, and help defray their initial R&D costs to get something working on whatever hardware at all.

The intention here is that you use those devices to develop, and then work with the vendor (or perhaps even the vendor’s vendor) to figure out what the cost would be for mass manufacture. Once you hit 100k units or more, prices will have nothing to do with the initial purchase price of a development sensor (and even at 10k it’s not all that comparable).

Yeah, I’m RMAing my J100 board as soon as it shows up. I may go with an Elroy or Sprocket carrier instead, assuming someone can confirm to me that those actually work.

Anyways, sorry, didn’t mean to derail the thread with my personal problems :-$

Hi All,

In general we have seen good performance with Leopard Imaging boards and ConnectTech boards, and support from both is good as well. We create our drivers to work with those boards too.


Awesome! Can you link the driver / instructions for the ConnectTech boards? Looking for Sprocket or Elroy, specifically.

Hi Philippe,

We have mainly used the Astro board from ConnectTech with the OV10640/OV490 GMSL camera from Leopard Imaging, as you can see in our wiki: 

I think that if you want to use free drivers, you will need to stick to the camera included with the Jetson board. If you want to use a smaller carrier board, you would need to buy the camera from Leopard Imaging (which also provides drivers for its cameras) or hire RidgeRun to create a custom driver for your custom camera. You can check with ConnectTech which other drivers are supported.

We offer the IMX219 driver for free, but only for an older JetPack 3.0 on the TX1, and it works with the Auvidea J106/J20/J100. We haven’t been hired to update those drivers to the TX2 and JetPack 3.2 yet.


Thanks David, I understand. It’s unfortunate that there doesn’t seem to be a very user-friendly ecosystem around the Jetson/camera integration story. One of my project’s requirements is, unfortunately, that the software has to be free (and ideally GPL).

Hi Philippe,

Sorry about that. The drivers are actually GPL, because the kernel code is GPL, but you would need to pay someone to develop them or develop the driver within your own company.


How much would it cost to update the IMX219 driver for the TX2, with ISP support?
Ideally, with support for at least two of them on boards that allow that?
And ideally with support for the J90 and J120 carriers?
(There ought to be a way to crossbar the carrier support with the camera support …)

I’m wondering whether we should all do a Kickstarter to get this off the ground…


We monitor and respond to questions on devtalk to support the community. If you need business level support and development, please visit the RidgeRun website and drop us a note. We are careful not to use devtalk to promote our business.