Developing your own camera module for Jetson TX1

Hi Guys,

Sorry if this has been asked before but I haven’t yet found material that quite answers my questions.

I am an experienced programmer in embedded systems, but I must admit that my experience with Linux is limited and I am new to the Jetson platform.

The TX1 platform is ideal for our needs but sadly (as far as I can tell) the camera that is included with it is not. Specifically, we need high frame rate photography that is synchronised with flashing LEDs. We then need GPU code to operate on sets of these images to produce a lower frame rate stream of images.

Can anyone briefly outline the steps necessary to go from designing our own camera to writing code that operates on the image data in Linux? I am comfortable with the PCB design and interfacing it all; it's mostly the software domain where my knowledge gets fuzzy, but there may also be special or non-obvious requirements on the Jetson's interfaces that I haven't appreciated.

My thought for synchronising the LEDs with the capture was to run the sensor in a slave mode and have a microcontroller local to the sensor board trigger both the exposures and the LEDs. The only thing I really need to do then is make sure I can catch the data coming from the sensor once an exposure is complete (and perhaps do a little fiddling to make sure I know which image is the first of a 'set').

At this stage I don’t think I need the absolute most optimal solution as the GPU is far in excess of the processing power we require, but who knows where we might try to go?

Any thoughts on the feasibility of this and things to watch out for are greatly appreciated. I’m facing a wall of information I’m quite prepared to start digging through, but I think the chance of success is far greater if people can point me in the right direction.

Thanks guys,



If you look in the “Video for Linux User Guide” section of the NVIDIA Tegra Linux Driver Package Development Guide I believe you’ll find the documentation necessary to add your own camera.
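Once your driver is in place and the sensor registers as a V4L2 device, a quick sanity check from user space might look something like the following. This is only a sketch: the device node, resolution, and pixel format are assumptions and will depend on your sensor and driver.

```shell
# List the video devices the kernel has registered
v4l2-ctl --list-devices

# Show the formats the new sensor driver advertises
v4l2-ctl -d /dev/video0 --list-formats-ext

# Capture 100 raw frames to a file (format/resolution are placeholders)
v4l2-ctl -d /dev/video0 \
    --set-fmt-video=width=1280,height=720,pixelformat=UYVY \
    --stream-mmap --stream-count=100 --stream-to=frames.raw
```

If frames arrive here, the kernel side is working and you can move on to GStreamer or your own user-space capture code.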

GStreamer provides a videorate element you can use to control the frame rate of your video stream. If your sensor has a strobe or flash output you can use that to signal the LED turn-on.
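As an illustration, a minimal pipeline using videorate might look like this (device node, resolution, and target frame rate are all assumptions to adapt to your setup):

```shell
# Capture from the sensor, then let videorate drop/duplicate frames
# to produce a fixed 30 fps output stream (fakesink just discards it)
gst-launch-1.0 v4l2src device=/dev/video0 \
    ! video/x-raw,width=1280,height=720 \
    ! videorate \
    ! video/x-raw,framerate=30/1 \
    ! fakesink
```

In a real application you would replace fakesink with your processing element or an appsink feeding your GPU code.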

A detail I should probably have included: it is highly likely the exposure time will need to change depending on which LED is illuminated, possibly by a large amount. Is the current plug-and-play stuff up to coping with this?