Openpilot Advanced Driver-Assistance System (ADAS) on Nvidia Xavier NX

Hi all,

I’m here to share a project I’ve been working on over the last couple of months: running comma.ai’s openpilot on a Jetson Xavier NX.

openpilot is an open-source ADAS solution by comma.ai (founded by George Hotz) and is rated No. 1 by Consumer Reports.

This build is based on dragonpilot, an openpilot fork that I’ve been maintaining for over 2 years.

Link:

Demo:

Feel free to comment or ask any questions :)

cheers,

Rick


Excuse my ignorance, but I am not getting what this system does. I watched both YouTube videos and I could not figure it out. Thanks

Hi @mhm_nab,

Sorry I should have explained a bit more :)

openpilot is an L2+ ADAS, meaning it provides Lane Centering Assist (LCA), Lane Departure Warning (LDW), Adaptive Cruise Control (ACC), and Driver Monitoring for supported cars.

The videos don’t really showcase everything due to my car’s limitations, but if you search “openpilot” you will see lots of other videos (such as “$7,000 Tesla Autopilot vs $1,000 Openpilot: Self-Driving Test!” on YouTube).

cheers,

Rick


Thanks Rick. Now it is clear. Great work.

I noticed that the steering happens automatically. Is that correct? Also, what happens when you click a couple of times on the screen and the colour for LCA changes between green and red?


First of all, please keep in mind that the video example is running a customised version of openpilot (it adds tons of features).

Stock openpilot has two modes you can choose from: lane line or laneless.
The lane line mode uses lane lines to predict the drive path. (green)
The laneless mode uses end-to-end drive path prediction. (red)

With the customised version you can change the mode on the fly, and there is an “AUTO” mode that switches to laneless automatically when the lane line prediction confidence falls below a certain threshold.

Laneless mode is better at predicting the drive path when there are no lane line markings.
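To make the AUTO switching concrete, here is a minimal Python sketch of the idea. The function name and the threshold value are hypothetical, not taken from the actual dragonpilot code:

```python
def choose_mode(lane_line_prob: float, threshold: float = 0.5) -> str:
    """Pick the path-prediction mode from the model's lane-line confidence.

    `threshold` is a placeholder; the real fork tunes its own value.
    """
    # Low confidence in the lane lines -> fall back to end-to-end (laneless).
    return "laneless" if lane_line_prob < threshold else "lane_line"

# Faded or missing markings give low confidence, so AUTO goes laneless:
print(choose_mode(0.2))  # laneless
print(choose_mode(0.9))  # lane_line
```

The point of the threshold is simply hysteresis-free fallback: whenever the lane-line model is unsure, the end-to-end prediction takes over.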


Hi @ricklan_nvidia, this is super cool!

I’m curious: what hardware do you perform model inference on? (CPU/GPU/DLA)?

Hi @tomasz_lewicki ,

Thanks for your comment! I’m no expert on this, but I have explicitly set it to use CUDAExecutionProvider if it’s available (and it does select CUDAExecutionProvider :) )
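For context, ONNX Runtime takes an ordered list of execution providers and falls back down the list if one is unavailable. A small sketch of that preference logic (the provider names are real ONNX Runtime identifiers; the helper function itself is just for illustration):

```python
def pick_providers(available):
    """Return the execution providers to request, best first.

    Mirrors the common ONNX Runtime pattern of preferring CUDA and
    falling back to CPU; `available` would come from
    onnxruntime.get_available_providers() on a real system.
    """
    preferred = ["CUDAExecutionProvider", "CPUExecutionProvider"]
    return [p for p in preferred if p in available]

# On a Xavier NX with the CUDA provider built, CUDA comes first:
print(pick_providers(["CPUExecutionProvider", "CUDAExecutionProvider"]))
```

The resulting list would then be passed as the `providers` argument to `onnxruntime.InferenceSession`, which initialises the first provider that actually works on the device.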


Hi Rick,

Please provide a hardware connection block diagram between the Jetson Xavier NX, comma.ai panda, camera, and car.

I have a white panda. What code customization is required?

Can I use a Raspberry Pi v2 camera instead of the Arducam IMX477 camera?

I am also interested to try it out.

Thanks,
sind-hem

Hi @sind-hem ,

Sorry for the late reply; I had trouble signing in over the last few days :(

Please see the diagram below:

If you only have a white panda, then you probably need a giraffe so it can sit between the ADAS camera and the car to intercept and modify the control messages.

You can still do it without a giraffe, but you will need to figure out the wiring yourself.

Stock openpilot no longer supports the white panda, so you will need to modify selfdrive/boardd/panda.cc to bypass the check. I’ve already done this in xnxpilot, but I’ve only tested it on the bench so I can’t guarantee anything.

Any camera should do as long as it’s 1080p or above. I’ve tried the IMX219 and IMX477, but I ended up keeping the IMX477 because of its lens. The one on the IMX219 doesn’t have enough FoV (I think), which caused lots of calibration issues.
