Getting Started with Autonomous Driving using Jetson Xavier (Solved)

Here’s a link to my video showing Jetson Xavier driving a vehicle in Unreal Engine using Project Redtail, ArduRover SITL, AirSim, and ROS.

The vehicle is being controlled by the Redtail neural network with only a single camera stream from UE4 as input.

For those who are interested in trying the project, I’m finishing up the wiki doc with setup and usage instructions. I also have a couple more files to push to my PR on the Redtail GitHub site. I’ll update later today when it’s ready for testing.

Now turn up the fog and place the sun right in the middle of the horizon!

The cool thing with simulated environments is that you can easily do things like that.

The bad thing is that most such environments don’t have good simulations of the actually-hard behaviors, like pedestrians, road crews, broken traffic lights, bicyclists crossing the road and so forth…

This is fantastic. Thank you for your efforts. I’ve been meaning to replicate this project onto one of my drone platforms.

This UE4 environment has a few different vehicles: the AirSim SUV and quadcopter, plus an off-road buggy and a golden eagle. Flying vehicles can be controlled by joystick or API; land vehicles by keyboard or API. The sun position and lots of other things can be set with the AirSim settings.json file.
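For example, the sun position can be controlled by enabling AirSim’s time-of-day simulation in settings.json. Here’s a minimal sketch using the standard TimeOfDay keys; the date/time and clock-speed values are just placeholders to tune for your scene:

```json
{
  "SettingsVersion": 1.2,
  "SimMode": "Car",
  "TimeOfDay": {
    "Enabled": true,
    "StartDateTime": "2018-06-21 12:00:00",
    "CelestialClockSpeed": 1,
    "UpdateIntervalSecs": 60,
    "MoveSun": true
  }
}
```

With MoveSun enabled, AirSim computes the sun position from the simulated date and time, so a StartDateTime near noon puts the sun high overhead, while an early-morning or late-evening hour puts it near the horizon.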

Here’s a link to the Redtail environment for Win64.

The Win64 build includes NVIDIA VXGI and runs at a high frame rate at 4K on an RTX 2080. It should be OK at 1080p on a GTX 1070 or better.

If you want to fly or drive in VR with an HTC Vive, make sure SteamVR is running first, then launch the .exe. Once the sim has loaded, open the console by pressing “~” and type vr.bEnableStereo 1

This is the Linux binary; it runs much slower, so you’ll need to run it at a lower resolution even with a fast GPU.

Here’s an incomplete rough draft of the wiki for anyone who wants to get started setting it up with the Xavier and Redtail. I’ll try to get it finished later tonight.

Is the view shown what the Jetson Xavier sees?

I’m hoping that it’s different because it’s unfair to even a human driver to have the buggy in the way of the view. :-)

No, that’s not the view the neural network sees. You can switch the views shown on screen, and you can also change the camera parameters published by the ROS node.

Press “I” in game to get the front center camera view. That’s what the neural network is seeing, and that’s usually what I like for driving or flying.

You can press “F1” to get a list of AirSim commands.

It’s a development build, so you can access the console with “~” to enter UE4 console commands to change render settings, etc.

In the video, the front cameras are too low on the vehicle for driving in an outdoor environment. In the binaries I posted above, the cameras are moved 1 meter higher than they were in the recorded video. Moving the cameras higher improved the driving performance of the neural network since they are now around the height the network was trained at. Previously too much of the view was being obstructed by grass and other plants.
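The camera height can be adjusted in settings.json as well. This is a minimal sketch assuming AirSim’s per-vehicle camera settings schema; the vehicle name “Car1” and the exact offsets are placeholders, not values from the project. Note that AirSim uses NED coordinates, so raising a camera 1 meter means subtracting 1 from its Z value:

```json
{
  "SettingsVersion": 1.2,
  "SimMode": "Car",
  "Vehicles": {
    "Car1": {
      "VehicleType": "PhysXCar",
      "Cameras": {
        "front_center": {
          "X": 1.5, "Y": 0.0, "Z": -1.0,
          "Pitch": 0.0, "Roll": 0.0, "Yaw": 0.0
        }
      }
    }
  }
}
```

A more negative Z lifts the camera clear of grass and other low obstructions, closer to the height the Redtail network saw during training.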

Thanks for the info, and for your work on this project. I can’t wait to try it out.

I have a rover that I built using the JetsonHacks tutorials, so this is definitely something that will come in handy for training a network to use with it.

It’s currently built around the TX2, but I plan on replacing that with the Xavier.

This is great. I translated this project into Japanese. Thank you for your effort.