I’ve had my Jetson for a little while now - I have it set up using Grinch, and I can connect to it remotely over my home network. Now it’s time to start using it for what I bought it for!
I thought I’d start a topic on what can be done with the Jetson in the area I am interested in - autonomous vehicles. My background is in aircraft structures engineering rather than coding, but I have dabbled in C++ and Python before and am ready to learn!
I’d like to use my Jetson as part of an autonomous vehicle architecture. The task I have taken on: a standard system, made up of a number of components, that can be put into any vehicle regardless of type - land, sea, or air. I see the Jetson handling high-level functions like vehicle management, navigation (including delivering commands to the autopilot device), and collision avoidance, using onboard sensors to build up a virtual volume to work within.
My current knowledge gap is how I would organise the heavier-duty task of taking in sensor data (such as LIDAR and an IR camera) and processing it to create a usable volume within which algorithms can work out optimal paths to goals and around ‘pop-up’ obstacles. Can I run this whole process on the GPU? Or is it the case that, using CUDA, ‘batches’ of data are sent to the GPU for processing and the results are sent back to the process running on the CPU?
I guess that’s the current limit of my knowledge - how do the code and the board hardware co-exist?
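To make my question concrete, here is a minimal sketch of the batch round trip as I understand it (please correct me if this mental model is wrong). The `classifyRanges` kernel is entirely made up - it just flags any range reading closer than a threshold, standing in for real LIDAR processing:

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Toy kernel: mark each range reading as an "obstacle" if it is
// closer than a threshold. A stand-in for real point-cloud work.
__global__ void classifyRanges(const float* ranges, int* obstacle,
                               int n, float threshold)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        obstacle[i] = (ranges[i] < threshold) ? 1 : 0;
}

int main()
{
    const int n = 1024;            // one "batch" of sensor readings
    const float threshold = 2.0f;  // metres, arbitrary for the sketch

    // 1. CPU side: gather a batch of sensor data (faked here).
    float h_ranges[n];
    int   h_obstacle[n];
    for (int i = 0; i < n; ++i)
        h_ranges[i] = (i % 10) * 0.5f;

    // 2. Copy the batch into GPU memory.
    float* d_ranges;
    int*   d_obstacle;
    cudaMalloc(&d_ranges,   n * sizeof(float));
    cudaMalloc(&d_obstacle, n * sizeof(int));
    cudaMemcpy(d_ranges, h_ranges, n * sizeof(float),
               cudaMemcpyHostToDevice);

    // 3. Launch the kernel: many threads chew through the batch in parallel.
    int block = 256;
    int grid  = (n + block - 1) / block;
    classifyRanges<<<grid, block>>>(d_ranges, d_obstacle, n, threshold);

    // 4. Copy the results back; CPU-side planning code takes over here.
    cudaMemcpy(h_obstacle, d_obstacle, n * sizeof(int),
               cudaMemcpyDeviceToHost);

    printf("reading %.1f m -> obstacle=%d\n", h_ranges[1], h_obstacle[1]);

    cudaFree(d_ranges);
    cudaFree(d_obstacle);
    return 0;
}
```

From what I’ve read, the Jetson is a bit special here: the Tegra CPU and GPU share the same physical memory, so some of these explicit copies can apparently be avoided with zero-copy or unified memory - something I’d love to hear more about from people who have used it.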
I hope this can spark a bit of discussion, or if I’m totally in the wrong place, someone can show me the way!