Autonomous Vehicle Prototype

Good morning:

I am starting to learn about this technology and would like to know whether it is possible to develop a prototype robot that could travel between the office cubicles, avoiding obstacles and taking photos at predetermined points, and store those photos in a database on a local server at the office. I would also like the robot to return to a specific point to recharge its batteries and then resume from the last point it had reached.

I have a lot of questions and would appreciate your help with the answers:

Is the development of this project feasible?
How complicated would it be to develop such an application with NVIDIA Jetson products?
Would the NVIDIA Jetson Nano be a good option to start with?
Could I use the JetBot to start?
What peripherals do I need to start?
How could I establish the route that the robot should follow?
How could I establish the recharge point?
Will it be easy to migrate this development to other NVIDIA hardware in the future without problems?

Thank you very much in advance for your comments and all your help.

Hi jcotof,

I suggest studying the relevant technology on the web pages below to get some ideas for your project:
[url]https://developer.nvidia.com/embedded/learn/getting-started-jetson[/url]
[url]https://developer.nvidia.com/embedded/learn/tutorials[/url]

Thanks

Yes.

No more complicated than other robotics projects. If you’ve never built a robot before, this will be very complicated. If you’ve built other robots, it would be another turn of that crank.

Assuming you get the right sensors and the right actuators and have a good mechanical platform for the robot, the NVIDIA Jetson Nano would be a very good choice.

Yes.

[quote]What peripherals do I need to start?[/quote]

You’d want a beefier robot chassis than the JetBot for this to be robust in a commercial environment.
Maybe something like the TurtleBot 2i from Interbotix?
Note that you’d want to swap out the computer it ships with for the Jetson Nano.

Another option is to roll your own sensor package, for example based on the Stereolabs ZED or ZED Mini, the RPLidar S1, and some HC-SR04 ultrasonic sensors.
You’d probably want to look into the ROS project (Robot Operating System) because it has a lot of the mapping/navigation code already written, and has been ported to the Jetsons.
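
To give a feel for how little glue code a ROS-based setup needs, here is a minimal sketch of a node that stops the base when the lidar sees something close. It assumes a ROS 1 install (Melodic/Noetic) with rospy, a lidar driver publishing sensor_msgs/LaserScan on /scan, and a base driver listening on /cmd_vel; the topic names and the 0.5 m threshold are assumptions, not part of the original post.

[code]
#!/usr/bin/env python
# Minimal sketch: stop the base when the lidar sees an obstacle within 0.5 m.
# Assumes ROS 1 (Melodic/Noetic), a lidar publishing LaserScan on /scan,
# and a base driver subscribed to /cmd_vel -- adjust names for your robot.
import rospy
from sensor_msgs.msg import LaserScan
from geometry_msgs.msg import Twist

cmd_pub = None

def scan_callback(scan):
    # Keep only readings inside the sensor's valid range.
    valid = [r for r in scan.ranges if scan.range_min < r < scan.range_max]
    cmd = Twist()
    if valid and min(valid) > 0.5:
        cmd.linear.x = 0.2  # creep forward at 0.2 m/s when the way is clear
    cmd_pub.publish(cmd)    # zero velocity (stop) otherwise

if __name__ == "__main__":
    rospy.init_node("obstacle_stop_demo")
    cmd_pub = rospy.Publisher("/cmd_vel", Twist, queue_size=1)
    rospy.Subscriber("/scan", LaserScan, scan_callback)
    rospy.spin()
[/code]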

That depends on your particular software stack. In ROS you’d first drive the robot around to build a map (SLAM), and then define waypoints or paths on that map for the robot to follow. There are modules in ROS that let the robot temporarily deviate from a path if a particular path is blocked.
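
To make the waypoint part concrete, below is a hedged sketch of a patrol script that sends a list of goals to the ROS navigation stack one at a time. It assumes move_base is already configured and running with your map; the waypoint coordinates and node name are placeholders.

[code]
#!/usr/bin/env python
# Sketch only: assumes a working ROS navigation stack (move_base) and a map
# already built with a SLAM package. Waypoint coordinates are placeholder values.
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

WAYPOINTS = [(1.0, 0.5), (3.2, 0.5), (3.2, 2.0)]  # x, y in the map frame

def send_goal(client, x, y):
    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = "map"
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = x
    goal.target_pose.pose.position.y = y
    goal.target_pose.pose.orientation.w = 1.0  # keep a neutral heading
    client.send_goal(goal)
    client.wait_for_result()

if __name__ == "__main__":
    rospy.init_node("waypoint_patrol")
    client = actionlib.SimpleActionClient("move_base", MoveBaseAction)
    client.wait_for_server()
    for x, y in WAYPOINTS:
        send_goal(client, x, y)
        # This is where you would trigger the camera and upload the photo
        # to the office server before moving on to the next point.
[/code]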

This is actually among the harder parts. The easiest option is probably a wireless inductive charging system, but all the cheap ones are made for small things like cell phones. There are some Qi-based systems that go up to 40 W; they may be sufficient to power a reasonable LiPo battery charger?
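
Whatever charging hardware you pick, the robot also needs to decide when to head for the dock. Here is a minimal sketch, assuming an INA219 voltage/current monitor on the battery wired to the Nano’s I2C pins and the Adafruit CircuitPython driver running via Blinka; the cutoff voltage and dock location are placeholders.

[code]
# Sketch only: assumes an INA219 monitor on the battery, wired to the Jetson's
# I2C bus, with the adafruit-circuitpython-ina219 driver installed via Blinka.
# The voltage cutoff and dock coordinates are placeholder values.
import time
import board
import busio
from adafruit_ina219 import INA219

LOW_VOLTAGE = 10.5    # volts; choose a safe cutoff for your own battery pack
DOCK_XY = (0.0, 0.0)  # where the charging pad sits in the map frame

i2c = busio.I2C(board.SCL, board.SDA)
ina = INA219(i2c)

while True:
    if ina.bus_voltage < LOW_VOLTAGE:
        # Hand off to the navigation stack here, e.g. send a move_base goal
        # to DOCK_XY, and pause the patrol until the pack is charged again.
        print("Battery low (%.2f V), returning to dock" % ina.bus_voltage)
        break
    time.sleep(5)
[/code]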

The high-level vision software is reasonably easy to migrate, since the other Jetsons have higher shader model support than the Nano.
The low-level peripherals are harder; the GPIO register mapping and other such low-level details are quite different on the Jetson AGX Xavier than on the Nano.
If you only use the high-level driver APIs, then migrating is easier, but you’d still need to re-do the device tree to be able to talk to the right peripherals.

Another option is to use an intermediate microcontroller for the SPI / I2C / UART interfaces, talk to that microcontroller over USB serial from the Nano, and use the Nano entirely for camera work and high-level control. This is the approach I’ve taken with my robots: I use a Teensy microcontroller for I/O and a Jetson for the higher-level planning and vision.
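
For what that hand-off can look like on the Nano side, here is a minimal sketch using pyserial. The device path, baud rate, and the READ_SONAR command are hypothetical; the actual protocol is whatever you define in the Teensy firmware.

[code]
# Sketch only: assumes the Teensy enumerates as /dev/ttyACM0 and that its
# firmware implements a simple newline-terminated text protocol of your own
# design. The READ_SONAR command and reply format are hypothetical.
import serial

with serial.Serial("/dev/ttyACM0", 115200, timeout=1.0) as port:
    port.write(b"READ_SONAR\n")             # ask the microcontroller for a reading
    reply = port.readline().decode().strip()
    print("Sonar distance:", reply)         # e.g. "123" (centimetres), per your protocol
[/code]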