Yes.
No more complicated than other robotics projects. If you've never built a robot before, it will seem very complicated; if you've built other robots, it's another turn of the same crank.
Assuming you get the right sensors and the right actuators and have a good mechanical platform for the robot, the NVIDIA Jetson Nano would be a very good choice.
Yes.
[quote]What peripherals do I need to start?[/quote]
You'd want a beefier robot chassis than the JetBot for this to be robust in a commercial environment.
Maybe something like the Turtlebot 2i from Interbotix?
Note that you'd want to swap out the computer it ships with for the Nano.
Another option is to roll your own sensor package, for example based on the Stereolabs ZED or ZED Mini, the RPLidar S1, and some HC-SR04 ultrasonic sensors.
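As a rough sketch of what the ultrasonic part of such a package looks like on the Nano, here's an HC-SR04 read loop using the Jetson.GPIO library. The pin numbers are assumptions for your particular wiring, and note the sensor's 5V echo output needs level-shifting down to 3.3V before it touches a Nano pin:

```python
import time
import Jetson.GPIO as GPIO

TRIG = 11  # board pin driving the sensor's trigger line (assumed wiring)
ECHO = 13  # board pin reading the echo line (level-shifted 5V -> 3.3V!)

GPIO.setmode(GPIO.BOARD)
GPIO.setup(TRIG, GPIO.OUT, initial=GPIO.LOW)
GPIO.setup(ECHO, GPIO.IN)

def read_distance_cm():
    # A 10 microsecond pulse on TRIG starts one measurement
    GPIO.output(TRIG, GPIO.HIGH)
    time.sleep(10e-6)
    GPIO.output(TRIG, GPIO.LOW)

    # ECHO then goes high for a time proportional to the distance
    timeout = time.time() + 0.1
    while GPIO.input(ECHO) == GPIO.LOW and time.time() < timeout:
        pass
    start = time.time()
    while GPIO.input(ECHO) == GPIO.HIGH and time.time() < timeout:
        pass
    elapsed = time.time() - start

    # Sound travels ~343 m/s; halve for the round trip
    return elapsed * 34300.0 / 2.0

try:
    while True:
        print("distance: %.1f cm" % read_distance_cm())
        time.sleep(0.5)
finally:
    GPIO.cleanup()
```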
You’d probably want to look into the ROS project (Robot Operating System) because it has a lot of the mapping/navigation code already written, and has been ported to the Jetsons.
That depends on your particular software stack. In ROS you'd survey the space and build a map, then draw lines on the map for where the robot should pathfind/move. There are modules in ROS that let the robot temporarily deviate from its route if a particular path is blocked.
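To make that concrete, here's a minimal ROS 1 sketch that sends the robot one goal pose, assuming the standard move_base navigation stack is running with a map loaded; move_base handles the global planning and the local deviation around obstacles. The goal coordinates are illustrative:

```python
# Send one navigation goal to move_base via the actionlib interface.
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

rospy.init_node("send_goal")
client = actionlib.SimpleActionClient("move_base", MoveBaseAction)
client.wait_for_server()

goal = MoveBaseGoal()
goal.target_pose.header.frame_id = "map"   # pose expressed in the map frame
goal.target_pose.header.stamp = rospy.Time.now()
goal.target_pose.pose.position.x = 2.0     # assumed target: 2 m along map x
goal.target_pose.pose.orientation.w = 1.0  # facing along +x

client.send_goal(goal)
client.wait_for_result()
rospy.loginfo("result state: %d", client.get_state())
```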
This is actually among the harder parts. Probably easiest is to use a wireless inductive charging system, but all the cheap ones are for small things like cell phones. There are some Qi-based systems that go to 40W; they may be sufficient to power a reasonable LiPo battery charger?
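For a sanity check on whether 40W is enough, the back-of-envelope arithmetic looks roughly like this; every figure here is an assumption for illustration, not a measurement:

```python
# Back-of-envelope: is a 40 W Qi pad enough for a robot LiPo pack?
qi_watts = 40.0          # advertised pad output
efficiency = 0.80        # assumed coil + charger losses
pack_voltage = 12.6      # 3S LiPo at full charge
pack_capacity_ah = 10.0  # assumed pack size

charge_amps = qi_watts * efficiency / pack_voltage   # ~2.5 A
hours_to_full = pack_capacity_ah / charge_amps       # ~4 h, ignoring taper

print("charge current: %.1f A, full in about %.1f h" % (charge_amps, hours_to_full))
```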
The high-level vision software is reasonably easy to migrate; the other Jetsons support a higher shader model than the Nano, so code written for the Nano generally carries over.
The low-level peripherals are harder: the GPIO register mapping and other such low-level details are quite different on the Jetson AGX Xavier than on the Nano.
If you only use the high-level driver APIs, migrating is easier, but you'd still need to redo the device tree to talk to the right peripherals.
Another option is to use an intermediate microcontroller for the SPI/I2C/UART interfaces, talk to that microcontroller over USB serial from the Nano, and use the Nano entirely for camera work and high-level control. This is the approach I've taken with my robots: a Teensy microcontroller for I/O, and a Jetson for the higher-level planning and vision.
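A minimal sketch of the Nano side of that link, using pyserial. The port name and the line-based command protocol ("M <left> <right>" commands, newline-terminated sensor reports) are assumptions; use whatever framing your microcontroller firmware actually implements:

```python
import serial

# Teensy usually enumerates as /dev/ttyACM0 on Linux (assumed here)
port = serial.Serial("/dev/ttyACM0", 115200, timeout=0.1)

def set_motor_speeds(left, right):
    # Send one command line; the microcontroller drives the motors
    port.write(("M %d %d\n" % (left, right)).encode("ascii"))

def poll_sensors():
    # Read back one newline-terminated sensor report, if any arrived
    line = port.readline().decode("ascii", errors="replace").strip()
    return line or None

set_motor_speeds(100, 100)
while True:
    report = poll_sensors()
    if report:
        print("sensors:", report)
```

The nice property of this split is that all the hard-real-time work (PWM, encoder counting, sensor timing) lives on the microcontroller, so the Nano's Linux scheduling jitter stops mattering.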