Hello AI World - Meet Jetson Nano
Date: Thursday, May 2, 2019
Time: 10:00 – 11:00 a.m. PT
Host: Dustin Franklin, Jetson Developer Evangelist, NVIDIA
Find out more about the hardware and software behind Jetson Nano, and see how you can create and deploy your own deep learning models along with building autonomous robots and smart devices powered by AI. You’ll discover new features and various tips & tricks for using the devkit.
By attending this webinar, you’ll learn how to:
- Use popular machine learning frameworks such as TensorFlow, PyTorch, Caffe, and MXNet for running a wide variety of deep neural network models. These enable tasks like image recognition and object detection, accelerated with support from NVIDIA JetPack, cuDNN, and TensorRT.
- Develop in a full desktop Ubuntu environment with popular programming languages and libraries like Python, C++, CUDA-X, OpenGL, and ROS (Robot Operating System) on Jetson Nano.
- Benchmark the latest compute platforms for AI inferencing performance and get started with deep learning tutorials like Hello AI World and Two Days to a Demo.
- Connect your Nano with off-the-shelf sensors, cameras, and peripherals for streaming live HD video and controlling actuators with GPIO.
- Build reference robotics kits, including NVIDIA JetBot and Kaya.
AI for Makers - Learn with JetBot
Date: Thursday, May 16, 2019
Time: 10:00 – 11:00 a.m. PT
Host: John Welsh, Developer Technology Engineer, NVIDIA
Want to take your next project to a whole new level with AI? JetBot is an open source DIY robotics kit that demonstrates how easy it is to use Jetson Nano to build new AI projects. See JetBot in action and get started creating your own AI-powered robots and autonomous machines.
Join the webinar to explore:
- New Jetson features like the Python GPIO library
- The anatomy of a typical AI project using Jetson Nano
- How to train a neural network for a custom task, like avoiding collisions with JetBot
- How to easily perform real-time object detection with Jetson Nano
- How to move beyond JetBot and create your own project
Will recordings be available afterward?
Hi samarin, yes the recordings will become available through the same links as above roughly one day after, and the slides will be posted directly afterwards.
Thanks to everyone who joined us today! The recording will be available for streaming On-Demand shortly.
And here’s a link to the slides — https://github.com/dusty-nv/jetson-presentations/raw/master/20190502_Jetson_Nano_Hello_AI_World.pdf
Will there be a board on this forum specifically for Jetbot questions, or should they be posted as github issues?
In other words, is there a place on this forum for total noob questions? Thanks…
Hi there, the JetBot devs will see your questions faster if you post them as issues on the JetBot GitHub, thanks.
The 160 degree cameras are getting hard to find on Amazon. I managed to find an imx219 based camera module from Waveshare on Amazon (instead of MakerFocus). It says it works with the Pi Camera Board V2. Am I probably fine with it?
I bought the Waveshare IMX219 Camera Module listed on Amazon and it works just fine.
[ 3.385454] imx219 6-0010: tegracam sensor driver:imx219_v2.0.6
[ 4.957614] vi 54080000.vi: subdev imx219 6-0010 bound
The module from Waveshare just replaces the non-wide-angle component of the full camera as shown in this image https://www.waveshare.com/img/devkit/accBoard/IMX219-D160/IMX219-D160-install.jpg
The only other cheap alternative is the LI-IMX219-MIPI-FF-NANO, which is a complete unit, i.e. you don’t also need an original Raspberry Pi camera module. However, any time I’ve looked, the 145° variant has been out of stock.
Thanks for making sure, George. As it happens, I have a tested v2 board lying around with the stock camera module on it, so I think I’m good to go as far as parts. I just have to find the nearest makerspace to print the STL.
Thanks for the confirmation!
I wasn’t able to access a 3D printer at the local makerspace on Saturday, since the one that didn’t require reservations shut down, so I decided to cannibalize an old SNES-shaped Pi enclosure that happens to fit a Nano inside. I had to zip-tie the battery in place underneath, but everything fits and the wheels are attached. Today has been spent brushing up on my soldering skills, practicing, and mounting all the pieces.
After much soldering, all the electrical connections are there, with one exception: to save money, instead of ordering a new PiOLED display, I decided to use a spare one I already had. The issue is that it lacks what I now see is a header included with the version on the JetBot bill of materials… but it does have VCC, GND, SCL, and SDA. I guess I have to bridge the connections as shown in this image of step 12:
My JetBot is as follows:
*redacted image, as it accidentally showed the MAC address of the Intel NIC.
My questions are:
1. I am pretty sure I get what is supposed to be connected where, but all the photos of the actual connections to the Nano seem to be obscured. I gather 3V goes to 3V and GND to GND, but which pins would SCL and SDA go to on the Nano (yellow and orange, respectively)? I don’t want to break anything. My soldering is decent, but my knowledge of electrical engineering is… basic.
2. The resolution and screen layout of my OLED display are probably different. I’m not expecting it to display properly without tweaking the source, but I won’t actually damage anything by trying to drive a display at an unsupported resolution, will I? Here is the exact OLED display I have.
For connecting the OLED Display, the best bet is to look at the pinout on the J41 header where the OLED is connected.
The following link has a pretty good diagram of the pinout.
However, the JetBot image runs code at startup that looks for the PiOLED at I2C address 0x3C to display the IP address and other data. If the OLED you are using does not have the same I2C address as the PiOLED, you will need to change this, or it will complain about the display not being connected.
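If you’re not sure which address your display answers at, running sudo i2cdetect -y -r 1 on the Nano prints a grid of the bus. Here’s a small helper (a sketch; bus 1 and the PiOLED’s 0x3C are assumptions) that parses that grid and returns the addresses that responded:

```python
def detected_addresses(i2cdetect_output):
    """Return the responding I2C addresses from i2cdetect's grid output.

    i2cdetect prints the address itself (e.g. '3c') at responding slots,
    '--' for empty slots, and 'UU' for addresses claimed by a kernel driver.
    """
    addrs = []
    for line in i2cdetect_output.strip().splitlines()[1:]:  # skip header row
        _, _, cells = line.partition(":")
        for cell in cells.split():
            if cell not in ("--", "UU"):
                addrs.append(int(cell, 16))
    return addrs

PIOLED_ADDRESS = 0x3C  # address the JetBot stats script expects

# Example grid, as if only a display at 0x3C were attached:
grid = """     0  1  2  3  4  5  6  7  8  9  a  b  c  d  e  f
00:          -- -- -- -- -- -- -- -- -- -- -- -- --
10: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
20: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
30: -- -- -- -- -- -- -- -- -- -- -- -- 3c -- -- --
"""
print(detected_addresses(grid))                    # [60], i.e. 0x3C
print(PIOLED_ADDRESS in detected_addresses(grid))  # True
```

If the only address you see is something other than 0x3C (0x3D is common on SSD1306 clones), that’s the value the stats script would need.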
Hi mdegans - the I2C pins are the ones right above the 3V output pin.
You can see the pinout for the GPIO header here on JetsonHacks. And you can fairly easily match up the image shown there with the silkscreening on the board itself.
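As a quick reference, here are the four header pins the PiOLED wiring uses, per the standard 40-pin layout (double-check against the silkscreen on your own board before connecting anything):

```python
# The four J41 header pins used for a PiOLED-style I2C display
# (standard 40-pin layout; verify against your board's silkscreen).
J41_OLED_PINS = {
    1: "3.3V -> VCC",
    3: "SDA (I2C bus 1) -> SDA",
    5: "SCL (I2C bus 1) -> SCL",
    6: "GND -> GND",
}

for pin, role in sorted(J41_OLED_PINS.items()):
    print(f"pin {pin}: {role}")
```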
As to the OLED - I’m impressed, in a bad way :) There are absolutely no details on the Amazon page about datasheets or libraries. But clearly others have got it to work, so most probably it’s an SSD1306-based device (like this one here that Adafruit demonstrates working with a Pi).
Even if it’s not the expected kind of OLED device, I don’t think you’ll break it if you wire it up properly (i.e. SDA, SCL, GND and VCC to the appropriate header pins). The worst that will probably happen is that it simply won’t work.
As always be careful wiring things up - you don’t want magic smoke escaping suddenly! :) But don’t be so cautious that it puts you off trying at all. Good luck and have fun.
As always, there’s no warranty provided with this advice.
Thanks. I was afraid of something like that.
1. Will it boot, or will it get stuck or something, so I’ll need to modify the files using a microSD reader? Never mind. I’m going to follow the “create from scratch” instructions and set up SSH.
2. Location of the file and line number to change the I2C address?
Understood. So far, no magic smoke ;)
Thanks for your help! I know where to come if I have any more questions.
The bot should boot fine without the PiOLED. You’ll just see errors regarding the I2C bus, since there is a script that is looking for the PiOLED. I am not sure if building from scratch will resolve the issue, though.
I am not sure where the code is that controls the OLED display at startup, but I am fairly certain it is a startup script somewhere. Perhaps the NVIDIA folks could provide some insight into this. It would be handy to have for those looking to use an alternate OLED, or to use the display for something else. Perhaps file an enhancement request for this.
Thanks. I think I’m managing :)
I found it after following the from scratch instructions. create_stats_service.py and create_jupyter_service.py create .service files for systemd.
systemd then runs:
- python3 -m jetbot.apps.stats (for the display), and
- jupyter lab --ip=0.0.0.0 --no-browser, which appears to be the Jupyter instance for control.
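For reference, a generated unit might look roughly like this sketch (the paths, user, and filename are assumptions; check the actual files the scripts write under /etc/systemd/system):

```ini
# /etc/systemd/system/jetbot_stats.service (sketch; generated paths will differ)
[Unit]
Description=JetBot stats display service

[Service]
Type=simple
User=jetbot
ExecStart=/usr/bin/python3 -m jetbot.apps.stats
Restart=always

[Install]
WantedBy=multi-user.target
```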
disp = Adafruit_SSD1306.SSD1306_128_32(rst=None, i2c_bus=1, gpio=1) # setting gpio to 1 is hack to avoid platform detection
So it’s this file (and hopefully only this file) that will need to be modified on my install. I haven’t read through all the source yet.
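For what it’s worth, the change for an alternate display might look something like this sketch (the SSD1306_128_64 class and the 0x3D address are assumptions for a different panel; the i2c_address keyword is part of the Adafruit_SSD1306 library’s constructor, but verify against the library version on the image):

```diff
 # jetbot/apps/stats.py (sketch of a possible one-line change)
-disp = Adafruit_SSD1306.SSD1306_128_32(rst=None, i2c_bus=1, gpio=1)
+disp = Adafruit_SSD1306.SSD1306_128_64(rst=None, i2c_bus=1, gpio=1, i2c_address=0x3D)
```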
● jetbot_jupyter.service - Jupyter Notebook Service
   Loaded: loaded (/etc/systemd/system/jetbot_jupyter.service; disabled; vendor preset: enabled)
   Active: active (running) since Mon 2019-05-13 15:51:09 PDT; 10s ago
 Main PID: 13789 (jupyter-lab)
    Tasks: 1 (limit: 4190)
   CGroup: /system.slice/jetbot_jupyter.service
           └─13789 /home/mario/kart/venv/bin/python3 /home/mario/kart/venv/bin/jupyter-lab --ip=0.0.0.0 --no-browser
I can connect and open the notebooks and everything. I haven’t started the stats ‘daemon’ yet.
For people wondering whether another OLED display works: yes, it may (it did in my case), but it may not physically fit in your enclosure without modification if you 3D printed it.
● jetbot_stats.service - JetBot stats display service
   Loaded: loaded (/etc/systemd/system/jetbot_stats.service; disabled; vendor preset: enabled)
   Active: active (running) since Mon 2019-05-13 17:20:57 PDT; 23s ago
 Main PID: 6171 (python3)
    Tasks: 1 (limit: 4190)
   CGroup: /system.slice/jetbot_stats.service
           └─6171 python3 -m jetbot.apps.stats
May 13 17:20:57 snesbot systemd: Started JetBot stats display service.