Hello everybody ~! 😁
We would like to share a project we recently built using a Jetson Nano with external devices.
In this project, we turned the Jetson Nano into a presentation rehearsal partner: data is collected via the JetBot, and the results are computed via cloud services.
Moreover, our Jetson Nano displays near real-time feedback using colored LEDs during the presentation.
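To give a concrete picture of that loop, here is a minimal sketch (not our exact code): it assumes frames come from the camera, a hypothetical cloud endpoint (`ANALYZE_URL`) returns a feedback label, and three LEDs are wired to board pins 11, 13, and 15 (the endpoint, labels, and pin numbers are all illustrative).

```python
import cv2
import requests
import Jetson.GPIO as GPIO

ANALYZE_URL = "https://example.com/analyze"         # placeholder, not a real endpoint
LED_PINS = {"good": 11, "neutral": 13, "poor": 15}  # hypothetical wiring

GPIO.setmode(GPIO.BOARD)
for pin in LED_PINS.values():
    GPIO.setup(pin, GPIO.OUT, initial=GPIO.LOW)

def show_feedback(label):
    """Light exactly one LED matching the current feedback label."""
    for name, pin in LED_PINS.items():
        GPIO.output(pin, GPIO.HIGH if name == label else GPIO.LOW)

camera = cv2.VideoCapture(0)  # a JetBot CSI camera may need a GStreamer pipeline instead
try:
    while True:
        ok, frame = camera.read()
        if not ok:
            break
        _, jpeg = cv2.imencode(".jpg", frame)
        # Offload the heavy analysis (gesture, eye contact, expression) to the cloud.
        resp = requests.post(
            ANALYZE_URL,
            data=jpeg.tobytes(),
            headers={"Content-Type": "image/jpeg"},
        )
        show_feedback(resp.json().get("feedback", "neutral"))
finally:
    camera.release()
    GPIO.cleanup()
```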
The purpose of this project is to help students rehearse their presentations by recognizing their body language and providing near real-time feedback and summary reports. Our contributions are the following:
- collect a gesture dataset covering 18 gestures and create a model for gesture classification (a minimal sketch follows this list).
- apply an existing model and an API for eye-contact detection and facial-expression classification, respectively.
- provide near real-time feedback displayed to the user while they present in front of the Jetson Nano.
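As an illustration of the gesture classifier (a simplified sketch, not our exact model): it assumes 2-D pose keypoints as input, e.g. from a pose estimator such as trt_pose; the layer sizes, keypoint count, and 18-class output are illustrative.

```python
import torch
import torch.nn as nn

NUM_KEYPOINTS = 18  # e.g. a COCO-style skeleton; illustrative
NUM_GESTURES = 18   # one class per gesture in the dataset

class GestureClassifier(nn.Module):
    """Tiny MLP over flattened (x, y) pose keypoints; sizes are illustrative."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(NUM_KEYPOINTS * 2, 128),
            nn.ReLU(),
            nn.Linear(128, 64),
            nn.ReLU(),
            nn.Linear(64, NUM_GESTURES),
        )

    def forward(self, keypoints):
        # keypoints: (batch, NUM_KEYPOINTS, 2) -> (batch, NUM_GESTURES) logits
        return self.net(keypoints.flatten(start_dim=1))

model = GestureClassifier().eval()
with torch.no_grad():
    dummy = torch.rand(1, NUM_KEYPOINTS, 2)         # one frame of keypoints
    gesture_id = model(dummy).argmax(dim=1).item()  # predicted gesture class
```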
We also provide a showcase video, the source code on GitHub, and slides for more details:
- Showcase Video
- Source Code
- Slides
Feel free to share any comments with us. 🙏
Best,
Patara and Sakonporn.