Online Hand Gesture Recognition with Temporal Shift Module (TSM) on Jetson Nano

In our ICCV’19 paper [1], we propose the Temporal Shift Module (TSM), an efficient and lightweight operator for video recognition on edge devices. Here we show that we can build an online hand gesture recognition demo with TSM on the Jetson Nano. The model itself runs at 70 FPS on the embedded GPU using only 8 watts.
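
For readers curious about what the shift operator actually does, below is a minimal PyTorch sketch of the bidirectional (offline) temporal shift described in the paper: a fraction of the channels is shifted one step backward in time, another fraction one step forward, and the rest stay in place. The function name, the `fold_div` parameter, and the tensor layout are illustrative assumptions rather than the exact code in the repo; the online demo additionally replaces the bidirectional shift with a unidirectional shift over cached activations so it can run frame by frame.

```python
import torch

def temporal_shift(x, n_segment, fold_div=8):
    """Bidirectional temporal shift (illustrative sketch, not the repo's exact code).

    x: tensor of shape (N * T, C, H, W), where T = n_segment consecutive
       frames of each clip are stacked along the batch dimension.
    fold_div: 1/fold_div of the channels is shifted in each direction.
    """
    nt, c, h, w = x.size()
    n_batch = nt // n_segment
    x = x.view(n_batch, n_segment, c, h, w)

    fold = c // fold_div
    out = torch.zeros_like(x)
    out[:, :-1, :fold] = x[:, 1:, :fold]                   # shift these channels backward in time
    out[:, 1:, fold:2 * fold] = x[:, :-1, fold:2 * fold]   # shift these channels forward in time
    out[:, :, 2 * fold:] = x[:, :, 2 * fold:]              # leave the remaining channels in place
    return out.view(nt, c, h, w)
```

Because the shift is pure memory movement with no multiply-adds, it adds temporal modeling to a 2D backbone at essentially zero extra FLOPs, which is what makes real-time inference on the Nano feasible.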

Here is a recorded video demo:
TSM: Temporal Shift Module for Efficient Video Understanding, online demo with NVIDIA Nano - YouTube

We provide the code and tutorial of the demo here:
Code & Tutorial: temporal-shift-module/online_demo at master · mit-han-lab/temporal-shift-module · GitHub
Paper link: [1811.08383] TSM: Temporal Shift Module for Efficient Video Understanding
Project page: https://tsm-hanlab.mit.edu

Feel free to try it out!

[1] Ji Lin, Chuang Gan, and Song Han. TSM: Temporal Shift Module for Efficient Video Understanding. In ICCV’19.

This was great! I tried inference on the Nano and it worked exactly as you described. Great work indeed!

If I have a question about training, where should I send it? Thanks!

Hi,

You can ask your question on the GitHub repo or just send an email to the authors!

Best,
Ji

Hi,
I installed everything on a Nano with the Jetson SD card image r32.2.
When launching online_demo/main.py with Python 3, the following error is raised:
CUDAError: Check failed: ret == 0 (-1 vs. 0) : cuModuleLoadData(&(module_[device_id]), data_.c_str()) failed with error: CUDA_ERROR_INVALID_PTX

Please help
Thanks

I got the same error:
"CUDAError: Check failed: ret == 0 (-1 vs. 0) : cuModuleLoadData(&(module_[device_id]), data_.c_str()) failed with error: CUDA_ERROR_INVALID_PTX"
Please help
Thanks

Hi @cassquarellp,

See https://github.com/mit-han-lab/temporal-shift-module/issues/55#issuecomment-578476267 to resolve the issue.
Please make sure to remove all previously installed TVM modules first.
Have fun!
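
For context, CUDA_ERROR_INVALID_PTX generally means the PTX embedded in the compiled module was generated for a GPU architecture or CUDA toolkit version that the device's driver cannot load, which is why rebuilding TVM cleanly on the Nano itself tends to resolve it. As a hedged sanity check (the sm_53 expectation assumes a stock Jetson Nano; this snippet is not part of the demo or the linked issue), you can print what the board actually reports:

```python
import torch

# Print the GPU name and compute capability the driver reports.
# On a stock Jetson Nano this is expected to be (5, 3), i.e. sm_53;
# PTX built for a different architecture can trigger CUDA_ERROR_INVALID_PTX.
if torch.cuda.is_available():
    name = torch.cuda.get_device_name(0)
    major, minor = torch.cuda.get_device_capability(0)
    print(f"GPU: {name}, compute capability sm_{major}{minor}")
else:
    print("CUDA is not available; check the JetPack / driver installation.")
```

If the compiled model was produced on a desktop GPU with a different architecture instead of being built for the Nano, regenerating it on the device after removing the old TVM install, as suggested above, is the usual remedy.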