Google MediaPipe Real-Time Hand Tracking on Nano

I’ve managed to build MediaPipe on my Nano, and the hand tracking demo works quite well with the GPU. I haven’t yet been able to modify the examples, since Bazel is new to me.

Follow the install instructions, with the following caveats:

  1. bazel - I used 1.2.1, as recommended in the MediaPipe docs, and followed the instructions there
  2. opencv - I used my own build of OpenCV, so I followed the instructions for that (modifying WORKSPACE and opencv_linux.BUILD). Hint: new_local_repository goes in WORKSPACE and cc_library goes in opencv_linux.BUILD
  3. glog - you can use the config.guess and config.sub files linked at the bottom of issue 304 (first link) and put them in the cache as described there
  4. compiling examples - I just hacked up my /usr/include/EGL/eglplatform.h (after saving the original) to take the correct ifdef path and avoid the conflict between TFLite’s GPU Status and the X11 Status, as discussed in the MediaPipe issue tracker
  5. If I recall correctly, at some point I got an error while compiling one of the examples, so I restored my eglplatform.h to its original state. (Sorry, I didn’t take notes because I didn’t really expect this to work.)
  6. run demos - I followed the instructions in the docs rather than the readme in the examples directory, and the demos mostly ran
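
For step 2, the split between the two files looks roughly like this. This is a sketch, not my exact config: the `/usr/local` path and the library/header lists are assumptions for a typical local OpenCV 4 install, so adjust them to wherever your build actually landed.

In WORKSPACE:

```python
# WORKSPACE: point Bazel at the local OpenCV install
# (the path is an assumption; use your own install prefix)
new_local_repository(
    name = "linux_opencv",
    build_file = "@//third_party:opencv_linux.BUILD",
    path = "/usr/local",
)
```

In opencv_linux.BUILD:

```python
# opencv_linux.BUILD: expose the local install as a cc_library
# (library and include paths assume an OpenCV 4 layout)
cc_library(
    name = "opencv",
    srcs = glob([
        "lib/libopencv_core.so",
        "lib/libopencv_imgproc.so",
        "lib/libopencv_highgui.so",
        "lib/libopencv_video.so",
        "lib/libopencv_videoio.so",
        "lib/libopencv_imgcodecs.so",
    ]),
    hdrs = glob(["include/opencv4/opencv2/**/*.h*"]),
    includes = ["include/opencv4/"],
    linkstatic = 1,
    visibility = ["//visibility:public"],
)
```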
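
For step 4, the part of eglplatform.h that matters is its platform #ifdef chain, paraphrased below; the exact guard macros vary between EGL header versions, so treat this as an illustration, not the literal file contents. On a stock image the plain `__unix__` branch pulls in Xlib, whose `Status` macro collides with TFLite GPU’s `Status` type; the hack is to make the preprocessor take the generic (non-X11) branch instead.

```c
/* Sketch of the #ifdef chain in /usr/include/EGL/eglplatform.h
 * (paraphrased; guard names differ between EGL header versions) */
#if defined(_WIN32)
/* ... Windows typedefs ... */
#elif defined(__unix__) && defined(MESA_EGL_NO_X11_HEADERS)
/* generic branch: no Xlib, so no Status macro */
typedef void *EGLNativeDisplayType;
typedef void *EGLNativePixmapType;
typedef void *EGLNativeWindowType;
#elif defined(__unix__)
/* X11 branch: Xlib.h #defines Status, which clashes with
 * TFLite GPU's Status type during the MediaPipe build */
#include <X11/Xlib.h>
#include <X11/Xutil.h>
typedef Display *EGLNativeDisplayType;
/* ... */
#endif
```

On headers that have such a guard, passing `--copt -DMESA_EGL_NO_X11_HEADERS` to `bazel build` (as MediaPipe’s desktop GPU docs suggest for Linux) selects the generic branch without editing the file; whether that works depends on which EGL headers your Nano image ships.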
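
For step 6, the build and run commands look roughly like this. This is a sketch based on MediaPipe’s desktop GPU docs of that era; the target and graph names are assumptions, so check them against the `mediapipe/examples/desktop` tree in your checkout.

```shell
# Build the GPU hand tracking demo (target name assumed from
# the examples tree; drop the -D flag if your EGL header lacks it)
bazel build -c opt --copt -DMESA_EGL_NO_X11_HEADERS \
    mediapipe/examples/desktop/hand_tracking:hand_tracking_gpu

# Run it against the mobile GPU graph, logging to stderr
GLOG_logtostderr=1 \
    bazel-bin/mediapipe/examples/desktop/hand_tracking/hand_tracking_gpu \
    --calculator_graph_config_file=mediapipe/graphs/hand_tracking/hand_tracking_mobile.pbtxt
```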

Hope this helps.

1 Like