Google MediaPipe Real-Time Hand Tracking on Nano

I’ve managed to build MediaPipe on my Nano, and the hand tracking demo works quite well with the GPU. I haven’t yet been able to modify the examples, since Bazel is new to me.

Follow https://github.com/google/mediapipe/blob/master/mediapipe/docs/install.md#installing-on-debian-and-ubuntu, with the following caveats:

  1. Bazel - I used 1.2.1, as recommended in the MediaPipe docs, and followed https://docs.bazel.build/versions/master/install-compile-source.html#bootstrap-bazel
  2. I used my own build of OpenCV, so I followed the instructions for that (modifying WORKSPACE and opencv_linux.BUILD). Hint: new_local_repository goes in WORKSPACE and cc_library goes in opencv_linux.BUILD.
  3. glog - https://github.com/google/mediapipe/issues/304 - you can use the config.guess and config.sub linked at the bottom of https://github.com/google/mediapipe/issues/470 and put them in the Bazel cache as described in issue 304 (the first link).
  4. Compiling examples - I just hacked up my /usr/include/EGL/eglplatform.h (after saving the original) to take the correct #ifdef path and avoid the conflict between TFLite’s GPU Status and the X11 Status, as discussed in https://github.com/google/mediapipe/issues/305
  5. If I recall correctly, at some point I got an error while compiling one of the examples, and I restored my eglplatform.h to its original state. (Sorry, I didn’t take notes because I didn’t really expect this to work.)
  6. Run demos - I followed https://github.com/google/mediapipe/blob/master/mediapipe/docs/examples.md rather than the README in the examples directory, and they mostly ran.
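For step 2, here’s roughly what my OpenCV wiring looked like. Treat the paths as assumptions — they depend on where your OpenCV build installed itself (mine went under /usr/local with OpenCV 4 headers in include/opencv4); adjust the globs to match your layout:

```python
# WORKSPACE -- tell mediapipe where the local OpenCV lives
# (path "/usr/local" is an assumption; use your install prefix)
new_local_repository(
    name = "linux_opencv",
    build_file = "@//third_party:opencv_linux.BUILD",
    path = "/usr/local",
)

# third_party/opencv_linux.BUILD -- wrap the installed libs/headers in a cc_library
cc_library(
    name = "opencv",
    srcs = glob(["lib/libopencv_*.so*"]),
    hdrs = glob(["include/opencv4/opencv2/**/*.h*"]),
    includes = ["include/opencv4"],
    linkstatic = 1,
    visibility = ["//visibility:public"],
)
```

If your libraries ended up in lib/aarch64-linux-gnu instead of lib, change the srcs glob accordingly.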
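For step 3, the annoying part is that Bazel unpacks glog into a cache directory whose path contains a per-machine hash, so you can’t hardcode it. A sketch of how I’d locate it and drop in the updated config.guess/config.sub (assumes you already downloaded those two files into the current directory; the cache location is the default ~/.cache/bazel):

```shell
# Find bazel's unpacked glog tree (the hash segment of the path varies
# per machine, so search for it rather than hardcoding the path).
GLOG_DIR=$(find "$HOME/.cache/bazel" -type d -name com_github_glog_glog 2>/dev/null | head -n1)
echo "glog cache dir: ${GLOG_DIR:-not found}"

# Copy the updated autotools scripts in, if the cache dir was found.
if [ -n "$GLOG_DIR" ]; then
  cp config.guess config.sub "$GLOG_DIR/"
fi
```

You need to have run the build once (and had it fail) before the cache directory exists.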

Hope this helps.
