Mediapipe

Trying to dockerize MediaPipe, starting from mediapipe/Dockerfile at master · google/mediapipe · GitHub
and also from
jetson-containers/Dockerfile.tensorflow at master · dusty-nv/jetson-containers · GitHub.
The result is:

FROM nvcr.io/nvidia/l4t-base:r32.4.4
RUN apt-get update -y
RUN apt install -y wget curl git
RUN apt upgrade -y
# RUN cd ~ && mkdir bazel && cd bazel && wget https://github.com/bazelbuild/bazel/releases/download/3.4.0/bazel-3.4.0-dist.zip && sudo apt-get install build-essential openjdk-8-jdk python zip unzip && unzip bazel-3.4.0-dist.zip && env EXTRA_BAZEL_ARGS="--host_javabase=@local_jdk//:jdk" bash ./compile.sh && cp ~/bazel/output/bazel /usr/local/bin/
WORKDIR /io
WORKDIR /mediapipe
ENV DEBIAN_FRONTEND=noninteractive
ARG HDF5_DIR="/usr/lib/aarch64-linux-gnu/hdf5/serial/"
ARG MAKEFLAGS=-j8

RUN apt-get update && apt-get install -y --no-install-recommends \
        build-essential \
        ca-certificates \
        curl \
        ffmpeg \
        git \
        wget \
        unzip \
        python3-dev \
        python3-opencv \
        python3-pip \
        libopencv-core-dev \
        libopencv-highgui-dev \
        libopencv-imgproc-dev \
        libopencv-video-dev \
        libopencv-calib3d-dev \
        libopencv-features2d-dev \
        software-properties-common && \
    add-apt-repository -y ppa:openjdk-r/ppa && \
    apt-get update && apt-get install -y openjdk-8-jdk && \
    apt-get clean && \
    rm -rf /var/lib/apt/lists/*

RUN pip3 install --upgrade setuptools
RUN pip3 install wheel
RUN pip3 install future
RUN apt update -y
RUN apt-get install  libegl1-mesa-dev  libgles2-mesa-dev libhdf5-serial-dev hdf5-tools libhdf5-dev zlib1g-dev zip libjpeg8-dev liblapack-dev libblas-dev gfortran -y
RUN  pip3 install -U numpy==1.16.1 future==0.18.2 mock==3.0.5 h5py==2.10.0 keras_preprocessing==1.1.1 keras_applications==1.0.8 gast==0.2.2 futures protobuf pybind11

RUN pip3 install six
RUN pip3 install setuptools Cython wheel
RUN pip3 install numpy --verbose
RUN pip3 install h5py==2.10.0 --verbose
RUN pip3 install future==0.18.2 mock==3.0.5 h5py==2.10.0 keras_preprocessing==1.1.1 keras_applications==1.0.8 gast==0.2.2 futures protobuf pybind11 --verbose

#RUN pip3 install --pre --extra-index-url https://developer.download.nvidia.com/compute/redist/jp/v44 tensorflow
ARG TENSORFLOW_URL=https://developer.download.nvidia.com/compute/redist/jp/v44/tensorflow/tensorflow-1.15.3+nv20.9-cp36-cp36m-linux_aarch64.whl
ARG TENSORFLOW_WHL=tensorflow-1.15.3+nv20.9-cp36-cp36m-linux_aarch64.whl

RUN wget --quiet --show-progress --progress=bar:force:noscroll --no-check-certificate ${TENSORFLOW_URL} -O ${TENSORFLOW_WHL} && \
    pip3 install ${TENSORFLOW_WHL} --verbose && \
    rm ${TENSORFLOW_WHL}
RUN pip3 install tf_slim

#RUN ln -s /usr/bin/python3 /usr/bin/python

# Install bazel
ARG BAZEL_VERSION=3.7.1

RUN mkdir /bazel && cd /bazel && wget https://github.com/bazelbuild/bazel/releases/download/3.7.1/bazel-3.7.1-dist.zip && apt-get install build-essential openjdk-8-jdk python zip unzip && unzip bazel-3.7.1-dist.zip && env EXTRA_BAZEL_ARGS="--host_javabase=@local_jdk//:jdk" bash ./compile.sh && cp /bazel/output/bazel /usr/local/bin/



ENV PATH="/usr/local/cuda/bin:${PATH}"
ENV LD_LIBRARY_PATH="/usr/local/cuda/lib64:${LD_LIBRARY_PATH}"
RUN echo "$PATH" && echo "$LD_LIBRARY_PATH"

RUN pip3 install pycuda --verbose
COPY . /mediapipe/
#RUN bazel build -c opt --define MEDIAPIPE_DISABLE_GPU=1 mediapipe/examples/desktop/demo:object_detection_tensorflow_demo
RUN bazel build -c opt --copt -DMESA_EGL_NO_X11_HEADERS --copt -DEGL_NO_X11  mediapipe/examples/desktop/hand_tracking:hand_tracking_gpu --experimental_repo_remote_exec
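
For reference, a rough build-and-run sequence for this Dockerfile might look like the following (the local image tag jetson-mediapipe is just a placeholder; the build has to run on the Jetson itself so the aarch64 base image and CUDA libraries match):

# from the mediapipe checkout that contains the Dockerfile above
docker build -t jetson-mediapipe:latest .

# open a shell in the container with X11 and the Argus socket mounted,
# mirroring the run command used later in this thread
docker run -it --rm --net=host -e DISPLAY=$DISPLAY --ipc=host --privileged \
    -v /tmp/.X11-unix/:/tmp/.X11-unix/ -v /tmp/argus_socket:/tmp/argus_socket \
    --cap-add SYS_PTRACE jetson-mediapipe:latest /bin/bash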

After some testing it seems that there are issues building with Bazel against the given TensorFlow inside the container.

Thanks Andrey1984 for the details.
We will share this for those who want a similar use case.

It is quite difficult with Docker.
So far, of the many samples that work in non-dockerized form on Jetson, we have only one basic sample working from within the container.
@AastaLLL, you may try executing the single command below to see the Hello World sample on Jetson:

docker run -it --rm --net=host -e DISPLAY=$DISPLAY --ipc=host --privileged \
    -v /tmp/.X11-unix/:/tmp/.X11-unix/ -v /tmp/argus_socket:/tmp/argus_socket \
    --cap-add SYS_PTRACE iad.ocir.io/idso6d7wodhe/jetson-mediapipe:latest \
    /bin/bash -c 'GLOG_logtostderr=1 bazel run --copt -DMESA_EGL_NO_X11_HEADERS --copt -DEGL_NO_X11 mediapipe/examples/desktop/hello_world:hello_world --experimental_repo_remote_exec'

Reference issues:

A basic sample will help someone with the same interest save a lot of time.
Thanks.

@AastaLLL
A dockerized or a non-dockerized sample?
For the dockerized case there is the single-command execution of the Hello World example mentioned previously:

docker run -it --rm --net=host -e DISPLAY=$DISPLAY --ipc=host --privileged \
    -v /tmp/.X11-unix/:/tmp/.X11-unix/ -v /tmp/argus_socket:/tmp/argus_socket \
    --cap-add SYS_PTRACE iad.ocir.io/idso6d7wodhe/jetson-mediapipe:latest \
    /bin/bash -c 'GLOG_logtostderr=1 bazel run --copt -DMESA_EGL_NO_X11_HEADERS --copt -DEGL_NO_X11 mediapipe/examples/desktop/hello_world:hello_world --experimental_repo_remote_exec'

However, for the non-dockerized use case the entire MediaPipe environment needs to be set up as per the instruction steps from the GitHub repository; a rough outline is sketched below.
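
A sketch of that native setup, assuming the OpenCV packages and Bazel version from the Dockerfile above are used (exact package and target names may differ between MediaPipe versions):

# clone the sources and install the OpenCV development packages used in the Dockerfile above
git clone https://github.com/google/mediapipe.git && cd mediapipe
sudo apt-get install -y libopencv-core-dev libopencv-highgui-dev libopencv-imgproc-dev \
    libopencv-video-dev libopencv-calib3d-dev libopencv-features2d-dev

# Bazel 3.x is also required; on aarch64 it usually has to be compiled from the dist zip as shown above

# sanity check: build and run the Hello World example natively
GLOG_logtostderr=1 bazel run --copt -DMESA_EGL_NO_X11_HEADERS --copt -DEGL_NO_X11 \
    mediapipe/examples/desktop/hello_world:hello_world --experimental_repo_remote_exec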

Hi,

Do you mean the official MediaPipe repository?

Thanks.

@AastaLLL
Thank you for following up.
Right now the documentation is available through a community-driven repository of MediaPipe:

However, I will try to see if we could somehow add a reference link to the main repository above.
The latter URL includes some Jetson-specific patches that are not addressed in the main repository.

Hi,

Thanks for the information.
Looking forward to seeing Jetson support in the official repository.

Thanks.


I tried to set the full path for cuda.h and cuda_runtime.h, but I'm still getting glog errors:

ERROR: /home/nvidia/mediapipe/mediapipe/examples/desktop/BUILD:59:11: C++ compilation of rule '//mediapipe/examples/desktop:demo_run_graph_main_gpu' failed (Exit 1)
In file included from external/jetson_utils/include/jetson-utils/timespec.h:30:0,
from external/jetson_utils/include/jetson-utils/Event.h:27,
from external/jetson_utils/include/jetson-utils/gstCamera.h:29,
from mediapipe/examples/desktop/demo_run_graph_main_gpu.cc:50:
mediapipe/examples/desktop/demo_run_graph_main_gpu.cc: In function 'mediapipe::Status RunMPPGraph()':
bazel-out/aarch64-opt/bin/external/com_github_glog_glog/_virtual_includes/default_glog_headers/glog/logging.h:399:41: error: expected unqualified-id before 'if'
#define COMPACT_GOOGLE_LOG_INFO google::LogMessage(
^
bazel-out/aarch64-opt/bin/external/com_github_glog_glog/_virtual_includes/default_glog_headers/glog/logging.h:510:23: note: in expansion of macro 'COMPACT_GOOGLE_LOG_INFO'
#define LOG(severity) COMPACT_GOOGLE_LOG_ ## severity.stream()
^~~~~~~~~~~~~~~~~~~
mediapipe/examples/desktop/demo_run_graph_main_gpu.cc:341:3: note: in expansion of macro 'LOG'
LOG(INFO) << "Initialize the calculator graph.";
^
bazel-out/aarch64-opt/bin/external/com_github_glog_glog/_virtual_includes/default_glog_headers/glog/logging.h:399:41: error: expected unqualified-id before 'if'
#define COMPACT_GOOGLE_LOG_INFO google::LogMessage(
^
bazel-out/aarch64-opt/bin/external/com_github_glog_glog/_virtual_includes/default_glog_headers/glog/logging.h:510:23: note: in expansion of macro 'COMPACT_GOOGLE_LOG_INFO'
#define LOG(severity) COMPACT_GOOGLE_LOG_ ## severity.stream()
^~~~~~~~~~~~~~~~~~~
mediapipe/examples/desktop/demo_run_graph_main_gpu.cc:354:3: note: in expansion of macro 'LOG'
LOG(INFO) << "Initialize the GPU.";
^
bazel-out/aarch64-opt/bin/external/com_github_glog_glog/_virtual_includes/default_glog_headers/glog/logging.h:399:41: error: expected unqualified-id before 'if'
#define COMPACT_GOOGLE_LOG_INFO google::LogMessage(
^
bazel-out/aarch64-opt/bin/external/com_github_glog_glog/_virtual_includes/default_glog_headers/glog/logging.h:510:23: note: in expansion of macro 'COMPACT_GOOGLE_LOG_INFO'
#define LOG(severity) COMPACT_GOOGLE_LOG_ ## severity.stream()
^~~~~~~~~~~~~~~~~~~
mediapipe/examples/desktop/demo_run_graph_main_gpu.cc:365:3: note: in expansion of macro 'LOG'
LOG(INFO) << "Initialize the camera or load the video.";
^
bazel-out/aarch64-opt/bin/external/com_github_glog_glog/_virtual_includes/default_glog_headers/glog/logging.h:399:41: error: expected unqualified-id before 'if'
#define COMPACT_GOOGLE_LOG_INFO google::LogMessage(
^
bazel-out/aarch64-opt/bin/external/com_github_glog_glog/_virtual_includes/default_glog_headers/glog/logging.h:510:23: note: in expansion of macro 'COMPACT_GOOGLE_LOG_INFO'
#define LOG(severity) COMPACT_GOOGLE_LOG_ ## severity.stream()
^~~~~~~~~~~~~~~~~~~
mediapipe/examples/desktop/demo_run_graph_main_gpu.cc:385:3: note: in expansion of macro 'LOG'
LOG(INFO) << "Start running the calculator graph.";
^
bazel-out/aarch64-opt/bin/external/com_github_glog_glog/_virtual_includes/default_glog_headers/glog/logging.h:399:41: error: expected unqualified-id before 'if'
#define COMPACT_GOOGLE_LOG_INFO google::LogMessage(
^
bazel-out/aarch64-opt/bin/external/com_github_glog_glog/_virtual_includes/default_glog_headers/glog/logging.h:510:23: note: in expansion of macro 'COMPACT_GOOGLE_LOG_INFO'
#define LOG(severity) COMPACT_GOOGLE_LOG_ ## severity.stream()
^~~~~~~~~~~~~~~~~~~
mediapipe/examples/desktop/demo_run_graph_main_gpu.cc:411:3: note: in expansion of macro 'LOG'
LOG(INFO) << "Start grabbing and processing frames.";
^
bazel-out/aarch64-opt/bin/external/com_github_glog_glog/_virtual_includes/default_glog_headers/glog/logging.h:399:41: error: expected unqualified-id before 'if'
#define COMPACT_GOOGLE_LOG_INFO google::LogMessage(
^
bazel-out/aarch64-opt/bin/external/com_github_glog_glog/_virtual_includes/default_glog_headers/glog/logging.h:510:23: note: in expansion of macro 'COMPACT_GOOGLE_LOG_INFO'
#define LOG(severity) COMPACT_GOOGLE_LOG_ ## severity.stream()
^~~~~~~~~~~~~~~~~~~
mediapipe/examples/desktop/demo_run_graph_main_gpu.cc:604:3: note: in expansion of macro 'LOG'
LOG(INFO) << "Shutting down.";
^
mediapipe/examples/desktop/demo_run_graph_main_gpu.cc: In function 'int main(int, char**)':
bazel-out/aarch64-opt/bin/external/com_github_glog_glog/_virtual_includes/default_glog_headers/glog/logging.h:419:42: error: expected unqualified-id before 'if'
#define COMPACT_GOOGLE_LOG_ERROR google::LogMessage(
^
bazel-out/aarch64-opt/bin/external/com_github_glog_glog/_virtual_includes/default_glog_headers/glog/logging.h:510:23: note: in expansion of macro 'COMPACT_GOOGLE_LOG_ERROR'
#define LOG(severity) COMPACT_GOOGLE_LOG_ ## severity.stream()
^~~~~~~~~~~~~~~~~~~
mediapipe/examples/desktop/demo_run_graph_main_gpu.cc:615:5: note: in expansion of macro 'LOG'
LOG(ERROR) << "Failed to run the graph: " << run_status.message();
^
bazel-out/aarch64-opt/bin/external/com_github_glog_glog/_virtual_includes/default_glog_headers/glog/logging.h:399:41: error: expected unqualified-id before 'if'
#define COMPACT_GOOGLE_LOG_INFO google::LogMessage(
^
bazel-out/aarch64-opt/bin/external/com_github_glog_glog/_virtual_includes/default_glog_headers/glog/logging.h:510:23: note: in expansion of macro 'COMPACT_GOOGLE_LOG_INFO'
#define LOG(severity) COMPACT_GOOGLE_LOG_ ## severity.stream()
^~~~~~~~~~~~~~~~~~~
mediapipe/examples/desktop/demo_run_graph_main_gpu.cc:618:5: note: in expansion of macro 'LOG'
LOG(INFO) << "Success!";
^
Target //mediapipe/examples/desktop/iris_tracking:iris_tracking_gpu failed to build
Use --verbose_failures to see the command lines of failed build steps.
INFO: Elapsed time: 17.757s, Critical Path: 16.91s
INFO: 0 processes.
FAILED: Build did NOT complete successfully

I have a question for the moderators: is it possible to run anything other than the examples given in MediaPipe? I was able to compile my personal code using the MediaPipe hands API; is it possible to run it on the Jetson Nano?

I'm able to use the Jetson Nano to run my own programs using MediaPipe. Of course, my code is based on the examples they provide and is C++ based, and that runs more or less out of the box using the GPU. Getting the Python examples to run on the GPU instead of the CPU requires some work modifying build configs; I tried briefly but without much success.

Thanks for the prompt reply; I was able to build MediaPipe using Bazelisk. Now when I run my Python file, it shows "No module named mediapipe".
I have created an open issue on the MediaPipe GitHub page explaining my problem in detail.
Please check it out:

https://github.com/google/mediapipe/issues/1982

It seems that you might need to install the MediaPipe Python module.
However, I haven't used Python with it yet; it seems to be an installable option, though I haven't had a chance to try it myself.
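
For reference, a rough sketch of building and installing the Python package from a MediaPipe source checkout on aarch64, following the community aarch64 guide linked elsewhere in this thread (the exact setup.py steps may differ between MediaPipe versions):

# inside the mediapipe source checkout, assuming bazel and the build dependencies are already installed
pip3 install wheel
python3 setup.py gen_protos       # generate the protobuf sources used by the Python package
python3 setup.py bdist_wheel      # build the aarch64 wheel
pip3 install dist/mediapipe-*.whl
python3 -c "import mediapipe"     # should no longer fail with "No module named mediapipe"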

Hey,
I managed to successfully build MediaPipe for GPU hand tracking in Python.
I had to mix the answers found in Python hand landmark tracking fails with GPU · Issue #1273 · google/mediapipe · GitHub and Mediapipe python interface is not working on GPU · Issue #1651 · google/mediapipe · GitHub to make the Python interface work on the GPU, together with GitHub - jiuqiant/mediapipe_python_aarch64 for building MediaPipe for the Jetson AGX Xavier.
I detailed all the steps I took in the linked markdown file MEDIAPIPE_GPU_HAND.md. I hope it helps someone avoid wasting countless hours making it work like I did ;) MEDIAPIPE_GPU_HAND.md (5.1 KB)
I wondered whether someone has actually managed to use TensorRT instead of the default TensorFlow Lite for even faster inference. I found this issue on GitHub: Tensorrt engine instead of tflite for faster inference on nvidia xavier. · Issue #723 · google/mediapipe · GitHub, but I couldn't make the proposed solution work.


MediaPipe is working fine on my Nano with images and videos, but it won't work with my CSI camera. I tried streaming using OpenCV and jetson.utils, but it won't work. Any help?

@ahmedgr914
you may try this patch

 diff -u /home/nvidia/mediapipe/mediapipe/examples/desktop/orig/demo_run_graph_main_gpu.cc /home/nvidia/mediapipe/mediapipe/examples/desktop/demo_run_graph_main_gpu.cc
--- /home/nvidia/mediapipe/mediapipe/examples/desktop/orig/demo_run_graph_main_gpu.cc	2021-04-21 16:31:39.941459722 -0400
+++ /home/nvidia/mediapipe/mediapipe/examples/desktop/demo_run_graph_main_gpu.cc	2021-04-21 15:03:30.674384288 -0400
@@ -68,10 +68,11 @@
   LOG(INFO) << "Initialize the camera or load the video.";
   cv::VideoCapture capture;
   const bool load_video = !absl::GetFlag(FLAGS_input_video_path).empty();
-  if (load_video) {
+  const char* gst =  "nvarguscamerasrc ! video/x-raw(memory:NVMM), width=1280, height=720, framerate=30/1 !  nvvidconv ! video/x-raw,format=BGRx ! videoconvert ! video/x-raw, format=BGR ! appsink";
+  if (load_video) {
     capture.open(absl::GetFlag(FLAGS_input_video_path));
   } else {
-    capture.open(0);
+    capture.open(gst, cv::CAP_GSTREAMER);
   }
   RET_CHECK(capture.isOpened());
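
After changing demo_run_graph_main_gpu.cc the example has to be rebuilt; for instance, reusing the build flags from earlier in this thread (substitute the example target you actually use):

# rebuild the example with the same flags used earlier in this thread
bazel build -c opt --copt -DMESA_EGL_NO_X11_HEADERS --copt -DEGL_NO_X11 \
    mediapipe/examples/desktop/hand_tracking:hand_tracking_gpu --experimental_repo_remote_exec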

@rohitcompany12

You may also refer to GitHub - jiuqiant/mediapipe_python_aarch64 or to MediaPipe in Python - mediapipe as a reference.

After creating the new demo_run_graph_main_gpu.cc I should redo the installation, right?

Were you able to run with the CSI camera after redoing the build of the example?

Thank you for this. I had looked at a lot of these issue pages, but a lot of the information was scattered all over the place, so I must have missed a step somewhere when I tried it before.
