Error while running SCRFD in DeepStream

@fanzh

  1. I have tested with the docker container and still get the same error.
  2. I have already raised an issue in the GitHub repo: Inference Error Jetson xavier NX · Issue #4 · NNDam/deepstream-face-recognition (github.com)

Thanks for the update.

Is it possible to test from your end with a Jetson device?

Sorry for the late reply. Is this still a DeepStream issue to support? Thanks.
Currently I have Xavier + DS6.2.

Yes, if possible, could you test this model on your box?

In the SCRFD repo, the docker build method is for dGPU. Please refer to this code.

I have already taken care of those. Were you able to run it on Jetson?

The docker build method is for dGPU, but I notice you have tested the docker on Jetson. If you have built the docker for Jetson, could you share it? Thanks!

FROM nvcr.io/nvidia/deepstream-l4t:6.1.1-triton as ds-6.1.1
LABEL maintainer="NVIDIA CORPORATION"

# Set timezone.
ENV TZ=America/Los_Angeles

# Install required libraries
RUN apt-get update && apt-get install -y software-properties-common
RUN apt-get update && apt-get install -y --no-install-recommends \
    libcurl4-openssl-dev \
    wget \
    curl \
    zlib1g-dev \
    git \
    pkg-config \
    python3 \
    python3-pip \
    python3-dev \
    python3-wheel \
    g++ \
    libglib2.0-dev \
    libglib2.0-dev-bin \
    python-gi-dev \
    libtool \
    m4 \
    autoconf \
    automake \
    sudo \
    ssh \
    pbzip2 \
    pv \
    bzip2 \
    unzip \
    cmake \
    build-essential \
    libgstreamer1.0-dev \
    libgstreamer-plugins-base1.0-dev \
    libgstrtspserver-1.0-dev \
    libglew-dev \
    libssl-dev \
    libopencv-dev \
    freeglut3-dev \
    libjpeg-dev \
    libjson-glib-dev \
    libcairo2-dev \
    libpango1.0-dev \
    libfontconfig1-dev \
    libfreetype6-dev \
    libgtk-3-dev \
    libpng-dev \
    libgles2-mesa-dev \
    libegl1-mesa-dev \
    librabbitmq-dev

RUN pip3 install --upgrade pip
RUN pip3 install "setuptools>=41.0.0"

RUN wget https://github.com/NVIDIA-AI-IOT/deepstream_python_apps/releases/download/v1.1.4/pyds-1.1.4-py3-none-linux_aarch64.whl
RUN pip3 install pyds-1.1.4-py3-none-linux_aarch64.whl
RUN rm -rf pyds-1.1.4-py3-none-linux_aarch64.whl

Dockerfile (1.2 KB)
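For reference, a hypothetical build-and-run sequence for the Dockerfile above on a Jetson; the image tag and the display-related flags are placeholders, not from the repo:

```shell
# Build the image from the Dockerfile above (the tag name is arbitrary).
docker build -t deepstream-face:l4t-6.1.1 .

# Run with the NVIDIA container runtime so the container can access the GPU;
# the X11 mount and DISPLAY are only needed if the pipeline renders on screen.
docker run -it --rm --runtime nvidia --network host \
    -e DISPLAY=$DISPLAY \
    -v /tmp/.X11-unix:/tmp/.X11-unix \
    deepstream-face:l4t-6.1.1
```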

  1. Using this dockerfile, which is almost the same as yours, there is an error "no space left on device".
    docker-failed.txt (3.1 KB)
    Could you share your docker image?
  2. I also tried the non-docker method. When executing the cmake && make step, there was an error "gtypes.h:32:10: fatal error: glibconfig.h: No such file or directory". This is not related to DeepStream. Could you share how you fixed the compilation errors?
    build-failed.txt (1.1 KB)

@fanzh

  1. I am using a custom Lenovo SE70 box; I did not face the space issue.
  2. Please install this: sudo apt-get install libglib2.0-dev
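The glibconfig.h error typically persists even after installing libglib2.0-dev when the build hardcodes -I/usr/include/glib-2.0, because glibconfig.h is installed under the library tree, not the normal include tree. A sketch of how to check this (the paths shown are typical for Jetson/aarch64, not verified against this repo):

```shell
# glibconfig.h ships with libglib2.0-dev but lives under the lib tree:
find /usr -name glibconfig.h 2>/dev/null
# typically /usr/lib/aarch64-linux-gnu/glib-2.0/include/glibconfig.h

# pkg-config reports both include directories, so feeding its flags
# into the build avoids hardcoding either path:
pkg-config --cflags glib-2.0
```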

When I execute "LD_PRELOAD=/home/nvnv123/code/deepstream-face-recognition/plugins/nms/build/libmy_plugin.so python3 main_ff.py file:/opt/nvidia/deepstream/deepstream/samples/streams/sample_720p.mp4", there is an error: "ERROR: Cannot access ONNX file '/home/nvnv123/code/deepstream-face-recognition/src/configs/…/weights/resnet124-dynamic-simply.onnx'".
log.txt (4.2 KB)
Did you modify the code and configuration file? Where is this model?
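The "Cannot access ONNX file" error means the path in the nvinfer config does not resolve to an existing file, since relative paths in the config are resolved against the config file's own directory. A minimal sketch of that resolution, useful for checking where a config-relative path actually points (the file names below are hypothetical, not the repo's actual layout):

```python
import os

def resolve_model_path(config_file: str, relative_path: str) -> str:
    """Resolve a model path the way a config-relative loader would:
    relative to the directory containing the config file."""
    return os.path.normpath(
        os.path.join(os.path.dirname(config_file), relative_path)
    )

# Hypothetical example: a config under src/configs/ pointing one level up.
print(resolve_model_path(
    "src/configs/pgie.txt",
    "../weights/resnet124-dynamic-simply.onnx",
))  # src/weights/resnet124-dynamic-simply.onnx
```

Checking the resolved path with os.path.exists() before launching the pipeline makes this class of error visible without digging through TensorRT logs.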

I did not change anything. Are you referring to this repo: NNDam/deepstream-face-recognition: Face detection → alignment → feature extraction with deepstream (github.com)?

@fanzh any update?

Sorry for the late reply. I am referring to the same repo. There is an error about "resnet124-dynamic-simply.onnx"; how did you get this model?

Can you share the pipeline and the error?

I did not modify the code logic; I shared the log on Sep 14.

Can you please remove the SGIE and run only the face detector?

There has been no update from you for a while, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks.

After removing the SGIE, I get the same error:
ERROR: [TRT]: 2: [pluginV2DynamicExtRunner.cpp::execute::115] Error Code 2: Internal Error (Assertion status == kSTATUS_SUCCESS failed. )
There is a lot of custom code, including a custom TensorRT plugin and a modified nvinfer plugin. Since the docker also produces the same error, the author needs to debug their own code, because we are not clear on the code logic.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.