Please provide complete information as applicable to your setup.
• Hardware Platform (Jetson / GPU) Jetson Nano
• DeepStream Version 6.0.1
• JetPack Version (valid for Jetson only) 4.6.1-b110
• TensorRT Version 8.2
• Issue Type( questions, new requirements, bugs)
• How to reproduce the issue ? (This is for bugs. Including which sample app is using, the configuration files content, the command line used and other details for reproducing)
• Requirement details( This is for new requirement. Including the module name-for which plugin or for which sample application, the function description)
I have created a Dockerfile based on the 6.0.1 L4T base container. The container runs inference on a video and sends the data to the cloud via the Kafka Azure config. When I enter the container in interactive mode using sudo docker run -it --rm --net=host --runtime nvidia -w /opt/nvidia/deepstream/deepstream-6.0 -v /tmp/.X11-unix/:/tmp/.X11-unix containername, then cd into /opt/nvidia/deepstream/deepstream-6.0/deepstream_python_apps/apps/iotapp and run the Python script, it works perfectly. However, when I try to run the container on its own, without interactive mode, I get the following two errors:
1)
sudo docker run --rm --net=host --runtime nvidia -w /opt/nvidia/deepstream/deepstream-6.0/deepstream_python_apps/apps/iotapp -v /tmp/.X11-unix/:/tmp/.X11-unix containername
-c cfg_azure.txt -p libnvds_azure_proto.so -i /opt/nvidia/deepstream/deepstream-6.0/deepstream_python_apps/videos/video1.h264 --no-display
This gives me the error: "Error: gst-library-error-quark: Could not configure supporting library. (5): /dvs/git/dirty/git-master_linux/deepstream/sdk/src/gst-plugins/gst-nvmsgbroker/gstnvmsgbroker.cpp(402): legacy_gst_nvmsgbroker_start (): /GstPipeline:pipeline0/GstNvMsgBroker:nvmsg-broker:
unable to connect to broker library"
2)
sudo docker run --rm --net=host --runtime nvidia -w /opt/nvidia/deepstream/deepstream-6.0 -v /tmp/.X11-unix/:/tmp/.X11-unix containername
-c cfg_azure.txt -p libnvds_azure_proto.so -i /opt/nvidia/deepstream/deepstream-6.0/deepstream_python_apps/videos/video1.h264 --no-display
This gives me the error: "from common.is_aarch_64 import is_aarch64
ModuleNotFoundError: No module named 'common'"
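For context, the deepstream_python_apps samples resolve the common package relative to the directory the script is launched from, which is why the import succeeds in interactive mode (after cd-ing into the app folder) but not otherwise. A minimal sketch of how the script could locate common independently of the container's working directory, assuming common sits one level above the app folder as in the sample layout (the path fix itself is my assumption, not what main.py currently does):

```python
import os
import sys

# Hypothetical sketch: append the parent of this script's own directory
# to sys.path, so `from common.is_aarch_64 import is_aarch64` resolves
# no matter which WORKDIR the container starts in.
app_dir = os.path.dirname(os.path.abspath(__file__))
sys.path.append(os.path.join(app_dir, ".."))
```

With this in place, the import would no longer depend on the -w value passed to docker run.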
I am not sure what I am doing wrong.
NB:
When I run in interactive mode, my Dockerfile uses
CMD ["/bin/bash"]
WORKDIR /opt/nvidia/deepstream/deepstream-6.0
When I switch to an entrypoint for the script, I use
ENTRYPOINT ["python3", "/opt/nvidia/deepstream/deepstream-6.0/deepstream_python_apps/apps/iotapp/main.py", "-c", "-p", "-i"]
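For comparison, here is a hedged sketch of the entrypoint variant I understand the docs to suggest: WORKDIR points at the app folder and the ENTRYPOINT bakes in no flag values, so -c cfg_azure.txt -p libnvds_azure_proto.so -i ... --no-display come entirely from the docker run command line (this is my assumption about the intended setup, not something I have verified):

```dockerfile
# Hypothetical sketch: run main.py from its own directory so relative
# imports and config-file paths resolve; argument values are appended
# by `docker run ... containername -c cfg_azure.txt -p ... -i ...`.
WORKDIR /opt/nvidia/deepstream/deepstream-6.0/deepstream_python_apps/apps/iotapp
ENTRYPOINT ["python3", "main.py"]
```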