CMake cannot find VPI after Isaac ROS 2.1 upgrade

Hello,

I have been using Isaac ROS for many months now (via the Isaac ROS Docker container, launched through isaac_ros_common/scripts/run_dev.sh on an Orin Nano 8GB developer kit) and things were working well. However, I recently downloaded a package I hadn’t used before (isaac_ros_visual_slam), and colcon failed to build it due to errors about unrecognised/incorrect VPI functions in the C++ files. This led me to learn that many of the packages now use new functions from VPI 2.3 as part of Isaac ROS 2.1. To try to fix this issue I:

  • Upgraded my Orin Nano’s Jetpack version to 5.1.2 using these instructions: How to Install JetPack :: NVIDIA JetPack Documentation
  • This resulted in VPI 2.3.9 being successfully installed on my Orin Nano (I think?): dpkg -l shows vpi2-demos, vpi2-dev, vpi2-samples, libnvvpi2, python3.8-vpi2 and python3.9-vpi2, all at version 2.3.9.
  • Cleared my Docker storage using “docker system prune -a -f” (as I had not rebuilt the Docker image since before Isaac ROS 2.0), then set up Docker Engine again (Compute Setup — isaac_ros_docs documentation) and the NVIDIA Container Toolkit (Installing the NVIDIA Container Toolkit — NVIDIA Container Toolkit 1.14.3 documentation), because nvidia-container-runtime was deprecated (GitHub - NVIDIA/nvidia-container-runtime: NVIDIA container runtime)
  • Re-downloaded the Isaac ROS repositories that I wanted to use (isaac_ros_common, isaac_ros_nitros, isaac_ros_image_pipeline) into my workspace volume so I would be using the Isaac ROS 2.1 versions
  • Rebuilt the aarch64.ros2_humble Docker image using isaac_ros_common/scripts/run_dev.sh, which seemed to work well and to have GPU access (CMake could find CUDA and TensorRT, and Python libraries such as torch could find CUDA)
  • Confirmed that the volume mount docker args for VPI in run_dev.sh worked: /opt/nvidia/vpi2 and /usr/share/vpi2 are successfully mounted into the container from the host operating system and contain VPI files labelled as being for version 2.3.9
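As a sanity check on the steps above, here is a minimal sketch (the dpkg package name and mount paths are taken from my notes above) that can be run once on the host and once inside the run_dev.sh container to compare VPI versions:

```shell
# Query dpkg for the installed VPI runtime version; run this on the
# Jetson host and again inside the run_dev.sh container -- the two
# results should match (2.3.9 after the JetPack 5.1.2 upgrade).
vpi_version() {
  dpkg -l libnvvpi2 2>/dev/null | awk '/^ii/ {print $3}'
}
echo "libnvvpi2 version: $(vpi_version)"

# The mounted VPI tree should also carry a matching shared library:
ls /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.* 2>/dev/null
```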

However, it seems that VPI has not been properly installed in the container. dpkg -l only shows vpi2-dev and libnvvpi2 (both at version 2.2.4, unlike on the Orin Nano host outside the container), and jtop also reports VPI as 2.2.4 inside the container but 2.3.9 on the host. Furthermore, colcon build (inside the container) fails on isaac_ros_common with the following error because CMake cannot find any VPI version. (This differs from the initial problem before I upgraded JetPack and Isaac ROS, where CMake could find VPI but the VPI functions were out of date relative to the C++ code in the Isaac ROS packages.)

Starting >>> isaac_ros_common
--- stderr: isaac_ros_common                         
CMake Error at CMakeLists.txt:31 (find_package):
  By not providing "Findvpi.cmake" in CMAKE_MODULE_PATH this project has
  asked CMake to find a package configuration file provided by "vpi", but
  CMake did not find one.

  Could not find a package configuration file provided by "vpi" with any of
  the following names:

    vpiConfig.cmake
    vpi-config.cmake

  Add the installation prefix of "vpi" to CMAKE_PREFIX_PATH or set "vpi_DIR"
  to a directory containing one of the above files.  If "vpi" provides a
  separate development package or SDK, be sure it has been installed.


make: *** [Makefile:267: cmake_check_build_system] Error 1
---
Failed   <<< isaac_ros_common [1.93s, exited with code 2]

I would appreciate any advice on how to install VPI 2.3.x in the Isaac ROS container in a way that makes it visible to CMake, or any general advice on how I should be using Isaac ROS.
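In case it helps anyone diagnosing the same CMake error: the vpi-config.cmake file that find_package(vpi) looks for should live somewhere under the mounted VPI tree, and CMake can be pointed at it explicitly. This is a hedged sketch; the exact lib/cmake/vpi path in the comment is an assumption and should be replaced with whatever find reports:

```shell
# Search the mounted VPI directories for the CMake package config
# that find_package(vpi) needs.
find /opt/nvidia/vpi2 /usr/share/vpi2 -name 'vpi-config*.cmake' 2>/dev/null

# If a config file turns up, pass its directory to colcon, e.g.
# (path is an assumption -- substitute the directory printed above):
# colcon build --cmake-args -Dvpi_DIR=/opt/nvidia/vpi2/lib/cmake/vpi
```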

Hi @leo1612 ,

Welcome here :-)
Thank you for this detailed post.

Simple question: did you clean out the old build artifacts from your previous install?

You can find these folders:

  • build
  • install
  • log

in: $HOME/workspaces/isaac_ros-dev

You can simply delete them with the following:

rm -R $HOME/workspaces/isaac_ros-dev/build $HOME/workspaces/isaac_ros-dev/install $HOME/workspaces/isaac_ros-dev/log

(run as superuser if needed)

Best,
Raffaello

Hi @Raffaello , thanks for the reply. Yes, I had deleted my build, install and log directories before running colcon build. I was able to fix this issue by completely re-flashing my Orin Nano with SDK Manager, and now I can successfully build the packages in isaac_ros_common and isaac_ros_nitros, as well as other packages such as isaac_ros_visual_slam and isaac_ros_object_detection. However, I am still having a VPI-related issue with isaac_ros_image_proc specifically (all the other packages in isaac_ros_image_pipeline build successfully). The error seems to be related to cupva, which is strange since the Orin Nano does not have PVA cores:

Starting >>> isaac_ros_image_proc
[Processing: isaac_ros_image_proc]                            
--- stderr: isaac_ros_image_proc                                          
/usr/bin/ld: warning: libcupva_host.so.2.3, needed by /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9, not found (try using -rpath or -rpath-link)
/usr/bin/ld: warning: libcupva_host_utils.so.2.3, needed by /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9, not found (try using -rpath or -rpath-link)
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::StaticDataFlow::granularity(cupva::GranType)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::StaticDataFlow::dstDim1(int, int)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::ConfigDataFlow::id() const'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::RasterDataFlow::srcImpl(void const*, int, int, int, int)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::Fence::timestamp() const'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::Fence::Fence(cupva::SyncObj&)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::RasterDataFlow::roi(int, int, int, int)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::CmdWaitOnFences::~CmdWaitOnFences()'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::Fence::wait(long) const'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::RasterDataFlow::dstImpl(void*, int, int, int, int)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::Executable::~Executable()'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::StaticDataFlow::handler(cupva::Parameter const&)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::Context::~Context()'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::SyncObj::Create(bool, cupva::SyncClientType, cupva::SyncWaitMode)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::CmdProgram::CmdProgram()'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::Stream::~Stream()'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::DynamicDataFlow::Node::bpp(int)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::RasterDataFlow::link(cupva::RasterDataFlow&)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::Exception::getErrorCode() const'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::DynamicDataFlow::~DynamicDataFlow()'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::Context::Context()'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::StaticDataFlow::tile(int, int)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::CmdWaitOnFences::CmdWaitOnFences(cupva::Fence const&)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::GetHardwareInfo()'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::StaticDataFlow::dstLinePitch(int)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::CmdProgram::Create(cupva::Executable const&)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::RasterDataFlow::handler(cupva::Parameter const&)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::Parameter::setValuePointer(void*)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::ConfigDataFlow::~ConfigDataFlow()'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::CmdProgram::operator=(cupva::CmdProgram&&) &'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::StaticDataFlow::padVal(cupva::PadModeType, int)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::RasterDataFlow::halo(int, int, cupva::PadModeType, int)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::Stream::submit(std::initializer_list<cupva::BaseCmd const*> const&, cupva::impl::CmdStatus**, cupva::OrderType, int, int)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::CmdProgram::compileDataFlows()'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::DynamicDataFlow::Create()'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::Parameter::setValueArray(void const*, long)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::mem::Alloc(long, cupva::mem::AccessType, cupva::mem::AllocType)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::Executable::Executable(cupva::Executable&&)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::Context::Create(unsigned int)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::CmdProgram::~CmdProgram()'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::StaticDataFlow::src(void const*)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::RasterDataFlow::tile(int, int)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::CmdProgram::Create()'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::DynamicDataFlow::at(int, int)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::mem::GetSurfaceAttributes(void const*, cupva::mem::SurfaceAttributes&)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::StaticDataFlow::srcCircularBuffer(int, int)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::CmdRequestFences::~CmdRequestFences()'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::StaticDataFlow::srcLinePitch(int)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::ConfigDataFlow::Create()'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::Stream::submit(cupva::BaseCmd const* const*, cupva::impl::CmdStatus**, int, cupva::OrderType, int, int)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::Context::GetCurrent()'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::RasterDataFlow::tileBufferImpl(void*, void*)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::Parameter::Parameter()'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::CmdProgram::registerDataFlowHead(cupva::BaseDataFlow&&, int, float)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::Parameter::getDevicePointer() const'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::ConfigDataFlow::linkInternal(cupva::BaseDataFlow&, bool)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::Fence::~Fence()'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::DynamicDataFlow::Node::src(void const*, int)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::CmdProgram::registerDataFlow(cupva::BaseDataFlow&&)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::StaticDataFlow::dstCircularBuffer(int, int)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::Parameter::~Parameter()'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva_utils::AllocSurface(cupva_utils::PlaneSize const*, int, cupva::SurfaceFormatType)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::DynamicDataFlow::Node::dst(void*, int)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::RasterDataFlow::~RasterDataFlow()'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::StaticDataFlow::dstDim2(int, int)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::CmdRequestFences::CmdRequestFences(cupva::Fence&)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::Context::operator=(cupva::Context&&) &'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::ConfigDataFlow::handler(cupva::Parameter const&)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::SyncObj::~SyncObj()'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::Context::SetCurrent(cupva::impl::Context*)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::StaticDataFlow::dst(void*, void*)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::mem::Free(void*)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::RasterDataFlow::Create()'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::Executable::Create(void const*, int)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::mem::GetHostPointer(void*)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::DynamicDataFlow::Node::tile(int, int)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::DynamicDataFlow::init(cupva::Parameter const&, int, int)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::CmdProgram::operator[](char const*)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::StaticDataFlow::linkInternal(cupva::BaseDataFlow&, bool)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::RasterDataFlow::halo(int, cupva::PadModeType, int)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::StaticDataFlow::padDim(cupva::PadDirType, int)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::StaticDataFlow::~StaticDataFlow()'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::StaticDataFlow::bpp(int)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::ConfigDataFlow::src(void const*)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::StaticDataFlow::srcDim1(int, int)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::StaticDataFlow::id() const'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::Parameter::setValueScalar(void const*, long)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::StaticDataFlow::Create()'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::Stream::Create(cupva::EngineType, cupva::AffinityType)'
/usr/bin/ld: /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9: undefined reference to `cupva::StaticDataFlow::srcDim2(int, int)'
collect2: error: ld returned 1 exit status
make[2]: *** [CMakeFiles/isaac_ros_image_proc.dir/build.make:367: isaac_ros_image_proc] Error 1
make[1]: *** [CMakeFiles/Makefile2:310: CMakeFiles/isaac_ros_image_proc.dir/all] Error 2
make: *** [Makefile:146: all] Error 2
---
Failed   <<< isaac_ros_image_proc [4min 54s, exited with code 2]

Summary: 0 packages finished [4min 55s]
  1 package failed: isaac_ros_image_proc
  1 package had stderr output: isaac_ros_image_proc

I would appreciate any advice on how to stop VPI from trying to use cupva, or any pointer to what is causing this problem with the most recent isaac_ros_image_pipeline for Isaac ROS 2.1 on a Jetson Orin Nano.
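For reference, the unresolved dependencies can be listed directly with ldd; on a working setup this should print nothing, and here I would expect it to show the two libcupva_host libraries from the linker warnings above:

```shell
# Ask the dynamic linker which of libnvvpi's dependencies cannot be
# resolved; the warnings above point at libcupva_host.so.2.3 and
# libcupva_host_utils.so.2.3.
ldd /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9 | grep 'not found'
```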

Hi @leo1612

After cross-checking your previous post, I am currently investigating internally. If you also have an AGX Orin, can you confirm whether you obtain the same output? If not, don’t worry.

I’ll keep you posted,
Raffaello

Hi @Raffaello ,

Thanks for going to the effort of investigating this problem. Unfortunately I do not have an AGX Orin; I may consider buying one, as Isaac ROS is clearly tested on and designed for the AGX Orin. I have also hit a seemingly related issue when trying to run any NITROS-accelerated nodes in a ComposableNodeContainer (using the component_container_mt executable). The error message below is just an example from a few Isaac ROS nodes I tried to launch in a ComposableNodeContainer:

[INFO] [launch]: All log files can be found below /home/admin/.ros/log/2023-12-01-11-15-43-217168-ubuntu-73879
[INFO] [launch]: Default logging verbosity is set to INFO
[INFO] [component_container_mt-1]: process started with pid [73906]
[component_container_mt-1] [INFO] [1701389744.401675037] [nitros.nitros_container]: Load Library: /workspaces/isaac_ros-dev/install/isaac_ros_stereo_image_proc/lib/libdisparity_node.so
[component_container_mt-1] [ERROR] [1701389744.424951626] [nitros.nitros_container]: Failed to load library: Could not load library dlopen error: libcupva_host.so.2.3: cannot open shared object file: No such file or directory, at /opt/ros/humble/src/rcutils/src/shared_library.c:99
[ERROR] [launch_ros.actions.load_composable_nodes]: Failed to load node 'disparity' of type 'nvidia::isaac_ros::stereo_image_proc::DisparityNode' in container '/nitros/nitros_container': Failed to load library: Could not load library dlopen error: libcupva_host.so.2.3: cannot open shared object file: No such file or directory, at /opt/ros/humble/src/rcutils/src/shared_library.c:99
[component_container_mt-1] [INFO] [1701389744.429081135] [nitros.nitros_container]: Load Library: /workspaces/isaac_ros-dev/install/isaac_ros_stereo_image_proc/lib/libpoint_cloud_node.so
[component_container_mt-1] [ERROR] [1701389744.443297708] [nitros.nitros_container]: Failed to load library: Could not load library dlopen error: libcupva_host.so.2.3: cannot open shared object file: No such file or directory, at /opt/ros/humble/src/rcutils/src/shared_library.c:99
[ERROR] [launch_ros.actions.load_composable_nodes]: Failed to load node 'point_cloud_node' of type 'nvidia::isaac_ros::stereo_image_proc::PointCloudNode' in container '/nitros/nitros_container': Failed to load library: Could not load library dlopen error: libcupva_host.so.2.3: cannot open shared object file: No such file or directory, at /opt/ros/humble/src/rcutils/src/shared_library.c:99

I just showed the disparity node and the point cloud node in this error, but this happens for all nodes that use NITROS, including a custom managed NITROS publisher node that I tried to make.

I also noticed that I have /opt/nvidia/cupva-2.3 installed on my host operating system (containing the shared object files that Isaac ROS cannot find). I have tried mounting it as a volume into the Isaac ROS Docker container, but this doesn’t fix the issue. Is there any workaround to let VPI and NITROS find these cupva shared objects within the Docker container, or will I need to wait for the next Isaac ROS update?

We haven’t seen this before when running on a device, so I can’t say for sure how you got into this state. Let’s try adding the path of your mounted /opt/nvidia/cupva-2.3 in the container to LD_LIBRARY_PATH and then trying again. nvidia-container-toolkit should have mounted that directory for you, so needing to mount it manually is also unexpected. Fixing LD_LIBRARY_PATH should hopefully get you into a working state, and then we can backtrack to find what’s different.
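A minimal sketch of that suggestion, assuming the cupva libraries were mounted under /opt/nvidia/cupva-2.3 (the exact subdirectory holding the .so files is not known here, so it is located with find first):

```shell
# Locate the directory inside the mounted tree that actually holds
# libcupva_host.so.2.3, then prepend it to the loader search path.
CUPVA_LIB_DIR=$(dirname "$(find /opt/nvidia/cupva-2.3 -name 'libcupva_host.so.2.3' 2>/dev/null | head -n1)")
export LD_LIBRARY_PATH="${CUPVA_LIB_DIR}:${LD_LIBRARY_PATH}"

# Verify the loader can now resolve the cupva dependencies:
ldd /opt/nvidia/vpi2/lib/aarch64-linux-gnu/libnvvpi.so.2.3.9 | grep cupva
```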