I’m experiencing a VRAM-related issue while remotely operating the Isaac ROS FoundationPose package.
Initially, I launched both RealSense and FoundationPose together as part of the same launch_fragments. However, for remote robot operation I separated them: RealSense now launches on the SBC (Single Board Computer, in my case a laptop) while FoundationPose runs on a desktop.
When I run only the Realsense node on the laptop, VRAM usage remains stable at around 1GB, which is acceptable. However, when I launch FoundationPose on the desktop, the VRAM usage on the laptop spikes significantly, resulting in an Out of Memory error, even though FoundationPose isn’t running on the laptop.
Additionally, I noticed that while only executing the Realsense launch on the SBC, some FoundationPose nodes—such as the reshape-node and resize-mask-node—are being triggered, further contributing to the memory issue. Is there a potential connection or dependency between the Realsense and FoundationPose launches that I may be overlooking? Are there specific areas I should investigate to resolve this?
Thank you for your post, but the information shared is not enough to help you.
Do your desktop and laptop meet the system requirements on this page? How did you set these up? Could you provide more detail on how you reproduce this issue?
My goal is to build a remote-controlled robot. Specifically, I want the laptop to publish RealSense topics and receive detection results to control the robot’s movements, while the desktop receives the RealSense topics, processes them, and publishes the detection results.
Currently, I am using a RealSense D455 and an RTX 4070 Ti Super to perform FoundationPose tracking on a desktop, following the Isaac ROS documentation. The FoundationPose tracking process ran successfully on the desktop.
To apply FoundationPose tracking to a mobile robot, I ran the RealSenseNodeFactory node and nvidia::isaac_ros::depth_image_proc::ConvertMetricNode on a laptop with only 2GB of VRAM. (Since these two nodes used only 1.1GB of VRAM on the RTX 4070 Ti Super machine, I assumed they would also run comfortably on the laptop.)
Indeed, when I ran only these two nodes on the laptop, everything worked as expected. The issue occurred when I launched FoundationPose tracking on the desktop: nodes from the FoundationPose tracking launch also started running on the laptop, resulting in an “out of memory” error.
Specifically, I don’t understand why nodes like reshapeNode and ImageToTensorNode from the FoundationPose tracking launch are being executed on the laptop, which was supposed to run only realsense_mono_rect_depth.
If I am misunderstanding something (for example, how composable nodes operate, or whether computation is somehow shared across devices), I would appreciate any clarification. I’m unsure whether I misconfigured the nodes or whether some mechanism is unintentionally causing node processes to run on both the laptop and the desktop.
It sounds like you’re hitting a subtle behavior of ROS 2 composable nodes rather than a FoundationPose bug. A likely mechanism, given what you describe, is this: the Isaac ROS launch files load their components into a named component container, and the loading action targets that container by node name only. If the laptop and desktop share the same ROS domain and both launches reference a container with the same name, the desktop’s FoundationPose launch can end up loading its components into the container running on the laptop, which would explain why nodes appear there and exhaust its 2GB of VRAM even though FoundationPose isn’t launched on it.
One concrete thing to check is which container each launch file targets. Compare the RealSense launch on the laptop with the FoundationPose launch on the desktop and see whether both create or target a component container with the same node name; if so, renaming one of them (or ensuring each machine only loads into its own local container) should stop nodes like the reshape-node and resize-mask-node from being loaded onto the laptop.
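To make the by-name targeting concrete, here is a minimal sketch of the usual ROS 2 composition pattern. The container name, plugin strings, and package names below are illustrative assumptions based on the typical Isaac ROS layout, not the exact names in the isaac_ros_foundationpose launch files:

```python
# Sketch: LoadComposableNodes resolves its target container by node name.
# If a container with the SAME name already exists anywhere in the DDS
# domain (e.g. one started by the RealSense launch on the laptop), the
# load request can be served by that remote container instead.
from launch import LaunchDescription
from launch_ros.actions import ComposableNodeContainer, LoadComposableNodes
from launch_ros.descriptions import ComposableNode


def generate_launch_description():
    # Local container on this machine (names here are hypothetical).
    container = ComposableNodeContainer(
        name='foundationpose_container',
        namespace='',
        package='rclcpp_components',
        executable='component_container_mt',
    )

    # Targets 'foundationpose_container' by name only; nothing pins this
    # to the local machine if another container shares that name.
    load_nodes = LoadComposableNodes(
        target_container='foundationpose_container',
        composable_node_descriptions=[
            ComposableNode(
                package='isaac_ros_foundationpose',
                plugin='nvidia::isaac_ros::foundationpose::FoundationPoseNode',
                name='foundationpose',
            ),
        ],
    )
    return LaunchDescription([container, load_nodes])
```

Passing the container action itself as `target_container` (rather than a bare string) is one way to make the intent explicit, but the service call still goes over DDS, so unique container names per machine are the safer fix.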
You could also isolate the two machines more strictly so the FoundationPose launch cannot reach the laptop’s container at all, for example by giving each machine a uniquely named container, or by reviewing how the launch_fragments are composed so that only topics, not node-loading requests, cross between the SBC and the desktop.
Finally, it is worth monitoring the node graph and GPU memory while you reproduce the problem, so you can see exactly which components get loaded where and when the spike begins. Let me know how it goes or if you discover any further insights!
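As a starting point for that monitoring, the standard ROS 2 and NVIDIA CLI tools below should show where components land and when VRAM climbs (these assume a sourced ROS 2 / Isaac ROS environment and an NVIDIA GPU; run them on the laptop while reproducing the issue):

```shell
# List component containers and the components loaded into each one.
# FoundationPose components appearing under the laptop's container
# would confirm the cross-machine loading.
ros2 component list

# List every node visible on the graph (both machines, same domain).
ros2 node list

# Poll laptop GPU memory once per second to catch the moment of the spike.
nvidia-smi --query-gpu=memory.used --format=csv -l 1
```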