Jetson Nano + ROS2 Humble + Docker very slow

Hey all,

I’ve resisted posting here but I’m at a point where I don’t know where to go anymore. This is also my first time posting so please let me know what other info I need to provide to make this post better.

I have an original Jetson Nano running Ubuntu 18.04. I need to run ROS2 Humble, and Docker containers seem to have worked for other people. The container has been noticeably slower but still functional for most basic tasks; the real problems started when I tried streaming video through ROS. I’m using a pretty simple script like this:

import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image
from cv_bridge import CvBridge, CvBridgeError
import cv2

class Camera(Node):
    def __init__(self, cameraport, topic):
        super().__init__('CameraPublisher')
        self.publisher = self.create_publisher(Image, topic, 10)
        # Bridge for converting between OpenCV images and ROS Image messages
        self.bridge = CvBridge()
        # Open the camera once here; re-opening it on every call caused errors
        self.cam_feed = cv2.VideoCapture(cameraport)

    def publish_image(self):
        ret, img = self.cam_feed.read()
        if not ret:
            self.get_logger().error('Failed to read frame from camera')
            return -1

        # Convert the OpenCV image to a ROS Image message
        try:
            image_msg = self.bridge.cv2_to_imgmsg(img, "bgr8")
        except CvBridgeError as e:
            self.get_logger().error(str(e))
            return -1

        # Publish the ROS image message
        self.publisher.publish(image_msg)
        # cv2.imshow("publisher", img)

def main(args=None):
    rclpy.init(args=args)
    camera = Camera(cameraport=0, topic='image')
    try:
        while True:
            if camera.publish_image() == -1:
                break
            # 'q' or Esc quits (only has an effect if an OpenCV window is open)
            key = cv2.waitKey(1) & 0xFF
            if key == ord('q') or key == 27:
                break
    finally:
        camera.destroy_node()
        rclpy.shutdown()

if __name__ == '__main__':
    main()

This script works well on other computers, but on the Jetson the stream has a terrible framerate (<5 fps) and horrible latency (>1 s) when viewed in RViz.

I believe the issue is somewhere within the Docker container. I followed the instructions at this link to run the container. Is there something I could be missing, or some common error when installing these things? My internet research indicated that a Docker container should behave very similarly to a native installation, so I’m a little perplexed as to why this is happening. I’m happy to provide more information on my build if needed.
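One thing I’ve been wondering about (but haven’t actually tried yet) is whether publishing sensor_msgs/CompressedImage instead of raw Image would help, since a raw 1280x720 bgr8 frame is roughly 2.8 MB per message. A rough, untested sketch of what I mean - the node name, topic name, and JPEG quality are just placeholders:

import rclpy
from rclpy.node import Node
from sensor_msgs.msg import CompressedImage
import cv2

class CompressedCamera(Node):
    def __init__(self, cameraport=0, topic='image/compressed'):
        super().__init__('CompressedCameraPublisher')
        self.publisher = self.create_publisher(CompressedImage, topic, 10)
        self.cam_feed = cv2.VideoCapture(cameraport)
        # Publish from a timer at ~30 Hz instead of a busy while-loop
        self.timer = self.create_timer(1.0 / 30.0, self.publish_image)

    def publish_image(self):
        ret, img = self.cam_feed.read()
        if not ret:
            self.get_logger().error('Failed to read frame from camera')
            return
        # JPEG-encode on the CPU so each message is tens of KB instead of ~2.8 MB
        ok, buf = cv2.imencode('.jpg', img, [cv2.IMWRITE_JPEG_QUALITY, 80])
        if not ok:
            return
        msg = CompressedImage()
        msg.header.stamp = self.get_clock().now().to_msg()
        msg.format = 'jpeg'
        msg.data = buf.tobytes()
        self.publisher.publish(msg)

def main(args=None):
    rclpy.init(args=args)
    node = CompressedCamera()
    rclpy.spin(node)
    node.destroy_node()
    rclpy.shutdown()

if __name__ == '__main__':
    main()

I don’t know whether CPU-side JPEG encoding is fast enough on the Nano to keep up, which is partly why I’m asking here.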

Thank you for any assistance you can provide.

Hi @xaok7569, other computers meaning x86, or other Jetson devices? My guess is that cv_bridge is not optimized for doing compression using Jetson’s hardware codecs. This video_output node can encode and output RTP/RTSP/WebRTC streams and is in the dustynv/ros:humble-pytorch-l4t-r32.7.1 container.

Containers run natively and aren’t inherently slower - that’s not to say something couldn’t be different during the build process. You can however try using or adapting my ros2_build.sh script to build Humble for you outside of a container.

Hey Dusty, thanks so much for the quick reply. I’ve tried running the ros:humble-pytorch container you mentioned, using the command ros2 launch ros_deep_learning video_viewer.ros2.launch input:=v4l2:///dev/video0. This runs well and spits out the following text:

[INFO] [launch]: Default logging verbosity is set to INFO
[INFO] [video_source-1]: process started with pid [140]
[INFO] [video_output-2]: process started with pid [141]
[video_output-2] [INFO] [1706232782.492688042] [video_output]: opening video output: display://0
[video_output-2] [ERROR] [1706232782.493367365] [video_output]: failed to open video output
[video_source-1] [INFO] [1706232782.496821219] [video_source]: opening video source: v4l2:///dev/video0
[video_output-2] [OpenGL] failed to open X11 server connection.
[video_output-2] [OpenGL] failed to create X11 Window.
[INFO] [video_output-2]: process has finished cleanly [pid 141]
[video_output-2] 
[video_source-1] [gstreamer] initialized gstreamer, version 1.14.5.0
[video_source-1] [gstreamer] gstCamera -- attempting to create device v4l2:///dev/video0
[video_source-1] [gstreamer] gstCamera -- found v4l2 device: UVC Camera (046d:081d)
[video_source-1] [gstreamer] v4l2-proplist, device.path=(string)/dev/video0, udev-probed=(boolean)false, device.api=(string)v4l2, v4l2.device.driver=(string)uvcvideo, v4l2.device.card=(string)"UVC\ Camera\ \(046d:081d\)", v4l2.device.bus_info=(string)usb-70090000.xusb-2.3, v4l2.device.version=(uint)264703, v4l2.device.capabilities=(uint)2216689665, v4l2.device.device_caps=(uint)69206017;
[video_source-1] [gstreamer] gstCamera -- found 48 caps for v4l2 device /dev/video0
[video_source-1] [gstreamer] [0] video/x-raw, format=(string)YUY2, width=(int)1600, height=(int)1200, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)5/1;
[video_source-1] [gstreamer] [1] video/x-raw, format=(string)YUY2, width=(int)1600, height=(int)896, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 15/2, 5/1 };
[video_source-1] [gstreamer] [2] video/x-raw, format=(string)YUY2, width=(int)1280, height=(int)1024, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 15/2, 5/1 };
[video_source-1] [gstreamer] [3] video/x-raw, format=(string)YUY2, width=(int)1504, height=(int)832, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 15/2, 5/1 };
[video_source-1] [gstreamer] [4] video/x-raw, format=(string)YUY2, width=(int)1280, height=(int)960, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 15/2, 5/1 };
[video_source-1] [gstreamer] [5] video/x-raw, format=(string)YUY2, width=(int)1392, height=(int)768, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 10/1, 15/2, 5/1 };
[video_source-1] [gstreamer] [6] video/x-raw, format=(string)YUY2, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 10/1, 15/2, 5/1 };
[video_source-1] [gstreamer] [7] video/x-raw, format=(string)YUY2, width=(int)1184, height=(int)656, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 15/1, 10/1, 15/2, 5/1 };
[video_source-1] [gstreamer] [8] video/x-raw, format=(string)YUY2, width=(int)960, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 15/1, 10/1, 15/2, 5/1 };
[video_source-1] [gstreamer] [9] video/x-raw, format=(string)YUY2, width=(int)1024, height=(int)576, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 15/1, 10/1, 15/2, 5/1 };
[video_source-1] [gstreamer] [10] video/x-raw, format=(string)YUY2, width=(int)960, height=(int)544, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 20/1, 15/1, 10/1, 15/2, 5/1 };
[video_source-1] [gstreamer] [11] video/x-raw, format=(string)YUY2, width=(int)800, height=(int)600, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[video_source-1] [gstreamer] [12] video/x-raw, format=(string)YUY2, width=(int)864, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 20/1, 15/1, 10/1, 15/2, 5/1 };
[video_source-1] [gstreamer] [13] video/x-raw, format=(string)YUY2, width=(int)800, height=(int)448, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[video_source-1] [gstreamer] [14] video/x-raw, format=(string)YUY2, width=(int)752, height=(int)416, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[video_source-1] [gstreamer] [15] video/x-raw, format=(string)YUY2, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[video_source-1] [gstreamer] [16] video/x-raw, format=(string)YUY2, width=(int)640, height=(int)360, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[video_source-1] [gstreamer] [17] video/x-raw, format=(string)YUY2, width=(int)544, height=(int)288, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[video_source-1] [gstreamer] [18] video/x-raw, format=(string)YUY2, width=(int)432, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[video_source-1] [gstreamer] [19] video/x-raw, format=(string)YUY2, width=(int)352, height=(int)288, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[video_source-1] [gstreamer] [20] video/x-raw, format=(string)YUY2, width=(int)320, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[video_source-1] [gstreamer] [21] video/x-raw, format=(string)YUY2, width=(int)320, height=(int)176, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[video_source-1] [gstreamer] [22] video/x-raw, format=(string)YUY2, width=(int)176, height=(int)144, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[video_source-1] [gstreamer] [23] video/x-raw, format=(string)YUY2, width=(int)160, height=(int)120, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[video_source-1] [gstreamer] [24] image/jpeg, width=(int)1600, height=(int)1200, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 20/1, 15/1, 10/1, 15/2, 5/1 };
[video_source-1] [gstreamer] [25] image/jpeg, width=(int)1600, height=(int)896, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[video_source-1] [gstreamer] [26] image/jpeg, width=(int)1280, height=(int)1024, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[video_source-1] [gstreamer] [27] image/jpeg, width=(int)1504, height=(int)832, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[video_source-1] [gstreamer] [28] image/jpeg, width=(int)1280, height=(int)960, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[video_source-1] [gstreamer] [29] image/jpeg, width=(int)1392, height=(int)768, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[video_source-1] [gstreamer] [30] image/jpeg, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[video_source-1] [gstreamer] [31] image/jpeg, width=(int)1184, height=(int)656, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[video_source-1] [gstreamer] [32] image/jpeg, width=(int)960, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[video_source-1] [gstreamer] [33] image/jpeg, width=(int)1024, height=(int)576, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[video_source-1] [gstreamer] [34] image/jpeg, width=(int)960, height=(int)544, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[video_source-1] [gstreamer] [35] image/jpeg, width=(int)800, height=(int)600, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[video_source-1] [gstreamer] [36] image/jpeg, width=(int)864, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[video_source-1] [gstreamer] [37] image/jpeg, width=(int)800, height=(int)448, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[video_source-1] [gstreamer] [38] image/jpeg, width=(int)752, height=(int)416, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[video_source-1] [gstreamer] [39] image/jpeg, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[video_source-1] [gstreamer] [40] image/jpeg, width=(int)640, height=(int)360, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[video_source-1] [gstreamer] [41] image/jpeg, width=(int)544, height=(int)288, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[video_source-1] [gstreamer] [42] image/jpeg, width=(int)432, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[video_source-1] [gstreamer] [43] image/jpeg, width=(int)352, height=(int)288, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[video_source-1] [gstreamer] [44] image/jpeg, width=(int)320, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[video_source-1] [INFO] [1706232783.321445646] [video_source]: allocated CUDA memory for 1280x720 image conversion

This also creates a ROS topic, /video_source/raw. I then try to view this topic in both rqt and rviz, which both let me subscribe but only show a blank screen, and this error is printed to the terminal:
[rqt_gui_cpp_node_6279]: [image_transport] It looks like you are trying to subscribe directly to a transport-specific image topic '/video_source/raw', in which case you will likely get a connection error. Try subscribing to the base topic '/video_source' instead with parameter ~image_transport set to 'raw' (on the command line, _image_transport:=raw). See http://ros.org/wiki/image_transport for details.

I’ve tried messing around with this today, but I can’t seem to make it work. I know that you recommended the video_output node, but I can’t figure out how to get it to publish to a ROS topic. I need to use rviz or rqt for my application, so (I believe) I need the image data published on a topic.
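To try to rule out rqt and rviz themselves, my next step is probably a bare-bones subscriber that just reports how fast messages are actually arriving on /video_source/raw. A rough sketch (untested) of what I have in mind:

import time

import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image

class RateCheck(Node):
    def __init__(self):
        super().__init__('rate_check')
        self.count = 0
        self.start = time.monotonic()
        # Subscribe to the topic created by the video_source node
        self.subscription = self.create_subscription(
            Image, '/video_source/raw', self.callback, 10)

    def callback(self, msg):
        self.count += 1
        elapsed = time.monotonic() - self.start
        if elapsed >= 5.0:
            self.get_logger().info(
                f'/video_source/raw: {self.count / elapsed:.1f} Hz')
            self.count = 0
            self.start = time.monotonic()

def main(args=None):
    rclpy.init(args=args)
    rclpy.spin(RateCheck())

if __name__ == '__main__':
    main()

If that shows the topic publishing at the full camera rate, then I’d guess the bottleneck is on the viewer side rather than in the container.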

Is there any advice you can give? This may be a very simple problem but I can’t seem to figure it out.

Hmm, okay gotcha - so normally I would use RTP/RTSP or WebRTC video streaming to view the video remotely, so that it plays back smoothly at HD resolution. I haven’t used rviz but have used foxglove, and the video was slower there too (although I attribute this to being a byproduct of the ROS image transport, as opposed to encoded protocols that are better suited to the web).

@rbonghi may be able to better share his experience with ROS and viewing image topics over rviz, or you could try posting to the Isaac ROS forum and see what the consensus is about the best way to do this. Hope that helps!

Hey Dusty, thanks again for your help. I just tried setting up a WebRTC stream, and it actually worked really well. We had used RVIZ in the past so we could view video streams next to plots of our robot’s state vs. setpoint. There is definitely a workaround to this, and I’ve spent long enough trying to get RVIZ to work, so I’m gonna consider this finished. Thanks again!

If anyone hears anything about getting a ROS Humble + Docker + RVIZ stream set up, let me know!

OK gotcha, glad you found a workable alternative for now - yea, I understand how RVIZ would be nice to have for viewing the stream alongside the other topics (although I’ve never gotten a full-res video stream playing smoothly through it, and the RVIZ client typically runs on a laptop/PC, not the Jetson that gets deployed on the robot). FWIW I’ve had better luck with foxglove, which works from any browser client (not requiring Ubuntu), but that also had issues with full-HD video streaming (although I think they’ve been working to address that).
