Depth image to ROS?

Hello, I am trying to send some sensor data from my Isaac sim running in Unity to ROS. So far I have successfully used the RosToImage component to send color camera data to ROS, and display it in Rviz. However, when I attempt the same thing with depth camera data, I get errors. It doesn’t seem like there is any RosToDepthImage component. Any help would be greatly appreciated, thanks!

@orkop465 Was it the ImageToRos or the RosToImage component?

It was ImageToRos.
Also, some more context that was missing from my original post:
I am attempting to view depth image data from my Isaac Unity3D sim in ROS, specifically Rviz. I am using ImageToRos, but when I send depth image data to it over edges, I receive an error:

2021-03-10 08:10:05.313 ERROR ./messages/image.hpp@74: Image element type does not match: actual=10, expected=1
2021-03-10 08:10:05.313 ERROR external/com_nvidia_isaac_engine/engine/alice/components/Codelet.cpp@229: Component 'data_to_ros.ros_converters/ImageToRos' of type 'isaac::ros_bridge::ImageToRos' reported FAILURE:

Reading input image from proto buffer failed.

I'm assuming this is because the default ImageToRos codelet does not support depth images? If that is the case, does anyone have a codelet that supports depth images, or know what to change in ImageToRos to add this functionality? Thanks!

You need to use ImageConstView1ui16 instead of ImageConstView3ub, as the depth image uses the encoding "16UC1".
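In ImageToRos.cpp that change amounts to swapping the view type used to read the image proto, roughly like this (paraphrased rather than copied from the SDK):

// Stock ImageToRos, for a 3-channel 8-bit color image ("rgb8"):
ImageConstView3ub image;

// For a depth image published with encoding "16UC1" (1 channel, 16-bit unsigned):
ImageConstView1ui16 image;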

I had gathered that much from some of the other threads, but when I switch from ImageConstView3ub to ImageConstView1ui16 I now get a compiler error:

packages/ros_bridge/components/ImageToRos.cpp: In member function ‘virtual bool isaac::ros_bridge::ImageToRos::protoToRos(const isaac::alice::ProtoRx&, const ros::Time&, sensor_msgs::Image&)’:
packages/ros_bridge/components/ImageToRos.cpp:46:96: error: invalid static_cast from type ‘isaac::ImageBase<short unsigned int, 1, isaac::detail::BufferBase<isaac::detail::TaggedPointer<const unsigned char, std::integral_constant<isaac::BufferStorageMode, (isaac::BufferStorageMode)0> > > >::element_ptr_t {aka const short unsigned int*}’ to type ‘const unsigned char*’
unsigned const char* image_ptr = static_cast<unsigned const char*>(image.element_wise_begin());

I apologize in advance if there is an easy fix; my C++ is not too advanced yet. Thanks for the help thus far!

static_cast does not allow converting from a pointer to 16-bit unsigned short (const unsigned short*) to a pointer to 8-bit unsigned char (const unsigned char*). You likely need a reinterpret_cast here, once you have accounted for the 2x byte count difference between uint16 and char elements.
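Something along these lines, as a standalone sketch (the buffer and names here are made up for illustration, not taken from the codelet):

#include <cstdint>
#include <cstring>
#include <vector>

int main() {
  // Pretend this is the 16-bit depth data exposed by the image view.
  std::vector<std::uint16_t> depth = {1200, 1350, 1500, 1650};

  // static_cast refuses to convert between unrelated pointee types, so the
  // raw-byte view of the buffer has to come from reinterpret_cast.
  const unsigned char* bytes = reinterpret_cast<const unsigned char*>(depth.data());

  // The byte count is the element count times sizeof(uint16_t), i.e. 2x the
  // number of pixels -- this is the char<->uint16 difference to account for.
  const std::size_t num_bytes = depth.size() * sizeof(std::uint16_t);

  // Copy into a byte buffer, as you would into the ROS message data field.
  std::vector<unsigned char> ros_data(num_bytes);
  std::memcpy(ros_data.data(), bytes, num_bytes);
  return 0;
}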

Hello, sorry for the delayed reply.
Using reinterpret_cast seemed to fix the compile error, but now I am getting a runtime error:

ERROR ./messages/image.hpp@74: Image element type does not match: actual=10, expected=2

As far as I understand, depth images in ROS also use the Image message, but the checks in image.hpp clearly fail here because of the differences between regular and depth images.
Any suggestions would be greatly appreciated, and I apologize if this issue is taking longer than it should because of any incompetence on my part. Thanks!

The error indicates that the actual image element type was 32-bit float and the expected type was 16-bit unsigned when creating an image view from an image proto. In other words, the image message claimed to contain a 32-bit float image while you were deserializing it into a view set up for 16-bit unsigned shorts. Perhaps the image message sender has set the element type incorrectly, or is actually sending a 32-bit float image?
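For reference, the correspondence between the ROS encoding string and the Isaac image view used to deserialize the proto looks roughly like this (view names as used earlier in this thread):

// "rgb8"  -> 3 channels, 8-bit unsigned  -> ImageConstView3ub
// "16UC1" -> 1 channel, 16-bit unsigned  -> ImageConstView1ui16
// "32FC1" -> 1 channel, 32-bit float     -> ImageConstView1f
// Whichever element type the sender writes into the proto has to match the
// view type, otherwise the check in image.hpp fails as in the error above.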

The image message sender is the DepthCamera.cs script included in the Isaac SDK Unity package, and it appears to be sending the depth image as float32. I have since changed this to 16-bit uint, and now I am receiving a different error:

2021-04-18 12:16:37.935 PANIC ./messages/image.hpp@104: Image data size does not match. Buffer provides 3686400 bytes while image expected 1843200 bytes.

I noticed that it is exactly double the expected amount, which is odd.
Does anyone know where on the Unity side the buffer might be getting doubled in size?
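For what it's worth, the numbers line up with each pixel still occupying 4 bytes. Assuming a 1280x720 depth image (a guess on my part, but one that matches both byte counts exactly):

#include <cstdint>

constexpr int kWidth = 1280;   // assumed resolution, not confirmed
constexpr int kHeight = 720;

// Bytes the 16-bit view expects vs. bytes the buffer actually provides:
static_assert(kWidth * kHeight * sizeof(std::uint16_t) == 1843200, "expected by the uint16 view");
static_assert(kWidth * kHeight * sizeof(float) == 3686400, "provided by the buffer");

So the buffer is presumably still being filled with 4-byte floats per pixel even though the element type now says 16-bit uint.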

I managed to complete my original goal of displaying a depth image in Rviz; in the end, my solution was as follows:

I created new DepthToRos .cpp and .hpp files under packages/ros_bridge/components (and added them to the Bazel BUILD file).
These two files are identical to the ImageToRos .cpp and .hpp, with a few changes (a rough sketch of the resulting conversion function is at the end of this post):

DepthToRos.cpp:
In ImageToRos the image is an ImageConstView3ub; in DepthToRos it is an ImageConstView1f.
The ros_message.step is multiplied by 4 to account for the image elements being 32-bit floats.
The if statement checks for 1 channel instead of 3, and the ROS encoding is set to 32FC1.
Lastly, as @hemals stated above, reinterpret_cast is used instead of static_cast.

DepthToRos.hpp:
No changes besides node and function names.
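For anyone who finds this later, here is roughly what the conversion function in DepthToRos.cpp ends up looking like. This is a sketch rather than a drop-in file: the signature, the getProto/FromProto/reportFailure boilerplate, and the field handling are paraphrased from the stock ImageToRos codelet and from the errors quoted above, so double-check them against your SDK version.

bool DepthToRos::protoToRos(const alice::ProtoRx<ImageProto>& rx_proto, const ros::Time& ros_time,
                            sensor_msgs::Image& ros_message) {
  auto proto = rx_proto.getProto();

  // Single-channel 32-bit float view, matching the "32FC1" depth image that
  // DepthCamera.cs publishes (ImageToRos uses ImageConstView3ub here).
  ImageConstView1f image;
  if (!FromProto(proto, rx_proto.buffers(), image)) {
    reportFailure("Reading input depth image from proto buffer failed.");
    return false;
  }

  ros_message.header.stamp = ros_time;
  ros_message.height = image.rows();
  ros_message.width = image.cols();
  ros_message.encoding = "32FC1";       // was "rgb8" for the color image
  ros_message.is_bigendian = false;
  ros_message.step = image.cols() * 4;  // 4 bytes per pixel for 32-bit floats

  // reinterpret_cast instead of static_cast, since the float data is being
  // reinterpreted as raw bytes for the ROS message payload.
  const unsigned char* image_ptr =
      reinterpret_cast<const unsigned char*>(image.element_wise_begin());
  ros_message.data.assign(image_ptr, image_ptr + image.rows() * ros_message.step);
  return true;
}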