I’m trying to get a freespace DNN working in my environment (following this tutorial), and I need autonomous freespace labeling of real-world images because I can’t work with simulation.
I need help with the following problems I’m facing:
I can’t find documentation on the isaac::superpixels::RgbdSuperpixelFreespace component, and I can’t obtain good freespace segmentation results with it. In particular, how should I tune the height_angle and height_tolerance parameters, and what is their role?
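For context, here is my working guess at what these two parameters might mean, since I couldn’t find docs. This is purely illustrative, not the actual component logic: a superpixel could be labeled free when its height above the estimated ground plane stays within a tolerance band that widens with distance (height_tolerance as a flat allowance, height_angle as an angular allowance). The function and values below are my own invention:

```python
import math

# Assumed semantics (NOT confirmed by Isaac docs):
#   a point at ground distance d is "free" if
#   |height| <= height_tolerance + d * tan(height_angle)
height_tolerance = 0.04   # meters, flat allowance near the robot (assumed)
height_angle = 0.02       # radians, extra slope allowance with distance (assumed)

def is_free(height, distance):
    """Hypothetical freespace test for a superpixel centroid."""
    return abs(height) <= height_tolerance + distance * math.tan(height_angle)

print(is_free(0.03, 1.0))   # → True  (3 cm bump at 1 m would pass)
print(is_free(0.20, 1.0))   # → False (20 cm obstacle would be rejected)
```

If someone can confirm or correct this interpretation, that would already help a lot.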
From what I understand, this component also needs the pose between the camera and the robot; the default is [0.270598, -0.653281, 0.653281, -0.270598, 0.0, 0.0, 0.775]. What does this correspond to exactly, and how should I adjust it for use with Kaya?
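To make sense of the default myself, I decoded it as quaternion + translation, assuming Isaac’s [qw, qx, qy, qz, tx, ty, tz] pose convention (if that ordering is wrong, so is my interpretation):

```python
import math

# Assumed Isaac pose convention: [qw, qx, qy, qz, tx, ty, tz]
pose = [0.270598, -0.653281, 0.653281, -0.270598, 0.0, 0.0, 0.775]
w, x, y, z = pose[:4]

# unit quaternion -> rotation matrix (row-major)
R = [
    [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
    [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
    [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
]

# The camera's optical (z) axis expressed in the robot frame is the third column:
optical_axis = [row[2] for row in R]
print([round(v, 3) for v in optical_axis])   # → [0.707, 0.0, -0.707]

# Downward tilt of the optical axis relative to horizontal:
tilt_deg = math.degrees(math.atan2(-optical_axis[2], optical_axis[0]))
print(round(tilt_deg, 1))                    # → 45.0
```

If my decoding is right, the default corresponds to a camera pitched 45° downward and mounted pose[6] = 0.775 m above the robot base, which looks like the Carter mount. So for Kaya I would presumably replace these with the actual tilt and height of the RealSense on Kaya — can anyone confirm?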
When I try to run the training script on a prerecorded log containing depth and color images from the RealSense camera, the sample buffer fills up successfully, but I get this error:
Number of samples in sample buffer: 0
Number of samples in sample buffer: 1
2020-06-22 14:48:22.626 ERROR ./messages/image.hpp@74: Image element type does not match: actual=1, expected=2
This doesn’t crash the program, but I would still like to fix it.
Can somebody help me with that?
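I don’t know what element types 1 and 2 map to in messages/image.hpp, but my guess is a pixel-type mismatch on one channel, e.g. uint16 depth (millimeters) where float depth (meters) is expected, or vice versa. If that’s the case, a conversion step before feeding the buffer might look like this sketch (to_float_meters is my own helper, not an Isaac API):

```python
import numpy as np

# Fake 4x4 depth frame as a RealSense log might store it: uint16 millimeters.
depth_raw = np.full((4, 4), 1500, dtype=np.uint16)

def to_float_meters(depth):
    """Convert a uint16 millimeter depth image to float32 meters; pass
    through anything that is already floating point."""
    if depth.dtype == np.uint16:
        return depth.astype(np.float32) / 1000.0
    return depth

depth_m = to_float_meters(depth_raw)
print(depth_m.dtype, depth_m[0, 0])  # → float32 1.5
```

Does this match what the error actually means, or is it the color channel that’s mismatched?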
Also, I noticed that in packages/freespace_dnn/apps/freespace_dnn_data_annotation.subgraph.json, the value of the “label_invalid” parameter was originally 0.2, but this parameter expects an int, which was also giving me an error.