Polygon annotation with instance_segmentation or semantic_segmentation

Hi everyone,

I am currently using Isaac Sim to create synthetic data for training YOLOv8 or SAM. I was able to generate synthetic data with bounding boxes and convert it to YOLO format. However, when I use instance segmentation or semantic segmentation to obtain polygon annotations, the raw data I get is not a set of points outlining the object but just the color code of the segmented object. How can I create synthetic data with polygon annotations using a segmentation method?

Isaac Sim does not output segmentation annotations as polygons but as masks. You will have to post-process the masks, extracting a polygon for every class or object painted in the segmentation mask. For example, you could use the following:

import numpy as np
from skimage import measure

def mask_to_polygon(seg_map, seg_id):
    # Get the binary mask for the current object
    mask = seg_map == seg_id

    # Compute contours at the 0.5 level of the binary mask
    contours = measure.find_contours(mask, 0.5)
    contours = map(np.squeeze, contours)

    # Filter out degenerate contours (single points or two-point segments)
    contours = filter(lambda x: len(x.shape) >= 2 and x.shape[0] != 2, contours)

    polygon = []

    # Append the pixel coordinates of each contour to the polygon list
    for contour in contours:
        coordinates = []
        for (row, col) in contour:
            # find_contours returns (row, col) pairs; store them as x, y
            coordinates.append(int(col))
            coordinates.append(int(row))
        polygon.append(coordinates)
    return polygon
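
For YOLOv8 segmentation labels you would then normalize the polygon coordinates by the image size and write one line per polygon. Here is a minimal sketch, assuming the instance segmentation map has been saved as a .npy array, that ID 0 is background, and that you handle the mapping from segmentation IDs to YOLO class indices yourself (the file names and the class index below are placeholders):

import numpy as np

seg_map = np.load("instance_segmentation_0001.npy")  # hypothetical path
height, width = seg_map.shape

yolo_lines = []
for seg_id in np.unique(seg_map):
    if seg_id == 0:  # assumption: 0 is background/unlabelled
        continue
    class_idx = 0    # hypothetical: map seg_id to your YOLO class index here
    for coords in mask_to_polygon(seg_map, seg_id):
        # coords is a flat [x1, y1, x2, y2, ...] list in pixels;
        # YOLOv8 segmentation labels expect normalized coordinates
        normalized = [v / width if i % 2 == 0 else v / height
                      for i, v in enumerate(coords)]
        yolo_lines.append(f"{class_idx} " + " ".join(f"{v:.6f}" for v in normalized))

with open("instance_segmentation_0001.txt", "w") as f:
    f.write("\n".join(yolo_lines))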

Thank you for your answer!

I have another question: when an object is occluded by other objects, i.e., they overlap, how can I extract just the non-occluded object?

Please post this in a new topic, clearly specifying your input specifications and the expected output (segmentation, bounding boxes, etc.). Thank you.


This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.