I’ve been trying to mesh my own point cloud data using the Jupyter notebook provided in Kaolin’s examples (dmtet_tutorial.ipynb).
The point cloud comes from a .obj file: I’m simply importing its vertices and using them as the reference points passed to the regularizer.
I can see the point cloud using kaolin-dash3d and it seems ok.
The only part of the code I’ve changed is the lines that import the vertices:

```python
vertices = kaolin.io.obj.import_mesh(obj_path).vertices.to(device)
print(vertices.shape)

# Subsample to at most 100k points.
if vertices.shape[0] > 100000:
    idx = list(range(vertices.shape[0]))
    np.random.shuffle(idx)
    idx = torch.tensor(idx[:100000], device=vertices.device, dtype=torch.long)
    vertices = vertices[idx]
points = vertices

# The reconstructed object needs to be slightly smaller than the grid
# to get a watertight surface after MT.
center = (points.max(0)[0] + points.min(0)[0]) / 2
max_l = (points.max(0)[0] - points.min(0)[0]).max()
points = ((points - center) / max_l) * 0.9

timelapse.add_pointcloud_batch(category='input',
                               pointcloud_list=[points.cpu()],
                               points_type="usd_geom_points")
```
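For what it’s worth, the normalization step can be sanity-checked in isolation (using a random cloud as a stand-in for the imported .obj vertices) to confirm it really does place every coordinate inside [-0.45, 0.45], i.e. safely within the DMTet grid:

```python
import torch

# Hypothetical point cloud standing in for the imported .obj vertices.
points = torch.rand(1000, 3) * 10 - 5

# Same normalization as in the snippet above: center the cloud and
# rescale its largest extent to 0.9.
center = (points.max(0)[0] + points.min(0)[0]) / 2
max_l = (points.max(0)[0] - points.min(0)[0]).max()
points = ((points - center) / max_l) * 0.9

# Every coordinate should now lie within [-0.45, 0.45].
assert points.abs().max().item() <= 0.45 + 1e-6
```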
Every other part of the code is exactly the same as the original.
I’m attaching an image with the reconstructed mesh (a candle) after 5000 iterations. I’m also showing the bear example using the same code after the same 5000 iterations.
Any tips on why the tessellation is full of holes?