A machine learning approach to derive CAD models from LiDAR point clouds
LiDAR point clouds provide accurate point measurements in space, even for very large rooms such as factory halls or ship interiors. Although they are far more accurate than point clouds derived by other means, such as regular cameras or structured-light systems, it remains difficult to correctly reconstruct the surfaces from which these points were sampled. The typical approach is to first construct a mesh from the point cloud, which can yield a quite uneven structure where a flat wall or a round pillar should be. Deciding whether such unevenness is real or a measurement artifact is feasible: photogrammetry, a branch of computer vision, can estimate surface curvature from color gradients in accompanying images. This still leaves the problem of cleanly terminating every surface at its edges, which may be sharp, rounded, or ragged. For sharp edges, edge detection can be implemented with a well-known computer vision algorithm, the Hough Transform. Putting these pieces together, a human can receive substantial help in creating a CAD model from a point cloud and some images.
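To make the Hough Transform step concrete, here is a minimal sketch of line detection via a Hough accumulator, written from scratch in NumPy (the function name and parameters are illustrative, not from any particular library). Each edge point votes for all (theta, rho) line parameterizations passing through it; the accumulator peak identifies the dominant straight edge:

```python
import numpy as np

def hough_peak(points, n_theta=180, n_rho=200, max_rho=150.0):
    """Return (theta, rho) of the strongest line rho = x*cos(theta) + y*sin(theta)."""
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    rhos = np.linspace(-max_rho, max_rho, n_rho)
    acc = np.zeros((n_theta, n_rho), dtype=int)
    cos_t, sin_t = np.cos(thetas), np.sin(thetas)
    for x, y in points:
        # rho for this point under every candidate theta at once
        rho = x * cos_t + y * sin_t
        idx = np.clip(np.searchsorted(rhos, rho), 0, n_rho - 1)
        acc[np.arange(n_theta), idx] += 1   # cast one vote per theta bin
    t, r = np.unravel_index(acc.argmax(), acc.shape)
    return thetas[t], rhos[r]

# Synthetic sharp edge: the vertical line x = 40
edge_points = [(40, y) for y in range(100)]
theta, rho = hough_peak(edge_points)   # peak near theta = 0, rho = 40
```

A production system would instead use a library implementation such as OpenCV's `cv2.HoughLines` on a binary edge image; the sketch above only shows the voting principle.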
Velodyne VLP-16 "Puck" LiDAR
What we want to find out is whether a machine learning model can be trained to create an appropriate CAD model from only the LiDAR point cloud, or from a point cloud combined with some images, after being trained on CAD reconstructions that humans created using all the methods mentioned above.
The first step towards this is to build a practical, physical system for creating the ground truth data used to train the machine learning algorithm. This system must support humans in creating CAD models from point clouds and images. Using it, we create ground truth data from a variety of indoor scenes with known surfaces, choose a machine learning framework, and evaluate whether it can be trained to correctly recognize flat surfaces within this controlled ground truth data.
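As a classical point of comparison for the flat-surface recognition task, plane detection in a point cloud is commonly done with RANSAC plane fitting. The sketch below (function and variable names are my own, not from the text) repeatedly fits a plane through three random points and keeps the model that explains the most points, which recovers a noisy wall amid clutter:

```python
import numpy as np

rng = np.random.default_rng(0)

def ransac_plane(points, n_iters=200, tol=0.02):
    """Fit plane n.p = d by sampling point triples; keep the model with most inliers."""
    best_model, best_inliers = None, None
    for _ in range(n_iters):
        sample = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(n)
        if norm < 1e-9:
            continue  # degenerate (near-collinear) sample, skip it
        n /= norm
        d = n @ sample[0]
        inliers = np.abs(points @ n - d) < tol   # distance-to-plane test
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_model, best_inliers = (n, d), inliers
    return best_model, best_inliers

# Synthetic scan: a noisy horizontal surface (z ~ 0) plus random clutter
wall = np.c_[rng.uniform(0, 5, (500, 2)), rng.normal(0, 0.005, 500)]
clutter = rng.uniform(0, 5, (100, 3))
cloud = np.vstack([wall, clutter])
(normal, d), inliers = ransac_plane(cloud)   # normal should point near (0, 0, +/-1)
```

A learned model would be evaluated against exactly this kind of geometric baseline on the controlled ground truth scenes.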