Classifying point clouds with CGAL

There are all sorts of interesting point cloud classification approaches out there, many of them open source and accessible. A particularly interesting set has been released recently via the Computational Geometry Algorithms Library, or CGAL. CGAL's description from its web page is as follows:

CGAL is a software project that provides easy access to efficient and reliable geometric algorithms in the form of a C++ library. CGAL is used in various areas needing geometric computation, such as geographic information systems, computer aided design, molecular biology, medical imaging, computer graphics, and robotics.

Built into their latest releases are some pretty interesting approaches to quick and accurate point cloud classification, which I have been itching to try out (HT Piero Toffanin of Masserano Labs / WebODM founder).

Image of 3 classifications of the same point cloud using CGAL

I will confess: with my generalist bent for software, I was a little worried about whether I could even get to the stage of testing it. What follows is my process, which was surprisingly straightforward.

A picture of a yak in the Himalayas
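Before we start, CGAL needs a few dependencies: a C++ toolchain, CMake, Boost, GMP, and MPFR, plus Eigen, which several of the classification features can make use of. On a Debian/Ubuntu-style system, something along these lines should cover them (the package names below are an assumption for that platform):

# Assumed Debian/Ubuntu package names for CGAL's core dependencies
sudo apt-get install build-essential cmake libboost-all-dev libgmp-dev libmpfr-dev libeigen3-dev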

First we clone the CGAL repository into its own subdirectory, then we check out the appropriate release, set up our build directory, and make and install the whole project:

Make a CGAL subdirectory to place CGAL and the build directory into.

mkdir CGAL
cd CGAL

Clone cgal

git clone https://github.com/CGAL/cgal.git
cd cgal

Checkout the release we want to use (this is currently the latest stable)

git checkout releases/CGAL-4.13
cd ..

Make the build directory

mkdir build
cd build

Now we can configure it all with cmake and then make and install

cmake ../cgal
make
sudo make install

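One note on the configure step above: classification is compute-heavy, so it is worth making sure you end up with an optimized build. A hedged variant of the cmake line, using the standard CMAKE_BUILD_TYPE variable:

# Same configure as above, but explicitly requesting an optimized (Release) build
cmake -DCMAKE_BUILD_TYPE=Release ../cgal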

So far so good. CGAL's documentation page on classification includes full code examples for using the library. We do a quick search for the code examples in our source tree and build them:

Find the classification examples

find . -name example_classification.cpp
cd ./Classification/examples/Classification/
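In my checkout a Makefile was already present in this directory. If yours isn't, you should be able to generate one by configuring the examples against the installed CGAL, as sketched below; this assumes the folder ships its own CMakeLists.txt (as CGAL example folders generally do) and that the earlier make install left CGAL where find_package can see it:

# Configure the Classification examples in place against the installed CGAL
cmake .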

With a Makefile in place, we can make the examples, which takes care of the compilation for us. I use the -j8 flag to compile on 8 cores. Your mileage may vary:

make -j8

What follows is the output we see during compilation.

[  8%] Building CXX object CMakeFiles/example_cluster_classification.dir/example_cluster_classification.cpp.o
[ 16%] Building CXX object CMakeFiles/example_mesh_classification.dir/example_mesh_classification.cpp.o
[ 25%] Building CXX object CMakeFiles/example_ethz_random_forest.dir/example_ethz_random_forest.cpp.o
[ 33%] Building CXX object CMakeFiles/example_generation_and_training.dir/example_generation_and_training.cpp.o
[ 41%] Building CXX object CMakeFiles/example_classification.dir/example_classification.cpp.o
[ 50%] Building CXX object CMakeFiles/example_feature.dir/example_feature.cpp.o
[ 58%] Linking CXX executable example_classification
[ 58%] Built target example_classification
[ 66%] Linking CXX executable example_mesh_classification
[ 66%] Built target example_mesh_classification
[ 75%] Linking CXX executable example_ethz_random_forest
[ 75%] Built target example_ethz_random_forest
[ 83%] Linking CXX executable example_feature
[ 83%] Built target example_feature
[ 91%] Linking CXX executable example_generation_and_training
[ 91%] Built target example_generation_and_training
[100%] Linking CXX executable example_cluster_classification
[100%] Built target example_cluster_classification

Now we can run an example:

./example_ethz_random_forest

This generates the following output:

Reading input
Generating features
Done in 1.36523 second(s)
Using ETHZ Random Forest Classifier
Training
Using 2447 inliers
Done in 0.292417 second(s)
Classification with graphcut done in 1.02534 second(s)
Precision, recall, F1 scores and IoU:
 * ground: 1 ; 1 ; 1 ; 1
 * vegetation: 1 ; 1 ; 1 ; 1
 * roof: 1 ; 1 ; 1 ; 1
Accuracy = 1
Mean F1 score = 1
Mean IoU = 1
All done
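The other example binaries from the build output above can be run the same way from this directory, and, if I'm reading the CGAL examples' conventions right, most of them will also accept your own point cloud as a first argument instead of the bundled sample data (the filename below is just a placeholder):

# Another of the targets built above
./example_classification

# Assumption: like most CGAL example programs, this accepts an input file as
# its first argument and otherwise falls back to the bundled sample data
./example_ethz_random_forest my_points.ply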

What did we just do? We took raw data as below:

Image of the raw, unclassified point cloud

And training data separating out building (pink), ground (tan), and vegetation (green):

Image of the training data, colored by class

We then ran these through CGAL's ETHZ Random Forest classifier with graphcut regularization to get the following classified point cloud:

Image of the resulting classified point cloud

Pretty cool! And the results look really good. Next I’ll try it on some drone data, and see how we do.
