Smathermather's Weblog

Remote Sensing, GIS, Ecology, and Oddball Techniques

Archive for the ‘Other’ Category

Taking Slices from LiDAR data: Part IX

Posted by smathermather on February 20, 2017

Part 9 of N…; see, e.g., my previous post on the topic.

We’ve been working to reduce the effect of overlapping samples on statistics we run on LiDAR data, and to do so, we’ve been using PDAL’s filters.sample approach. One catch: this handles the horizontal sampling problem well, but we might want to intentionally retain samples from high locations — after all, I want to see the trees for the forest and vice versa. So, it might behoove us to sample within each of our desired height classes to retain as much vertical information as possible.
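As a hedged sketch of that idea (not the exact pipeline used in this series), one could chain filters.range ahead of filters.sample so the Poisson sampling runs within a single height class at a time. The filenames, the Z limits standing in for a height class (the real classes in this series may be defined differently, e.g. as height above ground), and the 0.5 sampling radius (in the units of the data's CRS) are all placeholder assumptions:

{
  "pipeline": [
    "input.las",
    {
      "type": "filters.range",
      "limits": "Z[295:300]"
    },
    {
      "type": "filters.sample",
      "radius": 0.5
    },
    "slice_295_300_sampled.las"
  ]
}

Running one such pipeline per class and then merging the outputs would keep the horizontal thinning while preserving points in the sparser, higher classes.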

Posted in 3D, Database, Docker, LiDAR, Other, PDAL, pointcloud, PostGIS, PostgreSQL | Leave a Comment »

Taking Slices from LiDAR data: Part VIII

Posted by smathermather on February 18, 2017

Part 8 of N…; see, e.g., my previous post on the topic.

I didn’t think my explanation of sampling problems with LiDAR data in my previous post was adequate. Here are a couple more figures for clarification.

This dataset, which covers trees, water, fences, and buildings, is heavily sampled in some areas and sparsely sampled in others. We can use PDAL’s filters.sample (Poisson dart throwing) to create an evenly sampled version of it.

Figure showing overlap of LiDAR scanlines

Figure showing data resampled for evenness

An extra special thanks to the PDAL team for not only building such cool software, but being so responsive to questions!

Posted in 3D, Database, Docker, LiDAR, Other, PDAL, pointcloud, PostGIS, PostgreSQL | Leave a Comment »

Taking Slices from LiDAR data: Part VII

Posted by smathermather on February 15, 2017

Part 7 of N…; see, e.g., my previous post on the topic.

More work on taking LiDAR slices. This time, the blog post is all about data preparation. LiDAR data, in its raw form, often has scan-line effects when we look at the density of points.

lidar_flightlines

This can affect statistics we are running, as our sampling effort is not even. To ameliorate this effect a bit, we can decimate our point cloud before doing further work with it. In PDAL, we have three choices for decimation: filters.decimation, which samples every Nth point from the point cloud; filters.voxelgrid, which does volumetric-pixel (voxel) based resampling; and filters.sample, or “Poisson sampling via ‘Dart Throwing’”.

filters.decimation won’t help us with the above problem. Voxelgrid sampling could help, but it’s very regular, so I reject this on beauty grounds alone. This leaves filters.sample.

The nice thing about both the voxelgrid and the Poisson sampling is that they retain much of the shape of the point cloud while downsampling the data:

subsample-ex1

subsample-ex2

We will execute the Poisson sampling in PDAL. As many things in PDAL are best done with a (JSON) pipeline file, we construct a pipeline file describing the filtering we want to do, and then call that from the command line.
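A minimal pipeline for this might look like the following sketch; the filenames and the 0.5 sampling radius (in the units of the data's CRS) are placeholder assumptions rather than the values used here:

{
  "pipeline": [
    "input.las",
    {
      "type": "filters.sample",
      "radius": 0.5
    },
    "output_sampled.las"
  ]
}

Saved as, say, sample.json, this can then be run with pdal pipeline sample.json, for example from the Docker images used elsewhere on this blog.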

We can slice our data up similar to previous posts, and then look at the point density per slice. R code for doing this is forthcoming (thanks to Chris Tracey at the Western Pennsylvania Conservancy and the lidR project), but below is a graphic as a teaser. For the record, we will probably pursue a fully PDAL solution in the end, but there are really interesting results in the interim:

image001

More to come. Stay tuned.

Posted in 3D, Database, Docker, LiDAR, Other, PDAL, pointcloud, PostGIS, PostgreSQL | 2 Comments »

On Generalization Blending for Shaded Relief

Posted by smathermather on January 29, 2017

somethingaboutmaps

I have nearly recovered sufficiently from an amazing NACIS conference, and I think I’m ready to get back to a little blogging. This time around, I’d like to present you all with an unfinished concept, and to ask you for your help in carrying it to completion. Specifically, I’d like to show you some attempts I’ve made at improving digital hillshades (I’ll be randomly switching terminology between ‘hillshade’ and ‘shaded relief’ throughout).

Automated, straight-out-of-your-GIS hillshades are usually terrible, and it generally takes some extra cleanup work to get them to the point where they aren’t embarrassing or won’t burst into flames simply by being put next to a well-executed manual shaded relief. Here’s an example I stole from shadedreliefarchive.com which illustrates the problem:

The computer doesn’t see the big picture — that every little bump in elevation can sum to a large mountain, or that some bumps are more critical…

View original post 1,411 more words

Posted in Other | Leave a Comment »

FOSS4G 2018: Dar es Salaam

Posted by smathermather on January 10, 2017

This will be a different FOSS4G. As the first FOSS4G in a developing country, our vision is for an accessible and inclusive FOSS4G, a FOSS4G for All.

Source: FOSS4G 2018: Dar es Salaam

Posted in Other | Leave a Comment »

Filtering buildings and vegetation from drone-derived surface models

Posted by smathermather on October 21, 2016

I’ve been playing for the last few days with PDAL and its ability to filter point clouds. More code and details later, but here’s a teaser of my results to date:

Image of unfiltered surface model.

Image of surface model derived from points filtered using approximate progressive morphological filter.

Image of surface model derived from points filtered using progressive morphological filter.

Image of surface model derived from points filtered using progressive morphological filter plus building footprint and outlier removal.

Posted in Other | Leave a Comment »

Finding peace, finding ground: Drone flights for hydrologic modeling

Posted by smathermather on October 15, 2016

Š-L-M

Lately I’ve been helping with a project to predict flooding in Dar es Salaam (“Home of Peace”), the former capital of Tanzania in East Africa. This is a really fun project for me, in part because most of the hard work is already done, but there still remain some tricky problems to solve.

This project is a part of Dar Ramani Huria, a “community-based mapping project… training university students and local community members to create highly accurate maps of the most flood-prone areas of the city using OpenStreetMap.”

Dar es Salaam (or Dar for short) is the fastest-growing city in Africa, making the identification of flood-prone zones during rainy seasons absolutely critical. The Ramani Huria crew has mapped the streams, ditches, and other waterways of Dar, as well as flown drone imagery over critical parts of the city.

screen-shot-2016-10-15-at-3-01-04-pm

Problem Space:

The results are stunning, but using drone imagery and photogrammetric point clouds (instead of LiDAR) has its limitations.

One problem, that I’m not going to get into in any depth today, is the difficulty of conflating (vertically and horizontally) different datasets flown at different times with commodity GPS.

Another problem is the difficulty of turning photogrammetrically derived point clouds into Digital Terrain Models. There is proprietary software that does this well (e.g. LAStools and others), but we seek a free and open source alternative. Let’s visualize the problem. See below a photogrammetrically derived point cloud (processed in Pix4D, visualized at https://plas.io):

screen-shot-2016-10-15-at-3-15-44-pm

We can directly turn this into a digital surface model (DSM):

screen-shot-2016-10-15-at-3-24-23-pm


We can see the underlying topography, but buildings and trees are much of the signal in the surface model. If we, for example, calculate height above the nearest stream as a proxy for flooding, we get a noisy result.

OSM to the Rescue

If you recall, at the beginning of this post I said this is “a really fun project for me” because “most of the hard work is already done”. Dar is a city with pretty much every building digitized. We can take advantage of this work to remove buildings from our point cloud. We’ll use the utilities associated with the Point Data Abstraction Layer (PDAL) library. Specifically, we’ll use PDAL’s ability to clip point cloud data with 2D OGR-compatible geometries (there’s a great tutorial on this here).

The first thing we need to do is extract the buildings from OSM into shapefiles. For this, an easy way is to use Mapzen’s Metro Extracts:

screen-shot-2016-10-15-at-3-58-35-pm

I chose Datasets grouped into individual layers by OpenStreetMap tags (IMPOSM) as this gives me the capacity to just choose “Buildings”. I loaded those buildings into a PostGIS database for the next steps. What I seek is a shapefile of all the areas which are not buildings, essentially from this:

building

To this:

no_building

For this we’ll need to use ST_Difference + ST_ConvexHull. ST_ConvexHull will create a shape of the extent of our data, and we’ll use ST_Difference to cookie-cutter out all the building footprints:

DROP TABLE IF EXISTS "salaam-buff-diff-split1";
CREATE TABLE "salaam-buff-diff-split1" AS
WITH subset AS (
	SELECT ST_Transform(geom, 32737) AS geom FROM "salaam-buildings" LIMIT 20000
	),
extent AS (
	SELECT ST_ConvexHull(ST_Union(geom)) AS geom FROM subset
	),
unioned AS (
	SELECT ST_Union(geom) AS geom FROM subset
	),
differenced AS (
	SELECT ST_Difference(a.geom, b.geom) AS geom FROM
	extent a, unioned b
	)
SELECT 1 AS id, geom FROM differenced;

This gives us a reasonable result. But when we use this in PDAL, (I assume) we want a subdivided shape to ensure we don’t have to access the entire point cloud at any given time to do our clipping. We’ll add ST_Subdivide to this process, which will subdivide our shape into parts that don’t exceed a certain number of nodes. In this case we’ll choose 500 nodes per shape:

DROP TABLE IF EXISTS "salaam-buff-diff-split1";
CREATE TABLE "salaam-buff-diff-split1" AS
WITH subset AS (
	SELECT ST_Transform(geom, 32737) AS geom FROM "salaam-buildings" LIMIT 20000
	),
extent AS (
	SELECT ST_ConvexHull(ST_Union(geom)) AS geom FROM subset
	),
unioned AS (
	SELECT ST_Union(geom) AS geom FROM subset
	),
differenced AS (
	SELECT ST_Difference(a.geom, b.geom) AS geom FROM
	extent a, unioned b
	)
SELECT 1 AS id, ST_Subdivide(geom, 500) AS geom FROM
differenced;

subdivided

Finally, if we want to be sure to remove the points from the edges of buildings (we can assume the digitized buildings won’t perfectly match our point clouds), then we should buffer our shapes:

DROP TABLE IF EXISTS "salaam-buff-diff-split1";
CREATE TABLE "salaam-buff-diff-split1" AS
WITH subset AS (
	SELECT ST_Transform(geom, 32737) AS geom FROM "salaam-buildings" LIMIT 20000
	),
buffered AS (
	SELECT ST_Buffer(geom, 2, 'join=mitre mitre_limit=5.0') AS geom FROM subset
	),
extent AS (
	SELECT ST_ConvexHull(ST_Union(geom)) AS geom FROM subset
	),
unioned AS (
	SELECT ST_Union(geom) AS geom FROM buffered
	),
differenced AS (
	SELECT ST_Difference(a.geom, b.geom) AS geom FROM
	extent a, unioned b
	)
SELECT 1 AS id, ST_Subdivide(geom, 500) AS geom FROM
differenced;

If we recall our point cloud from before:

screen-shot-2016-10-15-at-3-15-44-pm

screen-shot-2016-10-15-at-4-38-53-pm

Now we have filtered out most buildings:

screen-shot-2016-10-15-at-4-36-15-pm

screen-shot-2016-10-15-at-4-36-31-pm

In order to do this filtering in PDAL, we do two things. First, we create a JSON file that defines the filter:

{
  "pipeline":[
    "/data/2015-05-20_tandale_merged_densified_point_cloud_part_1.las",
    {
      "type":"filters.attribute",
      "dimension":"Classification",
      "datasource":"/data/salaam-buff-diff-split.shp",
      "layer":"salaam-buff-diff-split",
      "column":"id"
    },
    {
      "type":"filters.range",
      "limits":"Classification[1:1]"
    },
    "/data/2015-05-20_tandale_merged_densified_point_cloud_part_1_nobuild_buff.las"
  ]
}

Then we use this JSON definition to apply the filter:

nohup sudo docker run -v /home/gisuser/docker/:/data pdal/pdal:1.3 pdal pipeline /data/shape-clip.json

Problems continue

This looks pretty good, but as we interrogate the data, we can see the artifacts of trees and some buildings still linger in the dataset.

screen-shot-2016-10-15-at-4-37-38-pm

screen-shot-2016-10-15-at-4-38-03-pm

 

Enter the Progressive Morphological Filter

It is possible to use the shape of the surface model to filter out buildings and trees. To do so, we start with the assumption that building and vegetation shapes are distinct from the shape of the underlying ground. We have already used the hard work of the OpenStreetMap community to filter out most of the buildings, but we still have some buildings and plenty of trees. PDAL has another great tutorial for applying this filter, which we’ll leverage.

Again we need a JSON file to define our filter:

{
  "pipeline": {
    "name": "Progressive Morphological Filter with Outlier Removal",
    "version": 1.0,
    "filters": [{
        "name": "StatisticalOutlierRemoval",
        "setMeanK": 8,
        "setStddevMulThresh": 3.0
      }, {
        "name": "ProgressiveMorphologicalFilter",
        "setCellSize": 1.5
    }]
  }
}

And then we use that filter to remove the statistical outliers and the non-ground morphology we don’t want:

nohup sudo docker run -v /home/gisuser/docker/:/data pdal/pdal:1.3 pdal pcl -i /data/2015-05-20_tandale_merged_densified_point_cloud_part_1_nobuild_buff.las -o /data/nobuild_filtered2.las -p /data/sor-pmf.json

This command will remove the colorization from the points, so we’ll see coloring according to height only:

screen-shot-2016-10-15-at-4-53-55-pm

screen-shot-2016-10-15-at-4-54-49-pm

 

What remains is to interpolate this into a digital terrain model. That is a project for another day.
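For what it’s worth, one possible route (not necessarily the one this project will take, and assuming a more recent PDAL than the 1.3 image used above, with writers.gdal available) would be to grid the filtered ground points directly; the output filename and the 0.5 resolution are placeholder assumptions:

{
  "pipeline": [
    "/data/nobuild_filtered2.las",
    {
      "type": "writers.gdal",
      "filename": "/data/dtm.tif",
      "resolution": 0.5,
      "output_type": "idw"
    }
  ]
}

Inverse-distance weighting ("idw") fills between ground returns; a mean or min surface would also be reasonable depending on how clean the filtering is.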

Posted in 3D, Drone, Other, PDAL, Photogrammetry, UAS | 1 Comment »

UAS Mapping: Positional Accuracy Assessment via GeoKota

Posted by smathermather on August 10, 2016

Introduction Recently we partnered up with folks from the University of Akron to help determine how accurate UAS are compared to traditional mapping methods. Given the current difficulty to fly commercially in the National Airspace, this partnership gave us a unique opportunity to fly inside their Field House. This controlled space had a lot of […]

via UAS Mapping: Positional Accuracy Assessment — GeoKota

Posted in Other | Leave a Comment »

The Worlds of #Mapzen Sphere Maps

Posted by smathermather on July 21, 2016

Guest blog post today from Brandon Garman (@brandon_garman):

After seeing Mapzen’s blog post on Sphere Maps, I wanted to try my hand at it. I downloaded the GitHub repository, set up a Python simple server, and ‘BOOM’, had my own instance running (Mapzen makes this process very easy). So next I pulled out my Wacom tablet, opened up Photoshop, and began painting in circles with different colors and brushes. After I had a few results I was happy with, I loaded each one to see how I did. Not so well, it turns out! I need to spend a bit more time practicing and fine-tuning before I can get any results half as nice as Mapzen’s examples.

Instead of hunkering down and refining my painted circles, I took a look at some of the non-hand-drawn examples that came in the repo (glitch, yinyang, checkerboard). Since all that needs to be uploaded is an image file, I decided to find some images of my own. PLANETS!! What better image to use to style terrain on Earth than other planets! So, I picked a few images of our planetary neighbors, and I have to say I’m very happy with some of the results.

MARS
This is probably my favorite result (the moon is a close second though, see below).

mars_pic

mars.PNG

NEPTUNE
As a cartographer, I find an all-blue hillshade jolting, but oddly amusing. Still an excellent result though.

Neptune_pic

neptune.PNG

MOON
There are some great colors and contrast here; however, the result was a little blah.
moon_pic.jpg

moon

Until I realized the majority of the dark colors were in the top left, which meant the lighting was backwards from what is typical. So I rotated the sphere image 180 degrees and the result was much better, close to that of Mars.
moon_rot.jpg

moon_rotate.PNG

EUROPA – one of Jupiter’s moons
Not too bad, but it has the same inverted look as the moon did.
Europa_pic

europa.PNG

Here it is rotated 180 degrees.
Europa_rot.jpg

europa_rotated.PNG

EARTH
Obviously I was going to use Earth. This color scheme here is really great, but the level of detail in the data at this scale makes it look pixellated and hard to read.
earth_pic.jpg

earth.PNG

I tried blurring the image a bit to smooth out some of the choppiness of the colors and it helped.
earth_blur.jpg

earth_blur.PNG

JUPITER
I love this color scheme as well, but it has the same problems as the Earth image did.
jupiter_pic

jupiter.PNG
I blurred this image as well. It helped a bit. Still a great color scheme though.
jupiter_blur

jupiter_blur.PNG

The best part about sphere maps is that the only limitation is your creativity. Creating great-looking maps is very easy. Also, creating the most useless but entertaining maps is just as easy, and is sometimes more fun…

mona_lisa_round

mona_lisa.PNG

starry_night_round.jpg

starry_night.PNG

 

Deathstar_blur

deathstar_blur.PNG

Posted in Other | Leave a Comment »

Viewing Sparse Point Clouds from OpenDroneMap — GeoKota

Posted by smathermather on June 30, 2016

This is a post about OpenDroneMap, an opensource project I am a maintainer for. ODM is a toolchain for post-processing drone imagery to create 3D and mapping products. It’s currently in beta and under pretty heavy development. If you’re interested in contributing to the project head over here. The Problem So for most of the […]

via Viewing Sparse Point Clouds from OpenDroneMap — GeoKota

Posted in 3D, Image Processing, OpenDroneMap, Optics, Other, Photogrammetry | Leave a Comment »