Smathermather's Weblog

Remote Sensing, GIS, Ecology, and Oddball Techniques

Posts Tagged ‘Analysis’

Landscape Position using GDAL — PT 3

Posted by smathermather on November 25, 2014

More landscape position pictures — just showing riparianess. See also

https://smathermather.wordpress.com/2014/11/22/landscape-position-using-gdal/

and

https://smathermather.wordpress.com/2014/11/24/landscape-position-using-gdal-pt-2/

[Four figures: the “valleyz” map series, showing estimated riparian areas]

Map tiles by Stamen Design, under CC BY 3.0. Data by OpenStreetMap, under ODbL


Landscape Position using GDAL — PT 2

Posted by smathermather on November 24, 2014

Just pretty pics today of estimated riparianess. If you prefer a bit of code, see the previous post: https://smathermather.wordpress.com/2014/11/22/landscape-position-using-gdal/

[Seven figures: the “valleys_improved” map series, showing estimated riparian areas]


Landscape Position using GDAL

Posted by smathermather on November 22, 2014

Hat tip again to Seth Fitzsimmons. I’ve been looking for a good, easy-to-use smoothing algorithm for rasters. Preferably something so easy I don’t even need to write a little Python, and so efficient I can run it on 30GB+ datasets and have it complete before I get distracted again by the next shiny project (a few hours).

Seth’s solution? Downsample to a low resolution using GDAL, then sample back up to a higher resolution in order to smooth the raster. My innovation to his approach? Use Lanczos resampling to keep location static, and get a great smooth model:

Unsmoothed DEM

Smoothed DEM

Code to do this in GDAL follows. "-tr" sets our resampling resolution, "-r lanczos" sets our resampling algorithm, and the "-co" flags are not strictly necessary, but I’ve got a 30GB dataset, so it helps to chop up the inside of the TIFF into little squares to optimize subsequent processing.

gdalwarp -tr 50 50 -srcnodata "0 -32767" -r lanczos -co "TILED=YES" -co "BLOCKXSIZE=512" -co "BLOCKYSIZE=512" oh_leap_dem.tif oh_leap_dem_50.tif
gdalwarp -tr 10 10 -srcnodata "0 -32767" -r lanczos -co "TILED=YES" -co "BLOCKXSIZE=512" -co "BLOCKYSIZE=512" oh_leap_dem_50.tif oh_leap_dem_10-50.tif
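
If you’d rather drive the same thing from Python, here’s a minimal sketch using GDAL’s Python bindings (gdal.Warp, available in GDAL 2.1+, so newer than this post; same filenames and options as the commands above):

from osgeo import gdal

gdal.UseExceptions()

# Downsample to a coarse 50 m grid with Lanczos resampling
gdal.Warp("oh_leap_dem_50.tif", "oh_leap_dem.tif",
          xRes=50, yRes=50, resampleAlg="lanczos", srcNodata="0 -32767",
          creationOptions=["TILED=YES", "BLOCKXSIZE=512", "BLOCKYSIZE=512"])

# Resample back up to 10 m to get the smoothed surface
gdal.Warp("oh_leap_dem_10-50.tif", "oh_leap_dem_50.tif",
          xRes=10, yRes=10, resampleAlg="lanczos", srcNodata="0 -32767",
          creationOptions=["TILED=YES", "BLOCKXSIZE=512", "BLOCKYSIZE=512"])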

At first this excited me for cartographic reasons. We can use this to simplify contours, and then use simplified contours at different zoom levels for maps:
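
Here’s a minimal sketch of that idea with GDAL’s Python bindings (gdal.ContourGenerate); the filenames, the 10-unit contour interval, and the field layout are my assumptions:

from osgeo import gdal, ogr, osr

dem = gdal.Open("oh_leap_dem_10-50.tif")
band = dem.GetRasterBand(1)

# Shapefile to hold the contours generated from the smoothed DEM
drv = ogr.GetDriverByName("ESRI Shapefile")
ds = drv.CreateDataSource("contours_smoothed.shp")
srs = osr.SpatialReference(wkt=dem.GetProjection())
layer = ds.CreateLayer("contours", srs=srs, geom_type=ogr.wkbLineString)
layer.CreateField(ogr.FieldDefn("ID", ogr.OFTInteger))
layer.CreateField(ogr.FieldDefn("elev", ogr.OFTReal))

# Contours every 10 units, honoring the band's nodata value if set
nodata = band.GetNoDataValue()
gdal.ContourGenerate(band, 10.0, 0.0, [], nodata is not None,
                     nodata or 0, layer, 0, 1)
ds = None  # flush to disk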

But, we can also use this for analyses. For example, if we difference these smoothed images with our original digital elevation model, we get a measurement of local elevation difference, the first step in establishing where valleys, ridges, and other land forms are.

# Resample to lower resolution
gdalwarp -tr 328.0523587211646 328.0523587211646 -srcnodata "0 -32767" -r lanczos -co "TILED=YES" -co "BLOCKXSIZE=512" -co "BLOCKYSIZE=512" oh_leap_dem.tif oh_leap_dem_328.tif
# Upsample again to get nicely smoothed data
gdalwarp -tr 3.048293887897243 3.048293887897243 -srcnodata "0 -32767" -r lanczos -co "TILED=YES" -co "BLOCKXSIZE=512" -co "BLOCKYSIZE=512" oh_leap_dem_328.tif oh_leap_dem_3-328.tif
# Merge the two datasets into a single image as separate bands to ensure they are the same dimensions
# (gdal_calc, as a wrapper for numpy, requires this)
gdal_merge.py -separate -o oh_leap_dem_3-328_m.tif oh_leap_dem.tif oh_leap_dem_3-328.tif
# And now we'll use gdal_calc to difference our elevation model with the smoothed one to get relative elevation
gdal_calc.py -A oh_leap_dem_3-328_m.tif -B oh_leap_dem_3-328_m.tif --A_band=1 --B_band=2 --outfile=oh_leap_dem_lp_328.tif --calc="A-B"

So, if we want a good proxy for riparian zones, we can use a technique like this instead of buffering our streams and rivers a fixed distance (in this case, I’ve used 4 different distances):

Map of landscape position estimated valleys in Cuyahoga County, Ohio

Pretty snazzy riparian finder. It seems to also find upland headwater wetlands (most of them historic and drained, in Cuyahoga County). I am now running it on 4 million acres of Ohio at a 10ft (~3 meter) resolution. It’s that efficient.

Addendum: It also finds escarpment edges, like the Portage Escarpment in the above, so it is a mix of a few landforms. Darn handy nonetheless.


FOSS4G Korea 2014, poor GPS photos, and Mapillary (part 2 of n)

Posted by smathermather on September 2, 2014

A classic and age-old problem in GPS is collecting potentially wonderful data in the field, getting back to the office, and realizing a lot of manual scrubbing, data massaging, and other such careful work will need to be done to make the GPS data useful and meaningful. And this assumes we can meaningfully correct it at all.

This is true too (maybe especially) for GPS-enabled cameras in canyons and urban canyons. This is a problem we started to explore in https://smathermather.wordpress.com/2014/08/31/foss4g-korea-2014-poor-gps-photos-and-mapillary/

Let’s return to the problem briefly. Were the GPS readings consistent and accurate, we would see a relatively straight line of points where the photos were taken along the length of the sidewalk on Teheran-Ro in the Gangnam District of Seoul:

Figure of raw data points showing anything other than a straight line

In addition to not looking straight (though the track is supposed to follow a road), we previously demonstrated that a lot of points are duplicated where, presumably, the camera was using cached GPS data rather than the latest fix available at the time of the photo. We can see this density of overlapping points even more clearly using the heatmap tool in QGIS:

Heatmap showing clumping of data points

The clump of red shows a clear issue of overlapping points. As these points are the GPS positions of photographs, we can match features between photographs (using structure from motion) to map out the relative locations of the photos. The points in the figure below show features matched in 3 or more photos; the blue shapes represent camera positions within the scene.

Image of sparse point cloud and relative camera positions

If we look at just the camera locations on a map, we see something like the following:

Figure of camera center points in relative space

For the astute student, however, it should not be surprising that the coordinates of these points are not recognizable as any known coordinate system. For example, let’s view the X, Y, and Z of the first four points:

id	X	Y	Z
1	-0.357585	-0.390081	-3.48026
2	-0.326079	-0.367529	-3.24815
3	-0.295885	-0.348935	-2.98469
4	-0.272306	-0.334949	-2.79409

This means we require some equation to convert between our unreferenced (but consistent) data and a known coordinate system. To build this equation, we just need to know four things about our data with some certainty: the start point and end point X and Y positions. We will ignore Z for this exercise.

Point 1:
X-Local: -0.357585
X-KUCS: 958632.326047712
Y-Local: 1.29161
Y-KUCS: 958744.221397964

If we remember our trigonometry (or google our trigonometry…), then we’ll be able to solve for our X and Y values independently. For example, for X:

X1 = 67.8485 * X + 958657

With that and our Y equation:

Y1 = 27.2400 * Y + 19444469
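
As a worked sketch of that two-point fit in Python (the helper function is mine; only Point 1’s control values appear above, so the fit call is left as a commented placeholder, and the fitted coefficients are taken straight from the equations above):

def fit_axis(local_a, world_a, local_b, world_b):
    """Solve world = scale * local + offset from two control points."""
    scale = (world_b - world_a) / (local_b - local_a)
    return scale, world_a - scale * local_a

# e.g. x_scale, x_offset = fit_axis(x_local_start, x_kucs_start,
#                                   x_local_end, x_kucs_end)

# The coefficients the fit produced, from the equations above:
x_scale, x_offset = 67.8485, 958657.0
y_scale, y_offset = 27.2400, 19444469.0

# Apply to the local camera X coordinates from the table above
xs_local = [-0.357585, -0.326079, -0.295885, -0.272306]
eastings = [x_scale * x + x_offset for x in xs_local]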

Now we can transform our local coordinates into the Korean 2000 Unified Coordinate system, and get a much nicer result:

Figure showing corrected camera position points

If we perform a heat map on this output, we’ll see that we have spread out our duplicate geometries to their correct, non-overlapping spacing:

Figure showing corrected camera position heat map

Whew! Now to write some code which does this for us… .

Oh, wait! We forgot the final test. How do they look together (pre- and post-transformed — post-transformed as stars, of course):

Pre and post transformed points compared in single figure

But, as we know, Google (or in the case of Korea, Naver) is the all-knowing authority on where things are. How does this bear out against satellite imagery?

Pre and post transformed points compared in single figure with aerial for comparison

Whoa! That works for me. Notice we can even see where I walked a bit to the left side at intersections to move around people and trees.


KNN with FLANN and laspy, a starting place

Posted by smathermather on August 8, 2014

FLANN is the Fast Library for Approximate Nearest Neighbors, a purportedly wicked-fast nearest neighbor library for comparing multi-dimensional points. I only say purportedly because I haven’t verified it, but I assume it to be quite true. I’d like to move some (all) of my KNN calculations outside the database.

I’d like to do the following with FLANN: take a LiDAR point cloud and change it into a LiDAR height-above-ground point cloud. What follows are my explorations so far.

In a previous series of posts (e.g. https://smathermather.wordpress.com/2014/07/14/lidar-and-pointcloud-extension-pt-6/), I have been using the point cloud extension in PostGIS. I like the 2-D chipping, but I think I should segregate my data into height classes before sending it into the database. In this way, I can query my data by height class and by location, taking full advantage of the efficiency of storing all those little points in chips, while also being able to query the data in any of the dimensions I need in the future. Enter FLANN.

I haven’t gotten far. To use FLANN with LiDAR through Python, I’m also using laspy.  There’s a great tutorial here: http://laspy.readthedocs.org/en/latest/tut_part_1.html

I make one change to the tutorial section using FLANN. The code as written is:

import laspy
import pyflann as pf
import numpy as np

# Open a file in read mode:
inFile = laspy.file.File("./laspytest/data/simple.las")
# Grab a numpy dataset of our clustering dimensions:
dataset = np.vstack([inFile.X, inFile.Y, inFile.Z]).transpose()

# Find the nearest 5 neighbors of point 100.

neighbors = flann.nn(dataset, dataset[100,], num_neighbors = 5)
print("Five nearest neighbors of point 100: ")
print(neighbors[0])
print("Distances: ")
print(neighbors[1])

To make this example work with the current version of pyflann, we need to make sure we import all of pyflann (or at least the FLANN class), and also set flann = FLANN(), as follows:

import laspy
import numpy as np
from pyflann import *

# Open a file in read mode:
inFile = laspy.file.File("simple.las")
# Grab a numpy dataset of our clustering dimensions:
dataset = np.vstack([inFile.X, inFile.Y, inFile.Z]).transpose()

# Find the nearest 5 neighbors of point 100.
flann = FLANN()
neighbors = flann.nn(dataset, dataset[100,], num_neighbors = 5)
print("Five nearest neighbors of point 100: ")
print(neighbors[0])
print("Distances: ")
print(neighbors[1])

Finally, a small note on installation of pyflann on Ubuntu. What I’m about to document is undoubtedly not the recommended way to get pyflann working. But it worked… .

Installation for FLANN on Ubuntu can be found here: http://www.pointclouds.org/downloads/linux.html

But this does not seem to install pyflann. That said, it installs all our dependencies + FLANN, so…

I cloned, compiled, and installed the FLANN repo: https://github.com/mariusmuja/flann

git clone git://github.com/mariusmuja/flann.git
cd flann
mkdir BUILD
cd BUILD
cmake ../.
make
sudo make install

This gets pyflann where it needs to go, and voilà! We can now do nearest neighbor searches within Python.

Next step: turn my LiDAR xyz point cloud into an xy-height point cloud, then dump it, height class by height class, into PostgreSQL. Wish me luck!
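
A minimal sketch of what that next step might look like, assuming ground returns are classified as class 2 (the usual LAS convention) and using the single nearest ground point in XY as a crude local ground estimate (a starting place, not a method):

import laspy
import numpy as np
from pyflann import FLANN

inFile = laspy.file.File("simple.las")
coords = np.vstack([inFile.x, inFile.y, inFile.z]).transpose()

# Assumption: ground returns carry LAS classification code 2
ground = coords[inFile.raw_classification == 2]

# For every point, find the nearest ground point in XY only
flann = FLANN()
idx, dist = flann.nn(np.ascontiguousarray(ground[:, :2]),
                     np.ascontiguousarray(coords[:, :2]),
                     num_neighbors=1)

# Height above (approximate) ground = Z minus nearest ground Z
height = coords[:, 2] - ground[idx, 2]
xyheight = np.column_stack([coords[:, 0], coords[:, 1], height])

From here, each point could be binned with something like the height_classify() logic from the pointcloud posts below before loading into PostgreSQL.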


LiDAR and pointcloud extension pt 6

Posted by smathermather on July 14, 2014

There are all sorts of reasons why what I’ve been doing so far is wrong (see the previous 5 posts). More on that later. But in case you needed a 3D visualization of the chipping I’ve been working through, look no further:

Isometric 3D visualization of 3D chipping


LiDAR and pointcloud extension pt 5

Posted by smathermather on July 11, 2014

Now for the crazy stuff:

Plaid-like overlay of vertical patches

The objective is to allow us to do vertical and horizontal summaries of our data. To do this, we’ll take chipped LiDAR input and further chip it vertically by classifying it.

First, a classifier for height that we’ll use to do vertical splits on our point cloud chips:

CREATE OR REPLACE FUNCTION height_classify(numeric)
  RETURNS integer AS
$BODY$
 
-- Bin the input height into 10-unit classes (1-18); out-of-range values get 0
SELECT
	CASE WHEN $1 < 10 THEN 1
	     WHEN $1 >= 10 AND $1 < 20 THEN 2
	     WHEN $1 >= 20 AND $1 < 30 THEN 3
	     WHEN $1 >= 30 AND $1 < 40 THEN 4
	     WHEN $1 >= 40 AND $1 < 50 THEN 5
	     WHEN $1 >= 50 AND $1 < 60 THEN 6
	     WHEN $1 >= 60 AND $1 < 70 THEN 7
	     WHEN $1 >= 70 AND $1 < 80 THEN 8
	     WHEN $1 >= 80 AND $1 < 90 THEN 9
	     WHEN $1 >= 90 AND $1 < 100 THEN 10
	     WHEN $1 >= 100 AND $1 < 110 THEN 11
	     WHEN $1 >= 110 AND $1 < 120 THEN 12
	     WHEN $1 >= 120 AND $1 < 130 THEN 13
	     WHEN $1 >= 130 AND $1 < 140 THEN 14
	     WHEN $1 >= 140 AND $1 < 150 THEN 15
	     WHEN $1 >= 150 AND $1 < 160 THEN 16
	     WHEN $1 >= 160 AND $1 < 170 THEN 17
	     WHEN $1 >= 170 AND $1 < 180 THEN 18
	     ELSE 0
	END;
 
$BODY$
  LANGUAGE sql IMMUTABLE
  COST 100;

And now, let’s pull apart our point cloud, calculate heights from approximate ground, then put it all back together in chips grouped by height classes and aggregates of our original chips. Yes, I will explain more later. And don’t do this at home; I’m not even sure if it makes sense yet:

/*
First we explode the patches into points,
 and along the way grab the minimum value
for the patch, and the id
*/
WITH lraw AS (
    SELECT PC_Explode(pa) AS ex,  PC_PatchMin(pa, 'Z') AS min, id
    FROM test1
    ),
/*
Calculate our height value ( Z - min )
*/
heights AS (
    SELECT PC_Get(ex, 'X') || ',' || PC_Get(ex, 'Y') || ',' -- grab X and Y
    || PC_Get(ex, 'Z') - min                -- calculate height
    || ',' ||
-- And return our remaining dimensions
    PC_Get(ex, 'Intensity') || ',' || PC_Get(ex, 'ReturnNumber') || ',' || PC_Get(ex, 'NumberOfReturns') || ',' ||
    PC_Get(ex, 'ScanDirectionFlag') || ',' || PC_Get(ex, 'EdgeOfFlightLine') || ',' ||
    PC_Get(ex, 'Classification') || ',' || PC_Get(ex, 'ScanAngleRank') || ',' || PC_Get(ex, 'UserData') || ',' ||
    PC_Get(ex, 'PointSourceId') || ',' || PC_Get(ex, 'Time') || ',' || PC_Get(ex, 'PointID') || ',' || PC_Get(ex, 'BlockID')
    AS xyheight, PC_Get(ex, 'Z') - min AS height, id FROM lraw
    ),
-- Now we can turn this aggregated text into an array and then a set of points
heightcloud AS (
    SELECT PC_MakePoint(1, string_to_array(xyheight, ',')::float8[]) pt, height_classify(height) heightclass, id FROM heights
    )
-- Finally, we bin the points back into patches grouped by our height class and original IDs.
SELECT PC_Patch(pt), id / 20 AS id, heightclass FROM heightcloud GROUP BY id / 20, heightclass;

The resultant output should allow us to query our database by height above ground and patch. Now we can generate vertical summaries of our point cloud. Clever or dumb? It’s too early to tell.
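
For instance, a quick vertical summary (point counts per height class) could be pulled from Python. A minimal sketch; here height_patches is a hypothetical table holding the output of the query above (columns pa and heightclass), and the connection string reuses the credentials from the PDAL pipeline file used later in this series:

import psycopg2

# Assumption: credentials match the pipeline file used in this series
conn = psycopg2.connect("dbname=pggis user=pggis host=localhost")
cur = conn.cursor()

# Point counts per height class across all patches;
# height_patches is a hypothetical table holding the query output above
cur.execute("""
    SELECT heightclass, SUM(PC_NumPoints(pa)) AS npoints
    FROM height_patches
    GROUP BY heightclass
    ORDER BY heightclass;
""")
for heightclass, npoints in cur.fetchall():
    print(heightclass, npoints)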


LiDAR and pointcloud extension pt 4

Posted by smathermather on July 10, 2014

Height cloud overlaid with patches

Now we navigate into unknown (and perhaps silly) territory: point-level calculations of heights, dropping the newly calculated heights back into the point cloud as though they were Z values. I don’t recommend this code as a practice; it’s as much me thinking aloud as anything. Caveat emptor:

/*
First we explode the patches into points,
 and along the way grab the minimum value
for the patch, and the id
*/
WITH lraw AS (
	SELECT PC_Explode(pa) AS ex,  PC_PatchMin(pa, 'Z') AS min, id
	FROM test1
	),
/*
Calculate our height value ( Z - min ) 
*/
heights AS (
	SELECT PC_Get(ex, 'X') || ',' || PC_Get(ex, 'Y') || ',' -- grab X and Y
	|| PC_Get(ex, 'Z') - min				-- calculate height
	|| ',' ||
-- And return our remaining dimensions
	PC_Get(ex, 'Intensity') || ',' || PC_Get(ex, 'ReturnNumber') || ',' || PC_Get(ex, 'NumberOfReturns') || ',' ||
	PC_Get(ex, 'ScanDirectionFlag') || ',' || PC_Get(ex, 'EdgeOfFlightLine') || ',' ||
	PC_Get(ex, 'Classification') || ',' || PC_Get(ex, 'ScanAngleRank') || ',' || PC_Get(ex, 'UserData') || ',' ||
	PC_Get(ex, 'PointSourceId') || ',' || PC_Get(ex, 'Time') || ',' || PC_Get(ex, 'PointID') || ',' || PC_Get(ex, 'BlockID')
	AS xyheight, id FROM lraw
	),
-- Now we can turn this aggregated text into an array and then a set of points
heightcloud AS (
	SELECT PC_MakePoint(1, string_to_array(xyheight, ',')::float8[]) pt, id FROM heights
	)
-- Finally, we bin the points back into patches
SELECT PC_Patch(pt) FROM heightcloud GROUP BY id / 20;


LiDAR and pointcloud extension pt 3

Posted by smathermather on July 8, 2014

Digging a little deeper. I ran the chipper on a smaller number of points, and now I’m doing a little hacking to get height per chip (if you start to get lost, go to Paul Ramsey’s tutorial).

Here’s my pipeline file. Note the small chipper size: 20 points per chip.

<?xml version="1.0" encoding="utf-8"?>
<Pipeline version="1.0">
  <Writer type="drivers.pgpointcloud.writer">
    <Option name="connection">dbname='pggis' user='pggis' host='localhost'</Option>
    <Option name="table">test1</Option>
    <Option name="srid">3362</Option>
    <Filter type="filters.chipper">
      <Option name="capacity">20</Option>
      <Filter type="filters.cache">
        <Reader type="drivers.las.reader">
          <Option name="filename">60002250PAN.las</Option>
          <Option name="spatialreference">EPSG:3362</Option>
        </Reader>
      </Filter>
    </Filter>
  </Writer>
</Pipeline>

Easy enough to load (though slow, thanks to the small chip size):

pdal pipeline las2pg.xml

Now we can do some really quick and dirty height calculations per chip — like what’s the difference between the minimum and maximum value in the chip:

DROP TABLE IF EXISTS patch_heights;

CREATE TABLE patch_heights AS
SELECT
  pa::geometry(Polygon, 3362) AS geom,
  PC_PatchAvg(pa, 'Z') AS elevation,
  PC_PatchMax(pa, 'Z') - PC_PatchMin(pa, 'Z') AS height
FROM test1;

Patch heights showing vegetation and open areas


LiDAR and pointcloud extension pt 2

Posted by smathermather on July 8, 2014

As much for my edification as anyone’s:
Want to get at just a few points within the pointcloud extension? A WITH query will get you there:

DROP TABLE IF EXISTS points;

CREATE TABLE points AS
WITH
-- get me some patches (our lidar points are stored in patches)
patches AS (
  SELECT pa FROM test
	LIMIT 2
),
-- Now, I want all the points from each patch
pa_pts AS (
  SELECT PC_Explode(pa) AS pts FROM patches
)
-- OK. Can I have that as standard old points, PostGIS style?
SELECT pts::geometry FROM pa_pts;
--k thanx

LiDAR points shown within their patches
