Smathermather's Weblog

Remote Sensing, GIS, Ecology, and Oddball Techniques

Archive for August, 2014

FOSS4G Korea 2014 and the tale of the three Stephens

Posted by smathermather on August 31, 2014

Much is going on in Korea. It is and will be a place to watch for Open Source GeoSpatial, with the likes of Sanghee Shin and all his local chapter OSGeo compadres leading the charge. FOSS4G Korea was part of Smart Geospatial Expo 2014 this year, and during the expo, Sanghee was awarded a prize from the South Korean president for his work in promoting Open Source geospatial technologies. To hear Sanghee explain it, Korea is very much interested in growing its industries through the minds of its people. The great successes of South Korea in the recent past have depended not on the country's natural resources, but on the intellectual capital of firms such as LG and Samsung. With this in mind, Korea wants to grow its software industries from native seed, and sponsorship of Open Source GeoSpatial technologies will be part of this initiative. It seems they have the talent, the energy, the love of topic, and now the financial resources to start leading. This will be really fun to watch.

But, for this post, I want to focus on the tale of three Stephens: perhaps an echo of narcissism, but an interesting filter for our post today. Not long before I left for this trip, I contacted Sanghee to ask if there were any parks + GIS folks I could connect with and talk to while here.

Photo of Mr. Yu, B.J. Jang, and Stephen Mather at Smart GeoSpatial Expo 2014

It turns out Mr. Byeong-hyeok Yu was presenting at FOSS4G Korea on the use of QGIS for remote sensing for the Korean National Park Service. Mr. Yu is one of two GIS people working at KNPS; the other is with their research branch. Mr. Yu was good enough to invite me to headquarters and give me an overview of their smartphone apps, QGIS analyses, UAS (drone) flight work, Google Street View-style trail work (in partnership with Naver maps, I believe), and the other cutting-edge initiatives they are working on. What FOSS GIS has enabled under Mr. Yu's stewardship is not just the ability for Mr. Yu himself to do GIS, but the ability to democratize the process, allowing a few hundred KNPS park rangers to use QGIS on their desktops and to carry the equipment for the trail camera project. I like to think of Mr. Yu as a brighter and more energetic version of me. After all, he is a FOSS4Geospatial parks guy. So, we'll call Mr. Yu Stephen Mather number 2 (I did call this narcissism, right?).

But here's the real reason for the tale of three Stephens. Anyone who has studied the history of the National Park Service in the US knows of Stephen Tyng Mather, the borax mining magnate and essential founder of the National Park System, who oversaw the development of 20 national parks in his short tenure. While visiting the headquarters of KNPS, I was given the privilege of an audience and a traditional Korean tea ceremony with Mr. Young-Deck Park: tireless servant, Stephen Tyng Mather equivalent, and employee number 1 of the Korea NPS. Mr. Park has seen the KNPS grow from an idea to 21 parks covering more than 3 percent of Korea's land area (more than 6 percent if you include the ocean refuges). In short, I was in the presence of a parks visionary and giant. Stephen Mather number 3. I won't lie. I had to hold back tears. Oh, and the green tea was the best I have ever had.

To close this post, I'll show you a view from KNPS's most beloved and least remote national park, Bukhansan, which sits north of the Blue House (Korea's equivalent of our White House) and partially inside the city of Seoul. At 11 million visitors a year and about 19,000 acres, it is quite popular. I hiked one of the ridges with one of their rangers. More on that in another post.

A view from Bukhansan National Park over Seoul

Posted in Conference, Conferences, FOSS4G Korea | 1 Comment »

FOSS4G Korea 2014, poor GPS photos, and Mapillary

Posted by smathermather on August 31, 2014

As I have been moving around, whether traveling to Seoul or within Seoul, I have taken a lot of pictures. Some have GPS, and those I've processed and sent to Mapillary, like the few hundred I took on a day wandering Seoul:

Screen shot of Mapillary overview of Seoul

I've taken a lot of strange videos too. I took a couple of videos of my feet in the subway train, just to capture the music that plays to notify passengers of an approaching stop. Walking around Bukhansan National Park, I have taken many sign pictures. As I work for a park district, how signage and wayfinding are handled here is fascinating, both in what I can understand, i.e. choice of material, color, frequency, how the letters are carved, and in those elements I cannot yet understand, i.e. exactly how the Korean-language wayfinding portions work.

But mostly I have been cataloging as much as I can in order to give my children a sense and feel for the city. I am realizing this imperative has given me a child-like view of the city. (Of course, my enthusiasm for the mundane does get me the occasional funny look from locals. But hey! What could feel more like home than people thinking I am a little strange?)

This blog wouldn't be mine without a technical twist to the narrative, so let's dive into some geographic problems worth solving. The camera I have has built-in GPS and compass, which makes it seemingly ideal for Mapillary uploads. Except the GPS isn't that accurate, doesn't always update from photo to photo, struggles in urban areas in general, etc., etc. And so it is that I am working on a little solution for that problem. First, let me illustrate the problem better.

A sanity check on the GPS data can easily be done in QGIS using the Photo2Shape plugin:

Screen snapshot of photo2shape plugin install screen

Screenshot of distribution of camera GPS points in QGIS
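
For a quick sanity check outside of QGIS, the same idea can be scripted. Here's a minimal sketch in Python, assuming the exifread package and photos carrying standard EXIF GPS tags (the photos/ directory and file pattern are just placeholders); it only prints the decoded latitude and longitude per photo:

import glob
import exifread

def to_degrees(tag, ref):
    # EXIF stores latitude/longitude as [degrees, minutes, seconds] rationals
    d, m, s = [v.num / float(v.den) for v in tag.values]
    sign = -1 if str(ref.values) in ("S", "W") else 1
    return sign * (d + m / 60.0 + s / 3600.0)

for path in sorted(glob.glob("photos/*.jpg")):
    with open(path, "rb") as f:
        tags = exifread.process_file(f, details=False)
    if "GPS GPSLatitude" not in tags or "GPS GPSLongitude" not in tags:
        print(path, "has no GPS tags")
        continue
    lat = to_degrees(tags["GPS GPSLatitude"], tags["GPS GPSLatitudeRef"])
    lon = to_degrees(tags["GPS GPSLongitude"], tags["GPS GPSLongitudeRef"])
    print(path, lat, lon)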

Let's do two things to improve our map. For old times' sake, we'll add a little red-dot fever, and use one of the native tile maps, Naver, via the TMS for Korea plugin.

Naver map with photo locations overlaid as red dots

We can see our points are both unevenly distributed and somewhat clumped. How clumped? Well, according to my fellow GeoHipsters on Twitter, hex bin maps are so 2013, so instead we'll just use some feature blending (multiply) plus low saturation on our red (i.e. use pink) to show the intensity of overlap:

Capture of map showing overlap of points with saturation of color increasing where overlaps exist.

Ok, that's a lot of overlaps for pictures that were taken in a straight-line series. Also, note the line isn't so straight. Yes, I was sober. No, not even with soju can I walk through so many buildings.

As with all problems when I'm obsessed with a particular technology: "The solution here is to use <particular technology with which I am currently obsessed>". In this case, we substitute <particular technology with which I am currently obsessed> with 'Structure from Motion', or OpenDroneMap. ODM would give us the correct relative locations of the original photos. Combined with the absolute locations (as bad as they are), perhaps we could get a better estimate. Here's a start (confession: mocked up in Agisoft PhotoScan. Sssh. Don't tell) showing in blue the correct relative camera positions:

Image of sparse point cloud and relative camera positions

See how evenly spaced the camera positions are? You can also see the sparse point cloud, which hints at the tall buildings of Gangnam and the trees in the boulevard.

  • Next step: Do this in OpenDroneMap.
  • Following step: Find an appropriate way to correlate with the GPS positions (a rough sketch of the idea follows below).
  • Then: Correct the model to match the real world.
  • Finally: Update the GPS positions stored in the original photos.
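
Here's that rough sketch of the correlation step (not the OpenDroneMap workflow itself, just the idea, and ignoring the harder problem of discarding bad GPS fixes): a least-squares similarity transform in the spirit of the Umeyama/Procrustes alignment, solving for the scale, rotation, and translation that best map the relative SfM camera centers onto the absolute GPS positions. A minimal numpy version, assuming two matched (N, 3) arrays in the same photo order:

import numpy as np

def fit_similarity(relative_xyz, gps_xyz):
    # Least-squares similarity transform (Umeyama-style) mapping
    # SfM-relative camera centers onto (noisy) absolute GPS positions.
    src = np.asarray(relative_xyz, dtype=float)
    dst = np.asarray(gps_xyz, dtype=float)
    src_mean, dst_mean = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - src_mean, dst - dst_mean

    # Cross-covariance and its SVD give the optimal rotation
    H = src_c.T.dot(dst_c) / len(src)
    U, S, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T.dot(U.T)))
    D = np.diag([1.0, 1.0, d])  # guard against reflections
    R = Vt.T.dot(D).dot(U.T)

    scale = np.trace(np.diag(S).dot(D)) / src_c.var(axis=0).sum()
    t = dst_mean - scale * R.dot(src_mean)
    return scale, R, t

def apply_similarity(xyz, scale, R, t):
    # Map relative coordinates into the GPS frame
    return scale * np.asarray(xyz, dtype=float).dot(R.T) + t

# e.g. corrected = apply_similarity(relative_xyz, *fit_similarity(relative_xyz, gps_xyz))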

So, Korea has inspired another multi-part series. Stay tuned.

Posted in 3D, Analysis, Conference, Conferences, FOSS4G Korea | 2 Comments »

FOSS4G Korea 2014 and FOSS4G 2015 (Korea!)

Posted by smathermather on August 30, 2014

I have much to say about my experiences this past week at FOSS4G Korea, but I'll keep this very short for now. If you live in North America or Europe and are thinking about not going to FOSS4G 2015 because it's too far, rethink it. Waiting for you on the other side of the world is Korea, a complex mix of old and new, mountain and ocean, kind and fierce. I can't say enough good things about the hospitality I have received, the food I have eaten, the quality of the conference attendees, the energy, and the love for OSGeo.

For the moment, I will leave you with this image of a temple entrance near Bukhansan National Park which effectively says, “Endeavor to explore the world”. It’s a geographer’s temple, I think.

Image of temple

Posted in Conference, Conferences, FOSS4G Korea | Leave a Comment »

KNN with FLANN and laspy, a starting place

Posted by smathermather on August 8, 2014

FLANN is the Fast Library for Approximate Nearest Neighbors, a purportedly wicked-fast nearest neighbor library for comparing multi-dimensional points. I only say purportedly because I haven't verified it, but I assume it to be quite true. I'd like to move some (all) of my KNN calculations outside the database.

I'd like to do the following with FLANN: take a LiDAR point cloud and change it into a LiDAR height-above-ground point cloud. What follows are my explorations so far.

In a previous series of posts (e.g. https://smathermather.wordpress.com/2014/07/14/lidar-and-pointcloud-extension-pt-6/), I have been using the point cloud extension in PostGIS. I like the 2-D chipping, but I think I should segregate my data into height classes before sending it into the database. That way I can query my data by height class and by location efficiently, taking full advantage of the efficiencies of storing all those little points in chips, while also being able to query the data along any of the dimensions I need in the future. Enter FLANN.

I haven’t gotten far. To use FLANN with LiDAR through Python, I’m also using laspy.  There’s a great tutorial here: http://laspy.readthedocs.org/en/latest/tut_part_1.html


I make one change to the tutorial section using FLANN. The code as written is:

import laspy
import pyflann as pf
import numpy as np

# Open a file in read mode:
inFile = laspy.file.File("./laspytest/data/simple.las")
# Grab a numpy dataset of our clustering dimensions:
dataset = np.vstack([inFile.X, inFile.Y, inFile.Z]).transpose()

# Find the nearest 5 neighbors of point 100.

neighbors = flann.nn(dataset, dataset[100,], num_neighbors = 5)
print("Five nearest neighbors of point 100: ")
print(neighbors[0])
print("Distances: ")
print(neighbors[1])

To make this example work with the current version of pyflann, we need to import all of pyflann (or at least its FLANN class), and also instantiate flann = FLANN(), as follows:

import laspy
import numpy as np
from pyflann import *

# Open a file in read mode:
inFile = laspy.file.File("simple.las")
# Grab a numpy dataset of our clustering dimensions:
dataset = np.vstack([inFile.X, inFile.Y, inFile.Z]).transpose()

# Find the nearest 5 neighbors of point 100.
flann = FLANN()
neighbors = flann.nn(dataset, dataset[100,], num_neighbors = 5)
print("Five nearest neighbors of point 100: ")
print(neighbors[0])
print("Distances: ")
print(neighbors[1])

Finally, a small note on installation of pyflann on Ubuntu. What I'm about to document is undoubtedly not the recommended way to get pyflann working. But it worked…

Installation instructions for FLANN on Ubuntu can be found here: http://www.pointclouds.org/downloads/linux.html

But this does not seem to install pyflann. That said, it installs all our dependencies + FLANN, so…

I cloned, compiled, and installed the FLANN repo: https://github.com/mariusmuja/flann

git clone git://github.com/mariusmuja/flann.git
cd flann
mkdir BUILD
cd BUILD
cmake ../.
make
sudo make install

This gets pyflann where it needs to go, and voilà! We can now do nearest neighbor searches within Python.

Next step: turn my LiDAR xyz point cloud into an xy-height point cloud, then dump it, height class by height class, into PostgreSQL (a rough first sketch follows). Wish me luck!
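
Here's that rough sketch of the height-above-ground step, reusing the pyflann pattern from above. It assumes the LAS file already has its ground returns classified (class 2 in the LAS spec); the 5-unit height-class width and the file name are just placeholders, not a worked-out workflow:

import laspy
import numpy as np
from pyflann import *

inFile = laspy.file.File("simple.las")

# Use the scaled, real-world coordinates (lowercase x/y/z in laspy)
xyz = np.vstack([inFile.x, inFile.y, inFile.z]).transpose()

# Ground returns, assuming they are already classified as class 2
ground = xyz[inFile.classification == 2]

# For every point, find its nearest ground point in XY only
flann = FLANN()
idx, _ = flann.nn(np.ascontiguousarray(ground[:, 0:2]),
                  np.ascontiguousarray(xyz[:, 0:2]), num_neighbors = 1)
idx = np.asarray(idx).reshape(-1)  # flatten in case the result is (N, 1)

# Height above ground = point Z minus the Z of its nearest ground point
height = xyz[:, 2] - ground[idx, 2]

# Bin into 5-unit height classes, ready to load class by class into PostgreSQL
height_class = np.floor(np.clip(height, 0, None) / 5.0).astype(int)
print(np.bincount(height_class))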

Posted in 3D, Database, FLANN, Other, pointcloud, PostGIS, PostgreSQL | Leave a Comment »

Drivetime analyses, pgRouting

Posted by smathermather on August 6, 2014

Map of overlapping 5-minute drive times from park access points

We've got some quick and dirty pgRouting-based code up on GitHub. I say quick and dirty because it directly references the table names in both of the functions. I hope to fix this in the future.

The objective with this code is to input a point, use a nearest neighbor search to find the nearest intersection, and from that calculate a drive time alpha shape.

First, the nearest neighbor search:

CREATE OR REPLACE FUNCTION pgr.nn_search (geom geometry) RETURNS
int8 AS $$

SELECT id FROM pgr.roads_500_vertices_pgr AS r
ORDER BY geom <#> r.the_geom
LIMIT 1;

$$ LANGUAGE SQL VOLATILE;

This function takes one argument: pass it a point geometry, and it will do a KNN search for the nearest point. Since we are leveraging pgRouting, it simply returns the id of the nearest intersection point. We wrote this as a function in order to run it, e.g., on all the points in a table. As I stated earlier, it directly references a table name, which is a little hackish, but we'll patch that faux pas later.

Now that we have the ID of the point in question, we can do a driving distance calculation, wrap that up in an alpha shape, and return our polygon. Again, we write this as a function:

CREATE OR REPLACE FUNCTION pgr.alpha_shape (id integer, minutes integer) RETURNS
geometry AS $$

WITH alphashape AS(SELECT pgr_alphaShape('WITH
DD AS (
SELECT seq, id1 AS node, cost
FROM pgr_drivingDistance(''SELECT id, source, target, cost FROM pgr.roads_500'',' || id || ', ' || minutes || ', false, false)),
dd_points AS (
SELECT id_ AS id, x, y
FROM pgr.roads_500_vertices_pgr v, DD d
WHERE v.id = d.node)
SELECT * FROM dd_points')),

alphapoints AS (
SELECT ST_Makepoint((pgr_alphashape).x, (pgr_alphashape).y) FROM alphashape),

alphaline AS (
SELECT ST_MakeLine(ST_MakePoint) FROM alphapoints)

SELECT ST_MakePolygon(ST_AddPoint(ST_MakeLine, ST_StartPoint(ST_MakeLine))) AS the_geom FROM alphaline

$$ LANGUAGE SQL VOLATILE;

Finally, we'll use these functions in conjunction with a set of park feature access points to map our 5-minute drive times. Overlaps in the 5-minute zones are rendered as a brighter green in the image above.

CREATE TABLE pgr.alpha_test AS
WITH dest_ids AS (
SELECT pgr.nn_search(the_geom) AS id FROM pgr.dest_pts
)
SELECT pgr.alpha_shape(id::int, 5)::geometry, id FROM dest_ids;

Oh, and I should point out that for the driving distance calculation, we have pre-calculated the costs of the roads based on the speed limit, so cost is really travel time in minutes.
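
For reference, the arithmetic behind that pre-calculation is just length over speed. A toy illustration in Python (the real values live on the pgr.roads_500 table; the function name and units here are only for the sake of example):

def travel_time_minutes(length_m, speed_limit_kmh):
    # Edge cost as travel time in minutes, from segment length (meters)
    # and posted speed limit (km/h)
    return (length_m / 1000.0) / speed_limit_kmh * 60.0

# e.g. a 500 m road segment posted at 40 km/h costs 0.75 minutes
print(travel_time_minutes(500, 40))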

Posted in Analysis, Database, pgRouting, PostGIS, PostgreSQL, SQL | 4 Comments »

Using Spatial Data in R to Estimate Home Ranges (guest blog post)

Posted by smathermather on August 5, 2014

Overlay of daytime and nighttime home ranges and coyote points

(A guest blog post today from Dakota Benjamin, a Case Western Reserve University senior and three-year summer intern we've been lucky enough to recruit.)

This post has two big takeaways: using spatial data in R, and applying that knowledge to estimating home ranges. If you are not familiar with the R environment, there are many great resources for familiarizing yourself with this powerful language, such as [here](http://cran.r-project.org/doc/manuals/R-intro.pdf) and [here](http://www.statmethods.net/index.html). You will need a basic understanding of R before proceeding.

The package [rgdal](http://cran.r-project.org/web/packages/rgdal/) is essential for utilizing spatial data in R. There are two functions that we will use. The first is readOGR(). This function imports a shape file into a special type of data frame, in this case a SpatialPointsDataFrame, which contains the data, coordinates, bounding box, etc., making it all accessible in R.

 library(rgdal)
 coyote <- readOGR("nc_coyote_centroids_dn", "nc_coyote_centroids_dn")

The second function we will use is writeOGR(), which, as you can probably guess, will write a spatial data frame back out to a shape file (or other format).

 writeOGR(ver, "hr_coyote_centroids_dn", "hr_centroids_dn", "ESRI Shapefile") 

Now we can work with the coyote data in R and use some tools to estimate the home range. The previous blog post, on cleaning up animal tracking data with PostGIS (https://smathermather.wordpress.com/2014/07/31/cleaning-animal-tracking-data-throwing-away-extra-points/), will be useful for cleaning up and framing the data in a useful way. What if we want to look at how the home range changes between night and day? I wrote a quick function to add a column to the data. calcDayNight() uses a function from the [RAtmosphere package](http://cran.r-project.org/web/packages/RAtmosphere/) called suncalc(), which calculates the sunrise and sunset times for a given date and location. calcDayNight() compares each data point to the sunrise and sunset times and determines whether the point was taken during the day or at night. Here's the code:

calcDayNight <- function(x) {
  #if time is greater than the sunrise time or less than the sunset time, apply DAY
  suntime <- suncalc(as.numeric(as.Date(x["LMT_DATE"], format="%m-%d-%y") - as.Date("2013-01-01")), Lat=41.6, Long=-81.4)
  coytime <- as.numeric(strptime(x["LMT_TIME"], "%T") - strptime("00:00:00", "%T"))
 
  if(coytime > suntime$sunrise & coytime < suntime$sunset){
    x["dayornight"] <- "DAY"
  } else x["dayornight"] <- "NIGHT"
}

After we get the day and night centroids (from the previous post), we can do our home range estimation. The package we need is [adehabitatHR](http://cran.r-project.org/web/packages/adehabitatHR/). There are two steps here: create the utilization distribution, and produce a home range contour from that distribution.

The function for creating the distribution is kernelUD(). More information about this model can be found [here](http://cran.r-project.org/web/packages/adehabitatHR/adehabitatHR.pdf) on page 33.

ud <- kernelUD(coyote[,2], h="href")

coyote[,2] refers to the “dayornight” column that contains the “DAY” and “NIGHT” values, so two different distributions will be calculated. Then we will produce a SpatialPolygonsDataFrame using getverticeshr(), which will give the 95% estimate of the home range:

ver <- getverticeshr(ud, 95)

Export that as I showed above and we get a shape file with two polygons. Above is the home range estimation with the day/night centroids overlaid (blue for day and purple for night).

Posted in Analysis, R | Leave a Comment »

Bicycling for Mapillary

Posted by smathermather on August 2, 2014

We've got two wheels now, baby! (And I've refined this a bit from a bungee-only setup.) Here's my rig for bicycling for Mapillary photos. This uses the same gimbal for stabilizing and leveling the photos, so it should have similarly great output.

Mapillary output is yet to come for this one. In the meantime, you can watch the camera gimbal come alive:

Also you can look at the stabilized unicycle versions here: https://mapillary.github.io/mapillary_examples/seven_hills.html and here: http://www.mapillary.com/map/im/16/41.38742655884223/-81.69359564781189


Posted in Mapillary | Leave a Comment »

Unicycling for Mapillary

Posted by smathermather on August 1, 2014

I really dig the crowd-sourced street-view service Mapillary and like to contribute to it, but doing a good job of getting blur-free images is a little tricky. This is doubly true on dark trails under forest canopy. So, we solved it with some technology. Enter a GoPro camera, a gimbal for an RC copter (drone), and some old bicycle parts:

The gimbal is a two-axis gimbal that keeps roll and pitch constant to 0.01 degrees, so it eliminates shake and keeps the orientation of the photos consistent. The one we got cost around $180, although now I find them for as low as $130 (not my video):

And then we attached it to a makeshift unicycle:

Some post-processing to match the photos to our GPS track, and voilà!
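
For the curious, the matching step boils down to interpolating the GPS track at each photo's timestamp. Here's a bare-bones sketch, assuming the track has already been parsed into a time-sorted list of (datetime, lat, lon) tuples and the camera clock is synced to GPS time (writing the result back into the photos' EXIF is left out):

from bisect import bisect_left

def interpolate_position(track, when):
    # track: time-sorted list of (datetime, lat, lon); when: photo timestamp.
    # Returns (lat, lon) linearly interpolated between the two nearest fixes.
    times = [t for t, _, _ in track]
    i = bisect_left(times, when)
    if i <= 0:
        return track[0][1:]
    if i >= len(track):
        return track[-1][1:]
    (t0, lat0, lon0), (t1, lat1, lon1) = track[i - 1], track[i]
    if t1 == t0:
        return (lat1, lon1)
    frac = (when - t0).total_seconds() / (t1 - t0).total_seconds()
    return (lat0 + frac * (lat1 - lat0), lon0 + frac * (lon1 - lon0))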

http://www.mapillary.com/map/im/rX7zM6I3VdDjSAAPBhPZ-g

Posted in Mapillary | Leave a Comment »