In general, I like the results from the hillshade. One problem I forgot to take into account: hillshade algorithms typically leave a 1-pixel border along the edge where calculations cannot take place. For this Ohio dataset, that means that every 5000 feet or so (the x and y dimensions of our DEM chunks) we have a 2-pixel-wide line, resulting in a subtle and interesting grid. Now, this is a state plane grid, so I might be able to pretend it's intentional… Maybe I need to mosaic the DEM and then compute the hillshade.
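If I go that route, the merge-then-hillshade order would look something like this (a sketch only; the output filenames are my assumptions, and I'm assuming gdal_merge.py can read the ASCII-grid .txt tiles directly):

```shell
# merge all DEM chunks into one mosaic first, so every interior pixel
# has neighbors and only the true outer boundary loses a border...
gdal_merge.py -co "BIGTIFF=YES" -o dem_mosaic.tif *.txt
# ...then compute the hillshade once over the whole mosaic
hillshade dem_mosaic.tif hillshade_mosaic.tif
```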
Archive for December, 2009
Posted by smathermather on December 23, 2009
Posted by smathermather on December 22, 2009
Using the same LiDAR DEM from which we generated the contours, we can create hillshade tifs using the hillshade utility from http://www.perrygeo.net/wordpress/?p=7. It compiles easily on a Mac, probably even easier on a Linux machine following his directions. Then another simple loop:
#!/bin/bash
x=0
for f in $( ls *.txt); do
    x=`expr $x + 1`
    echo $x $f
    hillshade $f $f.tif
done
I’d like this all in a single tif:
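The simple version would be just this (a sketch; when no -o flag is given, gdal_merge.py writes its output to out.tif by default):

```shell
gdal_merge.py *.tif
```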
or in my case with such a large area:
gdal_merge.py -co "BIGTIFF=YES" *.tif
gdaladdo -r average out.tif 2 4 8 16 32 64 128 256 512 1024
Posted by smathermather on December 21, 2009
Suppose you had a list of addresses coded like this:
00014 SOUTH ST
That you’d prefer coded like this:
14 SOUTH ST
If, like me (quite guiltily), you are addicted to modifying things programmatically in Excel, you could do it with a formula:
=(MID(A1,1,FIND(" ",A1)-1))*1 & MID(A1,FIND(" ",A1),LEN(A1))
Everything before the ampersand (&) uses the =MID() function to grab everything up until the first space and multiplies it by one. Excel automatically does a text-to-number conversion if you do any mathematical manipulation, so this converts our text to a number, trimming off the leading zeros. You could use the =VALUE() worksheet function here, but why bother? After the ampersand, we use =MID() again to grab everything from the first space onward. The concatenation of the two is our cleaned-up address.
I know. Not really free and open source GIS at all…
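For the record, a free-and-open-source equivalent is a one-line sed substitution (my own sketch, not from the original workflow):

```shell
# strip leading zeros from the house number at the start of each line
printf '00014 SOUTH ST\n' | sed 's/^0*//'
# -> 14 SOUTH ST
```

Pointed at a whole file of addresses, `sed 's/^0*//' addresses.txt` does the same cleanup in one pass.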
Posted by smathermather on December 21, 2009
Well, I made some mistakes in the last post, not the least of which was using the wrong flag for creating an attribute field with the elevation. What follows is a little more sophisticated: it takes us from a series of DEM tiles to 2-foot and 5-foot contours (using gdal_contour), and then dumps those shapefiles into PostgreSQL using shp2pgsql.
First we prep the database. Again, I used pre-built binaries from Kyng Chaos.
I created a database called “Vinton” using my favorite PostgreSQL management tool, PGAdminIII (actually, it’s the only one I’ve ever tried…).
So, let’s make Vinton spatially aware:
sudo su - postgres -c '/usr/local/pgsql/bin/createlang plpgsql Vinton'
sudo su - postgres -c '/usr/local/pgsql/bin/psql -d Vinton -f /usr/local/pgsql/share/contrib/postgis.sql'
sudo su - postgres -c '/usr/local/pgsql/bin/psql -d Vinton -f /usr/local/pgsql/share/contrib/spatial_ref_sys.sql'
Vinton County is in Ohio State Plane South. Our dataset is a US-foot version of the State Plane South HARN coordinate system. Looking it up on SpatialReference.org, I get a description not unlike the PostGIS insert statement below, but I changed the EPSG number from 102323 to 1023236 (arbitrarily) and changed the units to feet (from the default for HARN, which is meters):
so that the full insert is:
INSERT into spatial_ref_sys (srid, auth_name, auth_srid, proj4text, srtext)
values (
    91023236,
    'esri',
    1023236,
    '+proj=lcc +lat_1=38.73333333333333 +lat_2=40.03333333333333 +lat_0=38 +lon_0=-82.5 +x_0=600000 +y_0=0 +ellps=GRS80 +units=m +no_defs ',
    'PROJCS["NAD_1983_HARN_StatePlane_Ohio_South_FIPS_3402",GEOGCS["GCS_North_American_1983_HARN",DATUM["NAD83_High_Accuracy_Regional_Network",SPHEROID["GRS_1980",6378137,298.257222101]],PRIMEM["Greenwich",0],UNIT["Degree",0.017453292519943295]],PROJECTION["Lambert_Conformal_Conic_2SP"],PARAMETER["False_Easting",600000],PARAMETER["False_Northing",0],PARAMETER["Central_Meridian",-82.5],PARAMETER["Standard_Parallel_1",38.73333333333333],PARAMETER["Standard_Parallel_2",40.03333333333333],PARAMETER["Latitude_Of_Origin",38],UNIT["Foot_US",0.30480060960121924],AUTHORITY["EPSG","1023236"]]'
);
Now let’s create some tables to house our future contour data:
CREATE TABLE contours_2 (
    gid serial NOT NULL,
    id integer,
    elev numeric,
    the_geom geometry,
    CONSTRAINT contours_2_pkey PRIMARY KEY (gid),
    CONSTRAINT enforce_dims_the_geom CHECK (st_ndims(the_geom) = 2),
    CONSTRAINT enforce_geotype_the_geom CHECK (geometrytype(the_geom) = 'MULTILINESTRING'::text OR the_geom IS NULL),
    CONSTRAINT enforce_srid_the_geom CHECK (st_srid(the_geom) = 91023236)
) WITH (OIDS=FALSE);
ALTER TABLE contours_2 OWNER TO postgres;

CREATE TABLE contours_5 (
    gid serial NOT NULL,
    id integer,
    elev numeric,
    the_geom geometry,
    CONSTRAINT contours_5_pkey PRIMARY KEY (gid),
    CONSTRAINT enforce_dims_the_geom CHECK (st_ndims(the_geom) = 2),
    CONSTRAINT enforce_geotype_the_geom CHECK (geometrytype(the_geom) = 'MULTILINESTRING'::text OR the_geom IS NULL),
    CONSTRAINT enforce_srid_the_geom CHECK (st_srid(the_geom) = 91023236)
) WITH (OIDS=FALSE);
ALTER TABLE contours_5 OWNER TO postgres;
And finally, we can loop through all of our files, and dump the results first to a shapefile, then pipe that from shp2pgsql directly into the database:
#!/bin/bash
x=0
for f in $( ls *.txt); do
    x=`expr $x + 1`
    echo $x $f
    echo Two Foot:
    gdal_contour -i 2 -a elev $f $f.shp
    /usr/local/pgsql/bin/shp2pgsql -s 91023236 -a $f.shp contours_2 | /usr/local/pgsql/bin/psql -U postgres -d Vinton
    echo Five Foot:
    gdal_contour -i 5 -a elev $f 5foot/$f.shp
    /usr/local/pgsql/bin/shp2pgsql -s 91023236 -a 5foot/$f.shp contours_5 | /usr/local/pgsql/bin/psql -U postgres -d Vinton
done
Now, I’ll probably convert all those numeric contour fields to integer with a cast statement, but tonight, I let the little laptop do the work…
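If I do, it would be something like this (a sketch, not from the original post; one statement per contour table, using PostgreSQL's ALTER COLUMN ... TYPE ... USING cast):

```shell
# cast the numeric elev column to integer in place (table names from above)
/usr/local/pgsql/bin/psql -U postgres -d Vinton \
    -c "ALTER TABLE contours_2 ALTER COLUMN elev TYPE integer USING elev::integer;"
/usr/local/pgsql/bin/psql -U postgres -d Vinton \
    -c "ALTER TABLE contours_5 ALTER COLUMN elev TYPE integer USING elev::integer;"
```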
Posted by smathermather on December 20, 2009
Just another quick vignette. The Ohio Statewide Imagery Program (OSIP) provides a 1-meter DEM for all of Ohio. To get contours from this dataset, one approach is to use GDAL tools, i.e. gdal_contour. As I’m working on a Mac today, I used the Kyng Chaos pre-compiled frameworks:
Then I needed to update my path variable in the BASH shell:
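That step is presumably something like the following (a sketch; the exact framework path is my assumption based on where the Kyng Chaos packages typically install their command-line tools):

```shell
# put the GDAL framework's command-line programs on the PATH (assumed location)
export PATH=/Library/Frameworks/GDAL.framework/Programs:$PATH
```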
Now we can loop through, again inelegantly (we’ll get unwanted extensions, i.e. *.txt.shp), and create 1-foot contours for a whole county. Why 1-foot? (The OSIP dataset is meant for 2-foot contours.) Well, so that later, if I want 5-foot contours, I can have them…
#!/bin/bash
x=0
for f in $( ls *.txt); do
    x=`expr $x + 1`
    # echo number of file we are on + name of file
    echo $x $f
    # perform analysis, output to *.shp
    gdal_contour -i 1 -nln elev $f $f.shp
done
And the stdout and stderr:
1 S1890435.txt
0...10...20...30...40...50...60...70...80...90...100 - done.
2 S1890440.txt
0...10...20...30...40...50...60...70...80...90...100 - done.
3 S1890445.txt
0...10...20...30...40...50...60...70...80...90...100 - done.
4 S1890450.txt
0...10...20...30...40...50...60..
etc...
Posted by smathermather on December 12, 2009
At home, I work on a Mac, so when I want to do work here, my scripting language changes. In this case, I prefer BASH as my control script. I downloaded libLAS and did the standard Unix ./configure, make, sudo make install dance in the decompressed libLAS directory:
./configure
make
sudo make install
Now I have some tools to manipulate LiDAR las files at home. To convert all my las files to text:
#!/bin/bash
for f in $( ls *.las); do
    las2txt --parse xyzinrcpM --header pound $f $f.txt
done
There are more elegant ways to implement this, but it gets the job done… Now I have a few hundred text files with all the relevant LiDAR info in them for manipulation. Since working with the whole enchilada in PostGIS failed, I think I’ll approach it by loading one section at a time, analyzing, and removing those data from the table. More to come…