Smathermather's Weblog

Remote Sensing, GIS, Ecology, and Oddball Techniques

Posts Tagged ‘Shell Scripting’

gdal_calc.py– Raster Calculator using NumPy Functions

Posted by smathermather on December 1, 2011

gdal_calc.py is a Python script that makes it easy to do band math and logical operations with GDAL and NumPy, from basic arithmetic through logical functions. And while gdal_calc.py has been around since 2008, it is a recent and revelatory discovery for me. I had just today resigned myself to properly learning Python so as to use the GDAL bindings. But my laziness may continue unabated. Not learning Python. Now that is lazy.

Anyway, want to do band math? It’s as simple as follows (straight from the python script itself):

# example 1 - add two files together
gdal_calc.py -A input1.tif -B input2.tif --outfile=result.tif --calc="A+B"

# example 2 - average of two layers
gdal_calc.py -A input.tif -B input2.tif --outfile=result.tif --calc="(A+B)/2"

# example 3 - set values of zero and below to null
gdal_calc.py -A input.tif --outfile=result.tif --calc="A*(A>0)" --NoDataValue=0

So, it took little modification to average my set of images from POV-Ray:

gdal_calc.py -A input.tif -B input2.tif -C input3.tif -D input4.tif -E input5.tif --outfile=result.tif --calc="(A+B+C+D+E)/5"

I might write some bash wrappers to create built-in functions for the basic things I’ll do repeatedly, like averaging a set of images based on a directory listing (a rough sketch follows below). Of course, a little voice inside says “You should just do that in Python.” We’ll see.
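Here’s an untested sketch of what that wrapper might look like. The output name average.tif, and the assumption of no more than 26 GeoTIFFs in the directory (gdal_calc.py letters its inputs A through Z), are mine:

#!/bin/bash
# Rough sketch (untested): average every GeoTIFF in the current directory
# with gdal_calc.py. Assumes 26 or fewer inputs, since gdal_calc.py names
# its inputs with the letters A-Z.
letters=(A B C D E F G H I J K L M N O P Q R S T U V W X Y Z)
inputs=""
calc=""
i=0
for f in $( ls *.tif ); do
 inputs="$inputs -${letters[$i]} $f"
 if [ $i -eq 0 ]; then
  calc="${letters[$i]}"
 else
  calc="$calc+${letters[$i]}"
 fi
 i=`expr $i + 1`
done
gdal_calc.py $inputs --outfile=average.tif --calc="($calc)/$i"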

Posted in Analysis, GDAL, Image Processing

Debian Configuration– Tomcat on Boot, revision

Posted by smathermather on August 23, 2011

I revised my startup script for Tomcat to use a non-privileged user for security reasons. I used the following page as my resource:

http://linux-sxs.org/internet_serving/c140.html


#!/bin/sh
# /etc/init.d/tomcat6: start Tomcat6 server.

test -f /opt/apache-tomcat-6.0.32/bin/startup.sh || exit 0

PATH=$PATH:/opt/apache-tomcat-6.0.32/bin/

case "$1" in
start) export JAVA_HOME=/opt/jre1.6.0_26/
logger -t Tomcat6 "Starting Tomcat 6..."
exec su - tomcat -c /opt/apache-tomcat-6.0.32/bin/startup.sh | logger -t Tomcat6
;;
stop) export JAVA_HOME=/opt/jre1.6.0_26/
logger -t Tomcat6 "Shutting down Tomcat 6..."
exec su - tomcat -c /opt/apache-tomcat-6.0.32/bin/shutdown.sh | logger -t Tomcat6
;;
*) echo "Usage: /etc/init.d/tomcat6 {start|stop}"
exit 2
;;
esac
exit 0

Now, perhaps what I should be doing is modifying the skeleton file that’s in my /etc/init.d directory for a truly professional look and setup (a taste of that below)… .
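For reference, a script cribbed from the skeleton would open with an LSB header along these lines (a sketch only; the runlevels and dependencies below are my guesses, not anything tested here):

### BEGIN INIT INFO
# Provides:          tomcat6
# Required-Start:    $remote_fs $network
# Required-Stop:     $remote_fs $network
# Default-Start:     2 3 4 5
# Default-Stop:      0 1 6
# Short-Description: Start and stop the Tomcat 6 servlet container
### END INIT INFO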

Posted in Other

Optimizing Pov-Ray include files– smaller trees with the same effect

Posted by smathermather on January 8, 2010

One limitation I’ve run into when rendering large scenes of trees in POV-Ray, whether for fun or for work, is memory usage.  I’ve found that about half a million trees is about as many as I can place on a 32-bit system before running out of memory.  Sometime this year I’ll switch to 64 bits, but in the meantime I hacked together a solution for reducing my tree include file sizes to save memory.

My include file is called tree.inc.  It has the following format (generated by PovTree):

#declare FOLIAGE = mesh {
triangle{<0.18162394, -0.24128051, 0.030293489>, <0.18287745, -0.23965627, 0.026677>, <0.18274091, -0.23925833, 0.033750653>}
triangle{<0.18287745, -0.23965627, 0.026677>, <0.18274091, -0.23925833, 0.033750653>, <0.18468907, -0.23614006, 0.0350326>}
...
...

#declare TREE = union {
object{FOLIAGE}
}

You’ll notice 8 decimal places for those coordinates.  We probably don’t need that level of precision.  The code that follows is messy, but it does the trick– using printf within awk to reduce the number of decimal places (3 in this case).  If memory serves, we strip out the braces, angle brackets, and commas with sed, format each coordinate with awk as a 3-decimal-place number, put the braces, angle brackets, and commas back where they belong, and then trim out any extra characters with sed before piping to a new include file, tree1.inc.  Oh, and I think I just manually added back the FOLIAGE declaration, etc.

more tree.inc | grep "triangle" | \
sed -e 's/{/ /g' -e 's/</ /g' -e 's/>/ /g' -e 's/}/ /g' -e 's/,/ /g'| \
awk '{ printf "%0.3f %0.3f %0.3f %0.3f %0.3f %0.3f %0.3f %0.3f %0.3f \n", $2, $3, $4, $5, $6, $7, $8, $9, $10};' | \
awk '{ print "triangle {<", $1, ",", $2, ",", $3, ">, <", $4, ",", $5, ",", $6, ">, <", $7, ",", $8, ",", $9, ">}"    };' | \
sed -e 's/ //g' -e 's/0\./\./g'  > tree1.inc

resulting in:

#declare FOLIAGE = mesh {
triangle{<.182,-.241,.030>,<.183,-.240,.027>,<.183,-.239,.034>}
triangle{<.183,-.240,.027>,<.183,-.239,.034>,<.185,-.236,.035>}
...
...

#declare TREE = union {
object{FOLIAGE}
}

Tree image before:

Tree image after:

What’s the difference in size for the include file?

-rw-r--r--  1 user  group    13M Nov  6 21:52 tree.inc
-rw-r--r--  1 user  group   5.6M Nov  8 00:40 tree1.inc

How far can we go with this?  Well, here are our coordinates at two decimal places:

It’s starting to get a little chunky here, but it might work for a lot of purposes.  However, we’re reaching the limit of our optimization:

-rw-r--r--  1 user  group   5.2M Nov  8 00:48 tree4.inc
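For the record, the two-decimal version is just the same pipeline with the awk format strings switched from %0.3f to %0.2f and the output sent to tree4.inc; roughly:

more tree.inc | grep "triangle" | \
sed -e 's/{/ /g' -e 's/</ /g' -e 's/>/ /g' -e 's/}/ /g' -e 's/,/ /g'| \
awk '{ printf "%0.2f %0.2f %0.2f %0.2f %0.2f %0.2f %0.2f %0.2f %0.2f \n", $2, $3, $4, $5, $6, $7, $8, $9, $10};' | \
awk '{ print "triangle {<", $1, ",", $2, ",", $3, ">, <", $4, ",", $5, ",", $6, ">, <", $7, ",", $8, ",", $9, ">}"    };' | \
sed -e 's/ //g' -e 's/0\./\./g'  > tree4.inc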

Posted in POV-Ray

LiDAR processing and analysis in PostGIS (I hope…).

Posted by smathermather on January 1, 2010

Alright, so let’s begin again with some LiDAR processing in PostGIS.  I really liked this discussion of multipoint conversions of LiDAR and other point datasets, but we won’t address making things more efficient here (yet).  Besides, these aren’t just points– we do want to retain their attribution… .

Now, any regular database user will recognize my newbie-ness reading this post.  If you are that person, please comment.  I’m looking to learn… .

In the LiDAR dataset I have, there are elevations for all points, ground points, vegetation points, building points, etc.  I’d like to know about how tall my vegetation is by calculating the difference between the height of the vegetation point and the nearest ground point.  So, I need to know what the nearest ground point is, and difference the two heights.  Since I am interested in the nearest ground point that is as close to directly beneath my vegetation point as possible, we’ll compute the nearest neighbor in two dimensions.

I’m going to, once again, take advantage of the resources at BostonGIS, and their Nearest Neighbor function.  Install and use according to their directions… .

So, I’ve downloaded the state LiDAR dataset for an Ohio county, and want to do some processing on it.  It does contain some fields in its original LAS form that we might want to keep; specifically, looking at our statement to create the table:

CREATE TABLE lidar
(
 x numeric,
 y numeric,
 z numeric,
 intensity integer,
 tnumber integer,
 number integer,
 class integer,
 id integer,
 vnum integer
)
WITH (OIDS=FALSE);
ALTER TABLE lidar OWNER TO postgres;

I convert from LAS to CSV with the las2txt utility (as before):

#!/bin/bash
x=0
for f in $( ls *.las); do
 x=`expr $x + 1`
 echo $x $f
 las2txt --parse xyzinrcpM -sep komma $f $f.csv
done

I want to copy from CSV using local user permissions.  It’s a little less efficient to do this than to have the database read directly from the CSV file, but it has the advantage that I don’t have to muck about giving my database superuser file-level permissions on my OS.  I’m testing each step sequentially, so while the last step converted all my LAS files to CSV, I’m still exploring this process with a single file.

So, we create a script file called copy.txt with the following contents:

\copy lidar from '/Users/blaa/Documents/GIS/Vinton/OSIP_VIN_LiDAR/s2035470.las.csv' with csv

We load a single block of LiDAR data in:

psql -U postgres -d Vinton -f copy.txt

(Note: we could have used a pipe to accomplish this.)
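For instance, something along these lines (untested) would do the same load without the intermediate copy.txt:

echo "\copy lidar from '/Users/blaa/Documents/GIS/Vinton/OSIP_VIN_LiDAR/s2035470.las.csv' with csv" | psql -U postgres -d Vinton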

Now to add a geometry field, and populate that with the info in our x and y fields (remember, I don’t want these to be 3D points, but 2D points with a z attribute).

SELECT AddGeometryColumn('public','lidar','the_geom',91023236,'POINT', 2);
UPDATE lidar SET the_geom = ST_SetSRID(ST_MakePoint(x,y),91023236);

It’d be good to have a spatial index and primary key… .

CREATE INDEX lidar_the_geom_idx ON lidar USING gist(the_geom);
ALTER TABLE lidar ADD PRIMARY KEY (vnum);

And view in uDig:

OK.  Now, I know that class 1 is ground and class 5 is vegetation, so, to simplify future processing, I will split this out into two tables (although subselects or views might work just as well here… ).

CREATE TABLE lidar_veg AS
SELECT * FROM lidar WHERE class = 5;


CREATE TABLE lidar_ground AS
SELECT * FROM lidar WHERE class = 1;



CREATE INDEX lidar_veg_the_geom_idx ON lidar_veg USING gist(the_geom);
CREATE INDEX lidar_ground_the_geom_idx ON lidar_ground USING gist(the_geom);
ALTER TABLE lidar_veg ADD PRIMARY KEY (vnum);
ALTER TABLE lidar_ground ADD PRIMARY KEY (vnum);

The nearest neighbor function is “pgis_fn_nn”, and takes a few options.  We’ll try this first with just one point.  The vegetation point is in the middle of these 4.  There are ground points all around:

SELECT pgis_fn_nn(b.the_geom, 20,1,4,'lidar_ground','true','vnum','the_geom') FROM lidar_veg AS b WHERE b.vnum = 576455;
pgis_fn_nn    
------------------
 (576457,4.31045)
(1 row)

Now, I need to capture that point, find out its z coordinate, and subtract that from the z coordinate of my vegetation point… .  Not tonight… .
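Just to sketch where that’s headed (untested): with the vnum 576457 that pgis_fn_nn handed back, the height above ground for that one vegetation point is a simple difference:

psql -U postgres -d Vinton <<'EOF'
-- sketch (untested): height above ground for vegetation point 576455,
-- using the nearest ground point (vnum 576457) returned above
SELECT v.z - g.z AS veg_height
FROM lidar_veg AS v, lidar_ground AS g
WHERE v.vnum = 576455
  AND g.vnum = 576457;
EOF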

Posted in Database, PostGIS, SQL

GDAL Contours (cont.)

Posted by smathermather on December 21, 2009

Well, I made some mistakes in the last post, not the least of which is that I used the wrong flag for creating an attribute field with the elevation.  What follows is a little more sophisticated: it takes a series of DEM tiles, generates 2-foot and 5-foot contours from each (using gdal_contour), and then dumps those shapefiles into PostgreSQL using shp2pgsql.

First we prep the database.  Again, I used pre-built binaries from Kyng Chaos.

I created a database called “Vinton” using my favorite PostgreSQL management tool, PGAdminIII (actually, it’s the only one I’ve ever tried…).

So, let’s make Vinton spatially aware:

sudo su - postgres -c '/usr/local/pgsql/bin/createlang plpgsql Vinton'
sudo su - postgres -c '/usr/local/pgsql/bin/psql -d Vinton -f /usr/local/pgsql/share/contrib/postgis.sql'
sudo su - postgres -c '/usr/local/pgsql/bin/psql -d Vinton -f /usr/local/pgsql/share/contrib/spatial_ref_sys.sql'

Vinton County is in Ohio State Plane South.  Our dataset is a foot-unit version of the State Plane South HARN.  Looking it up on SpatialReference.org, I get a description not unlike the PostGIS insert statement below, but I changed the EPSG number from 102323 to 1023236 (arbitrarily), and changed the units to feet (from the default for HARN, which is meters):

UNIT["Foot_US",0.30480060960121924]

so that the full insert is:

INSERT into spatial_ref_sys (srid, auth_name, auth_srid, proj4text, srtext) values ( 91023236, 'esri', 1023236, '+proj=lcc +lat_1=38.73333333333333 +lat_2=40.03333333333333 +lat_0=38 +lon_0=-82.5 +x_0=600000 +y_0=0 +ellps=GRS80 +units=m +no_defs ', 'PROJCS["NAD_1983_HARN_StatePlane_Ohio_South_FIPS_3402",GEOGCS["GCS_North_American_1983_HARN",DATUM["NAD83_High_Accuracy_Regional_Network",SPHEROID["GRS_1980",6378137,298.257222101]],PRIMEM["Greenwich",0],UNIT["Degree",0.017453292519943295]],PROJECTION["Lambert_Conformal_Conic_2SP"],PARAMETER["False_Easting",600000],PARAMETER["False_Northing",0],PARAMETER["Central_Meridian",-82.5],PARAMETER["Standard_Parallel_1",38.73333333333333],PARAMETER["Standard_Parallel_2",40.03333333333333],PARAMETER["Latitude_Of_Origin",38],UNIT["Foot_US",0.30480060960121924],AUTHORITY["EPSG","1023236"]]');

Now let’s create some tables to house our future contour data:

CREATE TABLE contours_2
(
 gid serial NOT NULL,
 id integer,
 elev numeric,
 the_geom geometry,
 CONSTRAINT contours_2_pkey PRIMARY KEY (gid),
 CONSTRAINT enforce_dims_the_geom CHECK (st_ndims(the_geom) = 2),
 CONSTRAINT enforce_geotype_the_geom CHECK (geometrytype(the_geom) = 'MULTILINESTRING'::text OR the_geom IS NULL),
 CONSTRAINT enforce_srid_the_geom CHECK (st_srid(the_geom) = 91023236)
)
WITH (OIDS=FALSE);
ALTER TABLE contours_2 OWNER TO postgres;

CREATE TABLE contours_5
(
 gid serial NOT NULL,
 id integer,
 elev numeric,
 the_geom geometry,
 CONSTRAINT contours_5_pkey PRIMARY KEY (gid),
 CONSTRAINT enforce_dims_the_geom CHECK (st_ndims(the_geom) = 2),
 CONSTRAINT enforce_geotype_the_geom CHECK (geometrytype(the_geom) = 'MULTILINESTRING'::text OR the_geom IS NULL),
 CONSTRAINT enforce_srid_the_geom CHECK (st_srid(the_geom) = 91023236)
)
WITH (OIDS=FALSE);
ALTER TABLE contours_5 OWNER TO postgres;

And finally, we can loop through all of our files, dump the results first to shapefiles, and then pipe the output of shp2pgsql directly into the database:

#!/bin/bash
x=0
for f in $( ls *.txt); do
x=`expr $x + 1`
echo $x $f
echo Two Foot:
gdal_contour -i 2 -a elev $f $f.shp
/usr/local/pgsql/bin/shp2pgsql -s 91023236 -a $f.shp contours_2 | /usr/local/pgsql/bin/psql -U postgres -d Vinton
echo Five Foot:
gdal_contour -i 5 -a elev $f 5foot/$f.shp
/usr/local/pgsql/bin/shp2pgsql -s 91023236 -a 5foot/$f.shp contours_5 | /usr/local/pgsql/bin/psql -U postgres -d Vinton
done

Now, I’ll probably convert all those numeric contour fields to integer with a cast statement, but tonight, I let the little laptop do the work… .
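A sketch of that cast (untested) for when the loads finish:

psql -U postgres -d Vinton <<'EOF'
-- sketch (untested): recast the elev columns from numeric to integer
ALTER TABLE contours_2 ALTER COLUMN elev TYPE integer USING elev::integer;
ALTER TABLE contours_5 ALTER COLUMN elev TYPE integer USING elev::integer;
EOF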

Posted in Database, GDAL, PostGIS, SQL

GDAL Contours

Posted by smathermather on December 20, 2009

Just another quick vignette.  From the Ohio Statewide Imagery Program (OSIP) there is a 1-meter DEM for all of Ohio.  To get contours from this dataset, one approach is to use GDAL tools, i.e. gdal_contour.  As I’m working on a Mac today, I used the Kyng Chaos pre-compiled frameworks:

http://www.kyngchaos.com/software:frameworks

Then I needed to update my PATH variable in the bash shell:

export PATH=/Library/Frameworks/GDAL.framework/Programs:$PATH

Now we can loop through, again, inelegantly (we’ll get unwanted extensions, i.e. *.txt.shp), and create 1-foot contours for a whole county.  Why 1-foot, when the OSIP dataset is meant for 2-foot contours?  Well, 1-foot so that later, if I want 5-foot contours, I can have them… .

#!/bin/bash
x=0
for f in $( ls *.txt); do
 x=`expr $x + 1`
# echo number of file we are on + name of file
 echo $x $f
# perform analysis, output to *.shp
 gdal_contour -i 1 -nln elev $f $f.shp
done

And the stdout and stderr:

1 S1890435.txt
0...10...20...30...40...50...60...70...80...90...100 - done.
2 S1890440.txt
0...10...20...30...40...50...60...70...80...90...100 - done.
3 S1890445.txt
0...10...20...30...40...50...60...70...80...90...100 - done.
4 S1890450.txt
0...10...20...30...40...50...60..

etc...

Posted in GDAL