Smathermather's Weblog

Remote Sensing, GIS, Ecology, and Oddball Techniques

Posts Tagged ‘BASH’

Reprocessing imagery with GDAL and BASH — and then cleaning up afterward

Posted by smathermather on August 13, 2012

I show a simple example today of using GDAL tools to convert from raster to polygons with Bash.  That’s right.  Still no pythonista here.  I have poked ever so gently at the surface of ruby and rails recently, while watching another coder re-tune his brain to a new set of principles of least surprise.

By the way, for my regular readers, please be patient with me over the next few quarters.  I am endeavoring to undertake a much larger writing project for the next few months, which will leave little time for this blog.  But, hopefully at the end, it will be worth everyone’s while.


#!/bin/bash

for name in ????_???_gt.tif; do                              # e.g. "2765_560_gt.tif"

    basename1="${name%.*}"                                   # Split off non-extension portion of name

    if [ -f "$basename1.shp" ]
    then
        echo "$basename1.shp exists"                         # skip files already converted
    else
        echo "Converting $basename1.shp"

        gdal_polygonize.py "$name" -f "ESRI Shapefile" "$basename1.shp"

        if [ $? -lt 1 ]                                      # exit status 0 means success
        then
            echo "Conversion a success."
            echo "Removing original file."
            rm "$name"
        else
            echo "mehmeh.  try again."
        fi
    fi

done
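The `if [ $? -lt 1 ]` test in the script is just checking for a zero exit status before deleting anything.  Here is that guard isolated as a minimal sketch — `convert` is a stand-in function, not gdal_polygonize.py itself — along with the shorter `&&`/`||` form:

```shell
# Stand-in for gdal_polygonize.py: always succeeds (exit status 0).
convert() { return 0; }

convert input.tif
if [ $? -lt 1 ]                  # 0 = success, so "< 1" passes only on success
then
    echo "Conversion a success."
else
    echo "mehmeh.  try again."
fi

# Equivalent, more idiomatic one-liner:
convert input.tif && echo "Conversion a success." || echo "mehmeh.  try again."
```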

Posted in GDAL | Tagged: , | Leave a Comment »

Reprocessing imagery with GDAL and BASH — and then cleaning up afterward

Posted by smathermather on June 25, 2012

I just can’t seem to get away from Bash. One day I promise to do the same work with Python, but for now, the following will take your directory of 3-band imagery, extract the first band, and if it succeeds, delete your original file. Seeing as I had almost 300GB of imagery, and very little space left, this kind of housekeeping was necessary (my other two bands were vestiges of a stage that used PNG as a temporary format– but each band was identical).

#!/bin/bash

for name in ????_???.tif; do                                 # e.g. "2765_560.tif"

    basename1="${name%.*}"                                   # Split off non-extension portion of name

    if [ -f "${basename1}_1.tif" ]                           # braces needed: $basename1_ would be a different (empty) variable
    then
      echo "${basename1}_1.tif exists"                       # skip output files if already created
    else
        echo "Converting ${basename1}_1.tif"                 # give a little feedback
        gdal_translate -b 1 "$name" "${basename1}_1.tif"     # use gdal_translate to extract just band 1

        if [ $? -lt 1 ]                                      # if exit status is < 1 (i.e. 0), then remove the original
        then
            echo "Conversion a success."
            echo "Removing original file."
            rm "$name"
        else
            echo "mehmeh.  try again."
        fi
    fi

done

Posted in GDAL | Tagged: , | Leave a Comment »

gdal_warp, cutlines, and cwhere– simple tip for use on Linux

Posted by smathermather on December 17, 2011

Mini GDAL tip of the day:

gdalwarp, especially in combination with gdal_merge, is a powerful tool for doing all sorts of nice aggregation (read: mosaic’ing) of spatial raster data.  Unfortunately, at least with a google search, there’s very little to be found demonstrating the use of queries in conjunction with cutlines, probably because in general these queries are not difficult to figure out.

In a previous post, I demonstrated using this to stitch together USGS quads.  I found out, when I tried to run this code on a Linux machine, that the command-line code was a little different.  Instead of:

gdal_merge -o usgs_merge.tif -createonly input1.tif input2.tif ...
gdalwarp -cutline quadrangle_index_24k.shp -cwhere "quadname_ = West_View" West_View.tif usgs_merge.tif

I found I needed some extra single quotes to get my string literal to not be interpreted as a field:


gdalwarp -cutline quadrangle_index_24k.shp -cwhere "quadname_ = 'West_View'" West_View.tif usgs_merge.tif

Ah, that’s better.  Now I have a nice big mosaic of DEMs for my region with perfect cuts where the overlaps used to be.  (The change is hard to see– look for single quotes around West_View in "quadname_ = 'West_View'".)
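If you have many quads to warp, the same quoting trick generalizes nicely in a loop.  A sketch — the quad names here are made up for illustration; the point is that double quotes let Bash expand $quad while the embedded single quotes survive intact for the -cwhere expression (commands are echoed rather than run):

```shell
# Hypothetical quad names; adjust to your quadrangle index's quadname_ values.
for quad in West_View East_View; do
    where="quadname_ = '$quad'"      # single quotes preserved inside the double-quoted string
    echo gdalwarp -cutline quadrangle_index_24k.shp -cwhere "\"$where\"" "$quad.tif" usgs_merge.tif
done
```

Drop the leading `echo` once the printed commands look right.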

Posted in GDAL | Tagged: , , , | Leave a Comment »

Debian Configuration– Tomcat on Boot, revision

Posted by smathermather on August 23, 2011

I revised my startup script for Tomcat to use a non-privileged user for security reasons. I used the following page as my resource:

http://linux-sxs.org/internet_serving/c140.html


#!/bin/sh
# /etc/init.d/tomcat6: start Tomcat6 server.

test -f /opt/apache-tomcat-6.0.32/bin/startup.sh || exit 0

PATH=$PATH:/opt/apache-tomcat-6.0.32/bin/

case "$1" in
  start)
    export JAVA_HOME=/opt/jre1.6.0_26/
    logger -t Tomcat6 "Starting Tomcat 6..."
    exec su - tomcat -c /opt/apache-tomcat-6.0.32/bin/startup.sh | logger -t Tomcat6
    ;;
  stop)
    export JAVA_HOME=/opt/jre1.6.0_26/
    logger -t Tomcat6 "Shutting down Tomcat 6..."
    exec su - tomcat -c /opt/apache-tomcat-6.0.32/bin/shutdown.sh | logger -t Tomcat6
    ;;
  *)
    echo "Usage: /etc/init.d/tomcat6 {start|stop}"
    exit 2
    ;;
esac
exit 0

Now, perhaps what I should be doing is modifying the skeleton file that’s in my /etc/init.d directory for a truly professional look/setup… .
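For what it’s worth, the skeleton approach mostly amounts to adding an LSB header block to the top of the script, which lets the init system know the script’s name and run levels.  A sketch based on the standard Debian skeleton — the Required-Start/Required-Stop dependency lines are assumptions, not something from this post:

```shell
### BEGIN INIT INFO
# Provides:          tomcat6
# Required-Start:    $remote_fs $network
# Required-Stop:     $remote_fs $network
# Default-Start:     2 3 4 5
# Default-Stop:      0 1 6
# Short-Description: Start/stop the Tomcat 6 servlet container
### END INIT INFO
```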

Posted in Other | Tagged: , , , , | Leave a Comment »

Debian Configuration– Tomcat on Boot

Posted by smathermather on January 24, 2011

Start-up scripts in Debian Linux aren’t exactly straightforward for the uninitiated.  Actually, if memory serves me, they aren’t any more straightforward on Ubuntu Linux either, but such is heredity.

We are transitioning some of our GeoServer instances over to 64-bit Debian Linux.  In my test Ubuntu environment, I had a hack in place to force the Tomcat Java servlet container to launch on startup, but it was a hack and not very deep in understanding.  Here, I will go somewhat deeper.

So, why not just use a package manager to install Tomcat, thus bypassing the need for setting up my own service?  GeoWebCache’s configuration guide discouraged me:

Due to licensing issues, many Linux distributions come with a variety of Java environments. Additionally, to minimize the chance of a security breach with default settings, most versions of Tomcat are configured to not allow access to the file system. Therefore, it is highly recommended to follow these instructions, even if you already have a servlet container set up.

Well, ok then.  I’ll roll my own.

Following the installation instructions for GeoWebCache is straightforward, but I have no interest in starting Tomcat manually, except when necessary.

Enter, the google:

I modified code from this source:

http://linux.derkeiler.com/Mailing-Lists/Debian/2008-12/msg00099.html

to get something that looks like this as my script to run a Tomcat6 startup shell:


#!/bin/sh
# /etc/init.d/tomcat6: start Tomcat6 server.

test -f /opt/apache-tomcat-6.0.30/bin/startup.sh || exit 0

case "$1" in
  start)
    export JAVA_HOME=/opt/jre1.6.0_23/bin/
    logger -t Tomcat6 "Starting Tomcat 6..."
    exec /opt/apache-tomcat-6.0.30/bin/startup.sh | logger -t Tomcat6
    ;;
  stop)
    export JAVA_HOME=/opt/jre1.6.0_23/bin/
    logger -t Tomcat6 "Shutting down Tomcat 6..."
    exec /opt/apache-tomcat-6.0.30/bin/shutdown.sh | logger -t Tomcat6
    ;;
  *)
    echo "Usage: /etc/init.d/tomcat6 {start|stop}"
    exit 2
    ;;
esac
exit 0

I still need to deploy this as a service. The first step is to sudo or su root and place this in my /etc/init.d directory:

sudo cp tomcat6.sh /etc/init.d/tomcat6

Now this needs to be loaded as a service.  Normal multi-user run levels for Debian are 2-5.  0, 1, and 6 are “System Halt”, “Single User Mode”, and “Reboot”, so we set the start flag to run in the multi-user run levels, and the stop flag in the other run levels, like reboot.

update-rc.d tomcat6 start 99 2 3 4 5 . stop 99 0 1 6 .
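Under the hood, update-rc.d just creates S99/K99 symlinks (named after the init script, tomcat6 in this post) in the per-run-level rc*.d directories.  Sketched here with echo rather than touching a real /etc:

```shell
# Sketch only: print the symlinks update-rc.d would create for
# "start 99 2 3 4 5 . stop 99 0 1 6 ."
for level in 2 3 4 5; do
    echo "/etc/rc$level.d/S99tomcat6 -> ../init.d/tomcat6"   # S = start on entering run level
done
for level in 0 1 6; do
    echo "/etc/rc$level.d/K99tomcat6 -> ../init.d/tomcat6"   # K = kill on entering run level
done
```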

Ah, now it works like a charm.  FYI, I did my testing of the script while logged in as root, so I wouldn’t have to restart with every change.  I figured being logged in as root would simulate, pretty closely, the boot environment, relative to being an ordinary user.

Posted in Other | Tagged: , , , , , , | 3 Comments »

LiDAR processing and analysis in PostGIS (I hope…).

Posted by smathermather on January 1, 2010

Alright, so let’s begin again with some LiDAR processing in PostGIS.  I really liked this discussion of multipoint conversions of LiDAR and other point datasets, but we won’t address making things more efficient here (yet).  Besides, these aren’t just points– we do want to retain their attribution… .

Now, any regular database user will recognize my nubeness reading this post.  If you are that person, please comment.  I’m looking to learn… .

In the LiDAR dataset I have, there are elevations for all points: ground points, vegetation points, building points, etc.  I’d like to know about how tall my vegetation is by calculating the difference between the height of the vegetation point and the nearest ground point.  So, I need to know what the nearest ground point is, and difference the two heights.  Since I am interested in the nearest ground point that is as close to directly beneath my vegetation point as possible, we’ll compute nearest neighbor in two dimensions.

I’m going to, once again, take advantage of the resources at BostonGIS, and their Nearest Neighbor function.  Install and use according to their directions… .

So, I’ve downloaded the state LiDAR dataset for an Ohio county, and want to do some processing on it.  It does contain some fields in its original LAS form that we might want to keep, specifically (looking at our statement to create the table):

CREATE TABLE lidar
(
 x numeric,
 y numeric,
 z numeric,
 intensity integer,
 tnumber integer,
 number integer,
 class integer,
 id integer,
 vnum integer
)
WITH (OIDS=FALSE);
ALTER TABLE lidar OWNER TO postgres;

I convert from las to csv with the las2txt utility (as before):

#!/bin/bash
x=0
for f in *.las; do
 x=`expr $x + 1`
 echo $x $f
 las2txt --parse xyzinrcpM -sep komma $f $f.csv
done

I want to copy from CSV using local user permissions.  It’s a little less efficient to do this than to have the database directly read from a CSV file, but it has the advantage that I don’t have to muck about giving my database super user file level permissions on my OS.  With the following steps, I’m doing a sequential test of each step, so while the last step converted all my las files to CSV, I’m still exploring this process with a single file.

So, creating a script file called copy.txt with the following contents

\copy lidar from '/Users/blaa/Documents/GIS/Vinton/OSIP_VIN_LiDAR/s2035470.las.csv' with csv

We load a single block of LiDAR data in:

psql -U postgres -d Vinton -f copy.txt

(note, we could have used a pipe to accomplish this)
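That pipe variant, sketched as a dry run — the command is echoed rather than executed, since it assumes a running database; the filename matches the copy.txt example:

```shell
# \copy reads standard input when given "from stdin", so no script file
# (and no absolute path inside it) is needed.
csv=s2035470.las.csv
echo "psql -U postgres -d Vinton -c \"\\copy lidar from stdin with csv\" < $csv"
```

Remove the `echo` wrapper to actually run it.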

Now to add a geometry field, and populate that with the info in our x and y fields (remember, I don’t want these to be 3D points, but 2D points with a z attribute).

SELECT AddGeometryColumn('public','lidar','the_geom',91023236,'POINT', 2);
UPDATE lidar SET the_geom = ST_SetSRID(ST_MakePoint(x,y),91023236);

It’d be good to have a spatial index and primary key… .

CREATE INDEX lidar_the_geom_idx ON lidar USING gist(the_geom);
ALTER TABLE lidar ADD PRIMARY KEY (vnum);

And view in uDig:

OK.  Now, I know that class 1 is ground and class 5 is vegetation, so, to simplify future processing, I will split this out into two tables (although subselects or views might work just as well here… ).

CREATE TABLE lidar_veg AS
SELECT * FROM lidar WHERE class = 5;


CREATE TABLE lidar_ground AS
SELECT * FROM lidar WHERE class = 1;



CREATE INDEX lidar_veg_the_geom_idx ON lidar_veg USING gist(the_geom);
CREATE INDEX lidar_ground_the_geom_idx ON lidar_ground USING gist(the_geom);
ALTER TABLE lidar_veg ADD PRIMARY KEY (vnum);
ALTER TABLE lidar_ground ADD PRIMARY KEY (vnum);

The nearest neighbor function is “pgis_fn_nn”, and takes a few options.  We’ll try this first with just one point.  The vegetation point is in the middle of these 4.  There are ground points all around:

SELECT pgis_fn_nn(b.the_geom, 20,1,4,'lidar_ground','true','vnum','the_geom') FROM lidar_veg AS b WHERE b.vnum = 576455;
pgis_fn_nn    
------------------
 (576457,4.31045)
(1 row)

Now, I need to capture that point, find out its z coordinate, and subtract that from the z coordinate of my vegetation point… .  Not tonight… .
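For the impatient, a hedged sketch of where this is headed.  The query below is untested and assumes pgis_fn_nn returns the BostonGIS composite type with an nn_gid field holding the neighbor’s key; it is written as a heredoc so it could be piped to psql the same way as the earlier steps:

```shell
# Untested sketch: subtract the nearest ground z from each vegetation z.
# nn_gid as the neighbor-key field is an assumption from the BostonGIS docs.
cat <<'SQL'
SELECT v.vnum,
       v.z - g.z AS veg_height
FROM lidar_veg v
JOIN lidar_ground g
  ON g.vnum = (SELECT (pgis_fn_nn(v.the_geom, 20, 1, 1,
                       'lidar_ground','true','vnum','the_geom')).nn_gid
               LIMIT 1);
SQL
```

Pipe the output to `psql -U postgres -d Vinton` once it checks out on a small sample.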

Posted in Database, PostGIS, SQL | Tagged: , , , , , , | 20 Comments »

GDAL Contours (cont.)

Posted by smathermather on December 21, 2009

Well, I made some mistakes in the last post, not the least of which is that I used the wrong flag for creating an attribute field with elevation.  What follows is a little more sophisticated: it takes a series of DEM tiles from which I want 2-foot and 5-foot contours (using gdal_contour), and then dumps those shapefiles into PostgreSQL using shp2pgsql.

First we prep the database.  Again, I used pre-built binaries from Kyng Chaos.

I created a database called “Vinton” using my favorite PostgreSQL management tool, PGAdminIII (actually, it’s the only one I’ve ever tried…).

So, let’s make Vinton spatially aware:

sudo su - postgres -c '/usr/local/pgsql/bin/createlang plpgsql Vinton'
sudo su - postgres -c '/usr/local/pgsql/bin/psql -d Vinton -f /usr/local/pgsql/share/contrib/postgis.sql'
sudo su - postgres -c '/usr/local/pgsql/bin/psql -d Vinton -f /usr/local/pgsql/share/contrib/spatial_ref_sys.sql'

Vinton county is in Ohio State Plane South.  Our dataset is a foot version of the State Plane South HARN.  Looking it up on SpatialReference.org, I get a description not unlike the PostGIS insert statement below, but I changed the EPSG number from 102323 to 1023236 (arbitrarily), and changed the units to feet (from the default for HARN, which is meters):

UNIT["Foot_US",0.30480060960121924]

so that the full insert is:

INSERT into spatial_ref_sys (srid, auth_name, auth_srid, proj4text, srtext) values ( 91023236, 'esri', 1023236, '+proj=lcc +lat_1=38.73333333333333 +lat_2=40.03333333333333 +lat_0=38 +lon_0=-82.5 +x_0=600000 +y_0=0 +ellps=GRS80 +units=m +no_defs ', 'PROJCS["NAD_1983_HARN_StatePlane_Ohio_South_FIPS_3402",GEOGCS["GCS_North_American_1983_HARN",DATUM["NAD83_High_Accuracy_Regional_Network",SPHEROID["GRS_1980",6378137,298.257222101]],PRIMEM["Greenwich",0],UNIT["Degree",0.017453292519943295]],PROJECTION["Lambert_Conformal_Conic_2SP"],PARAMETER["False_Easting",600000],PARAMETER["False_Northing",0],PARAMETER["Central_Meridian",-82.5],PARAMETER["Standard_Parallel_1",38.73333333333333],PARAMETER["Standard_Parallel_2",40.03333333333333],PARAMETER["Latitude_Of_Origin",38],UNIT["Foot_US",0.30480060960121924],AUTHORITY["EPSG","1023236"]]');

Now let’s create some tables to house our future contour data:

CREATE TABLE contours_2
(
 gid serial NOT NULL,
 id integer,
 elev numeric,
 the_geom geometry,
 CONSTRAINT contours_2_pkey PRIMARY KEY (gid),
 CONSTRAINT enforce_dims_the_geom CHECK (st_ndims(the_geom) = 2),
 CONSTRAINT enforce_geotype_the_geom CHECK (geometrytype(the_geom) = 'MULTILINESTRING'::text OR the_geom IS NULL),
 CONSTRAINT enforce_srid_the_geom CHECK (st_srid(the_geom) = 91023236)
)
WITH (OIDS=FALSE);
ALTER TABLE contours_2 OWNER TO postgres;

CREATE TABLE contours_5
(
 gid serial NOT NULL,
 id integer,
 elev numeric,
 the_geom geometry,
 CONSTRAINT contours_5_pkey PRIMARY KEY (gid),
 CONSTRAINT enforce_dims_the_geom CHECK (st_ndims(the_geom) = 2),
 CONSTRAINT enforce_geotype_the_geom CHECK (geometrytype(the_geom) = 'MULTILINESTRING'::text OR the_geom IS NULL),
 CONSTRAINT enforce_srid_the_geom CHECK (st_srid(the_geom) = 91023236)
)
WITH (OIDS=FALSE);
ALTER TABLE contours_5 OWNER TO postgres;

And finally, we can loop through all of our files, and dump the results first to a shapefile, then pipe that from shp2pgsql directly into the database:

#!/bin/bash
x=0
mkdir -p 5foot                           # gdal_contour won't create the output directory itself
for f in *.txt; do
 x=`expr $x + 1`
 echo $x $f
 echo Two Foot:
 gdal_contour -i 2 -a elev $f $f.shp
 /usr/local/pgsql/bin/shp2pgsql -s 91023236 -a $f.shp contours_2 | /usr/local/pgsql/bin/psql -U postgres -d Vinton
 echo Five Foot:
 gdal_contour -i 5 -a elev $f 5foot/$f.shp
 /usr/local/pgsql/bin/shp2pgsql -s 91023236 -a 5foot/$f.shp contours_5 | /usr/local/pgsql/bin/psql -U postgres -d Vinton
done

Now, I’ll probably convert all those numeric contour fields to integer with a cast statement, but tonight, I let the little laptop do the work… .
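That cast, for the record, would look something like this — echoed via a heredoc rather than run, and a sketch only:

```shell
# Sketch: convert the elev columns from numeric to integer in place.
cat <<'SQL'
ALTER TABLE contours_2 ALTER COLUMN elev TYPE integer USING elev::integer;
ALTER TABLE contours_5 ALTER COLUMN elev TYPE integer USING elev::integer;
SQL
```

Pipe it to `psql -U postgres -d Vinton` when ready.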

Posted in Database, GDAL, PostGIS, SQL | Tagged: , , , , , , | Leave a Comment »

GDAL Contours

Posted by smathermather on December 20, 2009

Just another quick vignette.  From the Ohio Statewide Imagery Program (OSIP) there is a 1-meter DEM for all of Ohio.  To get contours from this dataset, one approach is to use GDAL tools, i.e. gdal_contour.  As I’m working on a Mac today, I used Kyng Chaos pre-compiled Frameworks:

http://www.kyngchaos.com/software:frameworks

Then I needed to update my path variable in the BASH shell:

export PATH=/Library/Frameworks/GDAL.framework/Programs:$PATH

Now we can loop through, again, inelegantly (we’ll get unwanted extensions, i.e. *.txt.shp), and create 1-ft contours for a whole county.  Why 1-foot?  (The OSIP dataset is meant for 2-foot contours.)  Well, 1-foot so that later, if I want 5-foot contours, I can have them… .

#!/bin/bash
x=0
for f in *.txt; do
 x=`expr $x + 1`
# echo number of file we are on + name of file
 echo $x $f
# perform analysis, output to *.shp
 gdal_contour -i 1 -nln elev $f $f.shp
done

And the stdout and stderr:

1 S1890435.txt
0...10...20...30...40...50...60...70...80...90...100 - done.
2 S1890440.txt
0...10...20...30...40...50...60...70...80...90...100 - done.
3 S1890445.txt
0...10...20...30...40...50...60...70...80...90...100 - done.
4 S1890450.txt
0...10...20...30...40...50...60..

etc...

Posted in GDAL | Tagged: , , , , | Leave a Comment »

LiDAR data in BASH using libLAS

Posted by smathermather on December 12, 2009

At home, I work on a Mac, so when I want to do work here, my scripting language changes.  In this case, I prefer BASH as my control script.  I downloaded libLAS and did the standard Unix ./configure, make, sudo make install dance in the decompressed libLAS directory:

./configure
make
sudo make install

Now I have some tools to manipulate LiDAR las files at home.  To convert all my las files to text:

#!/bin/bash
for f in *.las; do
 las2txt --parse xyzinrcpM --header pound $f $f.txt
done

There are more elegant ways to implement this, but it gets the job done… .  Now I have a few hundred text files with all the relevant LiDAR info in them for manipulation.  Since working with the whole enchilada in PostGIS failed, I think I’ll approach it by loading one section at a time, analyzing, and removing those data from the table.  More to come… .
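The load-analyze-remove cycle hinted at could be sketched like this — a dry run with echo, placeholder tile names, and a placeholder analysis step, so nothing touches a real database:

```shell
# Dry-run sketch of processing one tile at a time.
for f in tile_a.las.txt tile_b.las.txt; do      # placeholder tile names
    echo "\\copy lidar from '$f' with csv"      # load one tile into the table
    echo "-- analysis queries for $f go here"   # placeholder for the real work
    echo "TRUNCATE lidar;"                      # clear the table before the next tile
done
```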

Posted in Other | Tagged: , , | Leave a Comment »