GRASS GIS: raster, vector, and imagery analysis

GRASS Example Image: Flow Accumulation

Hydrologic Modeling
GRASS Example Image: Surface Curvatures
Surface Curvatures
GRASS Example Image: 3D Visualization of Data
3D Visualization

A geographic information system (GIS) can be a useful tool for describing, modeling, and interpreting data with a spatial reference. Unlike commercial GIS software, GRASS is free, and its source code is available for review, modification, or updating. Soil scientists interested in using digital soil survey information, or in 3D visualization of soils data, may find that GRASS fits their needs. In addition, GRASS contains hundreds of raster-based operations, such as DEM creation, hydrologic modeling, and solar radiation modeling.

If you have a Macintosh, installation of GRASS is quite simple. Please see Lorenzo Moretti's great website with instructions and an installer: here.

If you have a UNIX machine, it is possible to install a pre-compiled binary version of GRASS. However, pre-compiled packages are not always up to date. For the latest updates, I would suggest compiling GRASS from source code.

GRASS6 in a nutshell by Markus Neteler
Links from OSS for Soil Scientists Class

Compiling from Source Code (notes)

The GRASS Logo

The official instructions from the GRASS Wiki can be found here. My notes on compiling (specific to my requirements) are listed below. Note that you will need to adjust the flags in the configure steps according to your system.

Special Note when compiling GDAL with the LizardTech DSDK:

  • be sure to use the full path to the DSDK, i.e. don't use shell expansion shortcuts (thanks Hobu)
  • A simple modification to the DSDK source must be performed for a successful build (see notes...)


  1. compile the source (see notes below); on Debian this isn't too hard, as most of the dependencies can be met with an 'aptitude install xxx-dev'
  2. get a sample dataset, either from the GRASS website, or I would be happy to provide you with some data from Yolo County (which can of course be obtained for free)
  3. the GRASS Book is a helpful reference, but it is based on the older branch of GRASS, version 5.3/5.4, and thus the vector model is a little out of date. The GRASS mailing list is also a great place to get help.

Here are some of my notes on installing it from source:

  1. Install some required packages: these are the names for Debian-based systems
    • compiler and build tools: gcc g77 g++ flex bison m4 make libcfitsio-dev
    • database includes: libsqlite3-dev libmysqlclient14-dev libwrap0-dev (postgresql-includes)
    • GDAL-specific: libhdf4g-dev python-numeric
    • GRASS-specific: libncurses5-dev zlib1g-dev libreadline5-dev libfreetype6-dev libtiff4-dev libpng3-dev tcl8.3-dev tk8.3-dev (glutg3-dev libgle3-dev libgl1-mesa-dev) or (nvidia-glx-dev) fftw-dev
    • requires: libxmu-headers libxmu-dev libxxf86vm-dev libxmuu-dev
  2. Get and install PROJ4 from source (simple compile)
    • Don't forget to unzip the datum grid file in the 'nad' folder before the ./configure step.
  3. Get and install the latest GEOS from source (simple compile)
  4. Get and install the latest GDAL and GDAL-GRASS plugin (relatively simple compile)
    ## 32-bit mode i7:
    export CHOST="i686-pc-linux-gnu"
    export CFLAGS="-O2 -pipe -march=native -msse4 -fomit-frame-pointer"
    export CXXFLAGS="${CFLAGS}" 
    ## 64-bit mode PowerPC
    #optimization from powerpc64:
    export CFLAGS="-mpowerpc64 -O2" 
    export CXXFLAGS="-mpowerpc64 -O2"
    export FCFLAGS="-O2 -mpowerpc64"
    export OBJCFLAGS="-O2 -mpowerpc64"
    export FFLAGS="-O2 -mpowerpc64"

    Special Note when compiling GDAL with NetCDF Support:

    I usually use the following HDF4 configure script:

    ./configure --disable-netcdf --disable-fortran --prefix=/usr/local

    I usually use the following GDAL configure script:

    ./configure --with-png=internal --with-jpeg=internal \
    --with-gif=internal --with-libtiff=internal --with-geotiff=internal \
    --with-netcdf --with-hdf4 \
    --without-ogdi --with-python --with-sqlite3 --with-spatialite \
    --with-threads=yes --without-grass \
    --with-mrsid=/home/dylan/src/Geo_DSDK- --with-opencl --with-curl

    Note that you will need to leave the LizardTech DSDK folder in the exact same location for subsequent compilation of GRASS.

  5. Checkout the latest development version of GRASS 6 via Subversion:
  6. Check out the source, or update an existing working copy:
    # check out the software:
    svn co grass6_dev/
    # or update a local copy:
    svn up

    I usually use the following configure script:

    ./configure --with-tcltk-includes=/usr/include/tcl8.4 \
    --with-postgres --without-odbc --with-mysql \
    --with-mysql-includes=/usr/include/mysql/ \
    --with-freetype --with-freetype-includes=/usr/include/freetype2 \
    --with-readline --with-cxx --enable-largefile \
    --with-postgres-includes=/usr/local/pgsql/include/ \
    --with-postgres-libs=/usr/local/pgsql/lib/ --with-sqlite --with-python \
    --with-proj-share=/usr/local/share/proj/ \
    --with-cairo --with-pthread --with-wxwidgets
  7. Get and install the GDAL-GRASS plugin: GRASS Wiki Page on this topic
  8. Here are the steps that I use:

    #link GRASS libs to the /usr/local/lib/ folder
    cd /usr/local/lib/ ; sudo ln -s /usr/local/grass-6.5.svn/lib/*.so .
    #return to the gdal-grass plugin source directory:
    ./configure --with-grass=/usr/local/grass-6.5.svn/
    make clean && make
    #create a gdal-plug-ins folder
    sudo mkdir /usr/local/lib/gdalplugins
    #copy the compiled plugins to the gdalplugins folder
    sudo cp *.so /usr/local/lib/gdalplugins/
  9. Finally, you might need to update your /etc/ to include the GRASS libs.
  10. Here is my /etc/. Don't forget to run ldconfig after changing this file!



Adding GRASS vectors to POVRAY Scenes: v.out.pov
GRASS Vectors
povray clipped ssurgo data
SSURGO cutaway
povray slope class cutaway image
Slope class cutaway
PINN Topo-map with POVRAY
McCabe Canyon and Temblor Formation
View of Mt. Defiance from Bear Valley
Near infrared
Over PINN: Geomorphic Features
Geomorphic features.
View of Mt. Defiance from Bear Valley. 2 meter true color
View of Mt. Defiance from Bear Valley, Pinnacles National Monument. Pan Chromatic 2m res.

Thanks to Markus Neteler for initial insight on how to build an appropriate POVRAY script file.

GRASS pre-processing

 # C/O  Markus Neteler
# modified by Dylan Beaudette for UTM projections
# 8/22/2005

#GRASS procedures:

# Note that elevation and overlay must be the same size (dx, dy)

# set the region:
g.region res=4

# dem export cubic-resample from 10m to 4m resolution:
r.resamp.interp method=cubic in=elev_meters out=elev.4m

# reset resolution to match resampled DEM
# this might be required if your DEM does not extend beyond the extent of the resampling step
# r.resamp.interp will automatically extend the region when performing resampling, if the raster is large enough
# g.region rast=elev.4m
r.out.pov map=elev.4m tga=elev.4m.tga

# overlay export
r.out.png in=ikonos.rgb out=ikonos.4m.png

# Render with POVRAY:
povray +Ipinn.pov +Opinn.png +D +P +W800 +H800 +A


 #include ""
#include ""
#include ""
#include ""

#declare Sun = color rgb <0.8, 0.8, 0.8>;
#declare Scale = 1;

light_source { < 667167, 70000, 4040564 > color White }

fog {
 fog_type 2
 fog_offset 50
 fog_alt 65
 color rgb <0.80, 0.88, 0.82>
 distance 80
 up y }

camera {
   //looking at Mt. Defiance from Bear Valley
   location <667101.25310945, 600, 4042753.78420398 > //easting, elev, northing
   look_at < 664250.03420398, 618, 4037343.47947761 > //easting, elev, northing
   //looking at McCabe Canyon
   //location < 667259.00497512, 2000, 4036823.48258706 > //easting, elev, northing
   //look_at < 665225.75870647, 300, 4041018.51368159> //easting, elev, northing
   angle  60 // vertical camera aperture
}

height_field  {
   tga "elev.4m.tga"
   texture {
      pigment {
          image_map { // image is always projected from -z, with front facing +z, top to +Y
             png "ikonos.4m.png"        //our overlay image
//           png "tci.4m.png"
          }
          rotate x*90 // align map to height_field
      }
      finish {
          ambient 0.2         // Very dark shadows
          diffuse 0.8         // Whiten the whites
          phong 0.2           // shiny
          phong_size 100.0    // with tight highlights
          specular 0.5
          roughness 0.05
      }
   }
   scale < 9270, 65535, 9395 > // dx,dz,dy of extent in meters
   translate  < 660575, 0, 4036730> // west_corner, 0, south_corner
}

// sky: reconstructed wrapper -- the original fragment only preserved the pigment
sky_sphere {
    pigment {planar colour_map{[0,rgb <0.9,0.9,1>][1, rgb <0.1,0.2,1>]}}
    scale 2 //avoid repetition
    translate 300  //dx, dz, dy
}

Using QGIS as an alternate GUI for GRASS

QGIS directly accessing GRASS data
QGIS Attribute Table
Accessing attribute data in QGIS
The QGIS GRASS Toolkit

My Notes for an Install on Debian/Linux
Note that any old gdal-include files or libraries will cause problems in the make step.

Getting Started

  • install the latest GEOS for the latest build of QGIS.
  • install the xorg-dev and xserver-xorg-dev packages

Related Links:

My build method: (you might want a different branch)

# checkout source
svn co qgis-unstable
cd qgis-unstable/

# configure to use GRASS and a custom install of PostgreSQL
# don't build bindings... something weird with debian/unstable
cmake -D GRASS_PREFIX=/usr/local/grass-6.5.svn/ -D WITH_BINDINGS=OFF -D POSTGRES_CONFIG=/usr/local/pgsql/bin/pg_config .

# if you have a multiprocessor machine, use a parallel build
make -j4
sudo make install

Some troubleshooting tips:

  • Run ldd /usr/local/bin/qgis | grep "not found" to test for missing libs. If any are missing, try running sudo ldconfig to fix things.
  • To add a custom SRID to your QGIS install, add the following to the srs SQLite db:
    INSERT INTO "tbl_srs" VALUES(9001,'CaSoilResource','aea','GRS80','+proj=aea +x_0=0.0 +y_0=0 +lon_0=-96 +lat_0=40.0 +lat_1=20 +lat_2=60.0 +datum=NAD83 +ellps=GRS80 +units=m +no_defs',100000,9001,0);
  • do NOT compile the source code in the root working copy directory

Importing Various Types of Vector and Raster Data

Generic data import and export commands rely on the GDAL/OGR library being installed correctly. If you have installed GRASS via a binary package, chances are that these commands will work fine. If you have compiled GRASS and GDAL from source, and have not re-compiled GDAL with GRASS support, then raster export via r.out.gdal will not work. See this page for details.

The following examples assume that all of the input data has been projected to a consistent coordinate system, and that an appropriate GRASS location has been created. Command names are in boldface type, input files are red, and the resulting GRASS data are in green. Figures are provided for additional context, and are referenced by input/output file name in the example code.

Vector Import

  • Import a SSURGO survey boundary polygon shapefile from the current directory:
  • v.in.ogr dsn=. layer=soilsa_a_xxx output=boundary

Raster Import

  • Import a DOQQ file from the current directory:
  • r.in.gdal input=doqq.tif output=doqq

Text File Import

  • Import an ASCII file of x,y,z records from a GPS:
  • #input_file: meters_x | meters_y | meters_z
    v.in.ascii -z in=input_file output=gps_points
  • Import an ASCII file with variable delimiters:
  • # reformat input text file to single-space delimited
    awk '{print $1, $2, $3}' classes/summer_2005/SSC105/maps/pit_locations.xy | v.in.ascii fs=" " x=1 y=2 out=ssc105s
    # export to ESRI shapefile with projection stored in .prj file
    v.out.ogr -e --v type=point in=ssc105s dsn=ssc105s.shp

GPS Import

  • Import track data from a Garmin serial GPS. Note that gpsbabel will download points in lon/lat WGS84 format (gpsbabel must be installed). More ideas here...
    v.in.gpsbabel -t in=/dev/ttyS0 out=gps_tracks format=garmin

Vector Operations

A nice introduction to the GRASS vector model can be found here.

Importing and Exporting from/to a Garmin GPS

Importing and exporting GRASS vector data from/to a GPS unit can be done in many ways. Modules such as v.in.garmin and v.in.gpsbabel work well for importing track and waypoint data from a GPS; note, however, that they require points and track data to be stored in decimal degree format. A simpler approach for dealing with both import and export of vector data from GRASS can be done with a little bit of awk and the program gpstrans. A few common examples are listed below. Command names are in boldface type, input files are red, and the resulting GRASS data are in green.




# garmin-specific notes here

1. download data via gpsbabel to GPX:
 - import with v.in.gpsbabel
 - or convert directly to shp with gps2shp (source)

Samples from the command line.

# download waypoints from garmin to GPX file
gpsbabel -i garmin -f /dev/ttyS0 -o gpx -F garmin_points.gpx

# download track from garmin to GPX file
gpsbabel -t -i garmin -f /dev/ttyS0 -o gpx -F garmin_track.gpx

# upload to garmin GPS
gpsbabel -i gpx -f garmin_points.gpx -o garmin -F /dev/ttyUSB0


Import all way points with labels starting with "0":

#download all waypoints from gps
gpstrans -dw > all_wpts_5-3.ascii
#output from gpstrans looks like:

Format: UTM  UTC Offset:  -7.00 hrs  Datum[066]: NAD83
W       001     01-MAY-06 22:58                                 12/31/1989 -7:00:00     10      S       0662251 4039089
W       002     01-MAY-06 23:20                                 12/31/1989 -7:00:00     10      S       0661915 4039282
W       003     01-MAY-06 23:36                                 12/31/1989 -7:00:00     10      S       0662224 4039287
W       004     02-MAY-06 00:06                                 12/31/1989 -7:00:00     10      S       0662002 4039768

#import only those which start with "0" into GRASS
awk '{OFS="|"} $2 ~ /^0.?/{print $9,$10,$2}' all_wpts_5-3.ascii | v.in.ascii out=new_vector columns="x int, y int, id varchar(3)"

Export from points to Garmin GPS:

#export with a "P" prefix to point names
v.out.ascii spring3 dp=0 | \
awk -F"|" '
BEGIN { OFS="\t"; print "Format: UTM  UTC Offset:  -7.00 hrs  Datum[066]: NAD83" }
{ print "W\tP"$3"     \t28-MAR-07 20:35                         \t12/31/1989 -7:00:00\t10\tS\t0"$1"\t"$2 }
' > garmin_points.waypts
#upload new waypoints to gps
gpstrans -uw garmin_points.waypts

Export new route to Garmin GPS:

##now make a route file, using points P1 --> P6
v.out.ascii pygmy_route_points \
| awk -F"|" 'BEGIN{OFS="\t"; print "Format: UTM UTC Offset: -7.00 hrs Datum[066]: NAD83\nR\t0\t________________"}
{split($1,easting,"."); split($2,northing,".");
print "W","P"$3,"02-MAY-06 17:19","12/31/1989 -7:00:00","10","S","0"easting[1],northing[1]}' > pygmy_route_points.route
#upload new waypoints to gps
gpstrans -ur pygmy_route_points.route

Raster profile along arbitrary line segments

A simple experiment illustrating the calculation of elevation profiles along arbitrary line segments using a combination of GRASS and R. Fractal dimension was calculated for each profile by spectral decomposition (FFT), as suggested in (Davis, 2002). Note that the elevation profiles are calculated in a rather non-standard way, as elevation is plotted against the 3D distance along the original line segment. This means that the resulting transect length calculations are sensitive to the resolution of the raster (or the distance between sample points) across a rough surface: smaller grid spacing will result in a longer effective (3D) transect length.
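The 2D-vs-3D length effect can be sketched with a quick awk one-liner. The pipe-delimited x|y|z points below are hypothetical stand-ins for v.out.ascii output:

```shell
# Hypothetical x|y|z points (the format produced by v.out.ascii);
# compare planimetric (2D) against along-surface (3D) transect length.
printf '0|0|100\n4|0|102\n8|0|100\n12|0|103\n' | \
awk -F'|' 'NR > 1 {
  dx = $1 - px; dy = $2 - py; dz = $3 - pz
  d2 += sqrt(dx*dx + dy*dy)            # 2D (horizontal) distance
  d3 += sqrt(dx*dx + dy*dy + dz*dz)    # 3D (along-surface) distance
}
{ px = $1; py = $2; pz = $3 }          # remember previous point
END { printf "2D: %.2f m  3D: %.2f m\n", d2, d3 }'
```

Across a rough surface, denser sampling contributes more non-zero dz terms, so the 3D length keeps growing while the 2D length stays essentially fixed.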


  1. Davis, J.C. Statistics and Data Analysis in Geology John Wiley and Sons, 2002

Sample profile creation I: Vector representation of 2 transects performed by an adventurous soil scientist near McCabe Canyon, Pinnacles National Monument.

Create 2 profiles in GRASS

 # digitize some new lines in an area of interest
v.digit -n map=valley bgcmd="d.rast elev_meters"
v.digit -n map=mtns bgcmd="d.rast elev_meters"

# convert to points
v.to.points -i -v -t in=valley out=valley_pts type=line dmax=50
v.to.points -i -v -t in=mtns out=mtns_pts type=line dmax=50

# set resolution to match DEM
g.region -pa res=4

# extract elevation at each point
v.drape in=valley_pts out=valley_pts_3d type=point rast=elev_4m method=cubic
v.drape in=mtns_pts out=mtns_pts_3d type=point rast=elev_4m method=cubic

# export to text files
v.out.ascii valley_pts_3d > valley_pts_3d.txt
v.out.ascii mtns_pts_3d > mtns_pts_3d.txt

# start R

Sample profile creation II: Elevation profiles for the paths taken by the soil scientist here.

Process profile data, and plot in R

# read in the data, note that there is no header, and the columns are pipe delimited
valley <- read.table('valley_pts_3d.txt', header=F, sep="|", col.names=c('easting','northing','elev','f'))
mtns <- read.table('mtns_pts_3d.txt', header=F, sep="|", col.names=c('easting','northing','elev','f'))

# compute the length minus 1 ; i.e. the number of points - 1
# use for the lagged distance calculation
s.valley <- length(valley$easting) - 1
s.mtns <- length(mtns$easting) - 1

# calculate the lagged distance along each component of the 3D vector
dx.valley <- valley$easting[2:(s.valley+1)] - valley$easting[1:s.valley]
dy.valley <- valley$northing[2:(s.valley+1)] - valley$northing[1:s.valley]
dz.valley <- valley$elev[2:(s.valley+1)] - valley$elev[1:s.valley]

dx.mtns <- mtns$easting[2:(s.mtns+1)] - mtns$easting[1:s.mtns]
dy.mtns <- mtns$northing[2:(s.mtns+1)] - mtns$northing[1:s.mtns]
dz.mtns <- mtns$elev[2:(s.mtns+1)] - mtns$elev[1:s.mtns]

# combine the vector components
# 3D distance formula
c.valley <- sqrt( dx.valley^2 + dy.valley^2 + dz.valley^2)
c.mtns <- sqrt( dx.mtns^2 + dy.mtns^2 + dz.mtns^2)

# create the cumulative distance along line:
dist.valley <- cumsum(c.valley[1:s.valley])
dist.mtns <- cumsum(c.mtns[1:s.mtns])

# plot the results
plot(dist.valley, valley$elev[1:s.valley], type='l', xlab='Distance from start (m)', ylab='Elevation (m)', main='Profile')
plot(dist.mtns, mtns$elev[1:s.mtns], type='l', xlab='Distance from start (m)', ylab='Elevation (m)', main='Profile')

Sample profile creation III: Spectral decomposition (by FFT) for each profile.

Compute fractal dimension of each transect by FFT

 #compute the fractal dimension via [1]
s.valley <- spectrum(valley$elev[1:s.valley])
s.mtns <- spectrum(mtns$elev[1:s.mtns])

# compute linear model of log(power) ~ log(1/frequency)
# use the slope estimate in [2]

# first define the model formulas: for plotting and for lm()
fm.valley <- log(s.valley$spec) ~ log(1/s.valley$freq)
fm.mtns <- log(s.mtns$spec) ~ log(1/s.mtns$freq)

# next compute the least-squares regression lines via lm()
lm.valley <- lm(fm.valley)
lm.mtns <- lm(fm.mtns)

# extract the slope of each regression line:
slope.valley <- coef(lm.valley)[2]
slope.mtns <- coef(lm.mtns)[2]

# compute Df - the fractal dimension from [3]
Df.valley <- (5 - slope.valley) / 2
Df.mtns <- (5 - slope.mtns) / 2

# plot the figures
plot(fm.valley, main=paste("Valley Profile\nFractal Dimension: ", round(Df.valley,3), sep=''), xlab='Log(1/frequency)', ylab='Log(Power)' )

plot(fm.mtns, main=paste("Mountain Profile\nFractal Dimension: ", round(Df.mtns,3), sep=''), xlab='Log(1/frequency)',  ylab='Log(Power)')

Spatial Clustering of Point Data: Spearfish Example

This example uses the 'Partitioning Around Medoids (PAM)' algorithm (Kaufman and Rousseeuw, 2005) to divide a number of point observations into k clusters, based on their spatial attributes only. An extension of this concept can be applied to any type of geographic data, such as terrain attributes.

For a simple comparison of some of the partitioning-style clustering algorithms in R, see this page of demos. For a more complete listing of clustering approaches in R, see the Cluster Task View.
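The core of any partitioning method like PAM is the assignment step: each point joins its nearest medoid. A minimal awk sketch with two hypothetical, fixed medoids (real PAM alternates this assignment with a medoid-swapping step that minimizes total dissimilarity):

```shell
# Assign each hypothetical point "x y" to the nearest of two fixed medoids.
printf '1 1\n2 1\n9 8\n8 9\n' | \
awk 'BEGIN { mx[1]=1; my[1]=1; mx[2]=9; my[2]=9 }   # medoid coordinates
{
  best = 1; bd = -1
  for (k = 1; k <= 2; k++) {
    dx = $1 - mx[k]; dy = $2 - my[k]
    d = dx*dx + dy*dy                 # squared Euclidean distance
    if (bd < 0 || d < bd) { bd = d; best = k }
  }
  print $1, $2, "cluster", best
}'
```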


  • Kaufman, L. & Rousseeuw, P.J. Finding Groups in Data An Introduction to Cluster Analysis Wiley-Interscience, 2005

Export xy coordinates for the bugsites from GRASS. See attached file at bottom of page.

# export bugsites
v.out.ascii in=bugsites out=bugsites.xy

Load this text file into an R session. A simple map can be made by plotting the xy coordinates.

# read in ascii file, and assign column names
x <- read.table('bugsites.xy', sep="|")
names(x) <- c('easting', 'northing', 'cat')

# subset original object, return only x,y cols
y <- data.frame(x[,1:2])
row.names(y) <- x$cat

# simple plot of x,y data
plot(y, pch=3)

Flexclust Example: determining the right number of clusters

Use the stepFlexclust function to determine an optimal number of hard classes; 5 clusters looks like a good start.

# load the flexclust package (provides stepFlexclust)
library(flexclust)

# figure out a good number of clusters: use a range of 2 to 10 clusters, with 20 reps each
s <- stepFlexclust(y, k=2:10, nrep=20)

PAM Example: plot of the 5 clusters, centroids, and connectivity

Perform hard classification (clustering) with the PAM algorithm, and plot the results

# load the cluster package (provides pam); 5 clusters is a good number
library(cluster)
y.pam <- pam(y, 5, stand=TRUE)

# add the clustering vector back to the original dataframe
y$cluster <- y.pam$clustering

# plot the clusters by color
plot(y$easting, y$northing, col=y$cluster, main="Bugsites Spatial Clustering, 5 classes", cex=0.5, pch=16, xlab="Easting", ylab="Northing")

# add the medoids, they are in the same order as the clustering vector
points(y.pam$medoids, pch=15, col=1:5, cex=1.25)

# connect the original points to the centroids with line segments:
for(i in 1:5)
segments(x0=y.pam$medoids[i,][1], y0=y.pam$medoids[i,][2], x1=y$easting[y$cluster == i], y1=y$northing[y$cluster ==i], col=i, lty=3)

Prepare the data for export to text, and save the clustered data

# add the cluster number to the original dataframe
y$cluster <- y.pam$clustering
y$orig_cat <- as.numeric(row.names(y))

# save as a text file and quit
write.table(y, file='bugsites.clust', row.names=FALSE)

Load clustered data into GRASS as a new set of points called 'bclust'. For each cluster, extract those points, and compute a convex hull.

# load clustered points into GRASS
v.in.ascii in=bugsites.clust out=bclust fs=" " columns='x double, y double, cluster integer, orig_cat integer' skip=1

# zoom to the full extent of the Spearfish dataset
g.region rast=elevation.dem

# there are 5 clusters: show them all, and compute convex hulls
for x in `seq 1 5`
do v.extract --o in=bclust where="cluster=$x" out=bclust_$x
v.hull --o in=bclust_$x out=bclust_hull_$x
d.vect bclust_hull_$x type=boundary fcol=none width=2 col=white
done
d.vect bclust icon=basic/box fcol=black col=black size=6

Traveling Salesman Approach to Visiting Data-loggers

Optimal Route in 3D

Several data-loggers in steep terrain; how can we visit all of them in the most efficient manner, such that we traverse the least cumulative slope gradient? In a sense this is a traveling salesman style problem, with travel costs defined by slope gradient. The module in GRASS can be used to compute the optimal route between data-loggers, given: 1) a network of lines connecting all points to be visited, and 2) transferal of slope cost information to the network. An example approach is listed below. Note that due to a bug in (it does not work with 3D maps), we need to use one extra step; details below.


  1. generate a delaunay network between points (v.delaunay)
  2. densify the line segments along the network (v.split)
  3. convert the network to 3D by sampling a DEM along vertices (v.drape)
  4. save the slope of each line segment on the network to its attribute table (v.to.db)
  5. use the squared slope as the travel cost
  6. re-connect the new attribute table from the 3D map to the original 2D map (v.db.connect)
  7. connect data-logger points to the delaunay network (
  8. solve the traveling salesman problem (
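Steps 4 and 5 reduce to simple per-segment arithmetic; a sketch in awk, with hypothetical segment lengths and elevation changes (squaring the slope, as in the db.execute step further down, also makes downhill segments cost the same as uphill ones):

```shell
# Hypothetical segment records: "length_m dz_m"; emit slope and its square,
# which serves as the travel cost along that segment.
printf '100 5\n50 10\n200 2\n' | \
awk '{ slope = $2 / $1; printf "slope %.3f  cost %.4f\n", slope, slope*slope }'
```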

Room for Improvement:

  1. generate a cost surface using a combination of slope map and known walkable trails / roads
  2. add support for lines to v.what.rast -- this would allow us to sample a more realistic cost surface, generated with r.cost
  3. fix so that it can use 3D input maps

Optimal Datalogger Route

Implementation in GRASS:

# generate simple network connecting all points
v.delaunay --o -l in=temperature_sites_maybe out=tri

# densify line segments
v.split --o in=tri out=tri_2 length=5

# add categories
v.category --o in=tri_2 out=tri_3 option=add type=line

# convert to 3d, using our elevation model
v.drape --o in=tri_3 out=tri_4 rast=rtk_elev_var_smooth@rtk

# add a table
v.db.addtable map=tri_4 columns="cat integer, cost double"

# use the slope between line segments as the cost
v.to.db map=tri_4 type=line layer=1 option=slope columns=cost

# make all slopes positive
echo "UPDATE tri_4 set cost = cost * cost" | db.execute

# connect table from 3D map to original 2D map:
v.db.connect map=tri_3 table=tri_4

# make a copy, which brings over a copy of the linked table
g.copy --o vect=tri_3,triangulation

# clean-up
g.remove vect=tri,tri_2,tri_3,tri_4

# make a network --o in=triangulation points=temperature_sites_maybe out=net operation=connect thresh=10

# check network-- has categories?
v.category net op=report

# solve traveling salesman problem
# does not work with 3D maps (ticket #406) --o in=net out=salesman acol=cost ccats=1-33

Traveling Salesman Approach to Visiting Data-loggers II

Updated version of a previous attempt to use a traveling-salesman approach to visiting data-loggers in steep terrain. This time we use a vector/raster approach (v.rast.stats) to compute the total "cost" incurred along each segment of a network connecting all of our data-loggers. Although this approach requires more computation time (v.rast.stats is slow), we can generate a more detailed cost surface to model movement. In this case we are using a cost derived from the cumulative traversed slope, with a 90% reduction along a local road.

New Approach:

  1. generate a cumulative cost surface, originating at data-logger locations, based on slope angle (r.slope.aspect, r.cost)
  2. generate a network of lines connecting all data-loggers, along with a local road (v.delaunay, v.digit, v.patch)
  3. modify the cost surface along the local road, such that cumulative cost is reduced 90% within 5m of the road (r.buffer, r.mapcalc)
  4. densify the line segments along the network (v.split)
  5. compute the total cost along each line segment of our network (v.rast.stats)
  6. connect data-logger points to the delaunay network (
  7. solve the traveling salesman problem (
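Step 3 is just a conditional rescaling per raster cell; the r.mapcalc call later in this section does exactly this, and the awk sketch below mirrors it on hypothetical "cost in_buffer_flag" pairs:

```shell
# Hypothetical cells: "slope_cost road_buffer_flag"; cells inside the
# road buffer (flag 1) keep only 1% of their slope-derived cost.
printf '40 0\n40 1\n120 0\n120 1\n' | \
awk '{ c = ($2 == 1) ? $1 * 0.01 : $1; print c }'
```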

Optimal Route version 2: 10-meter contours in grey, network in blue, optimal cycle between data-loggers (yellow) in red.

Setup the cost surface and network

# generate cost surface based on slope
r.slope.aspect elev=rtk_elev_var_smooth@rtk slope=slope format=percent
r.cost --o -k in=slope start_points=temperature_sites_maybe output=slope_cost

# make a vector of roads within the watershed:
v.digit -n map=roads  bgcmd="d.rast"

# use this to modify the cost surface --o in=roads out=roads use=val val=1

# generate buffer 5 meters from road
r.buffer --o in=roads out=roads_buff distances=5 units=meters

# reduce slope cost to 1% of its value along the buffered road
r.mapcalc "new_cost = if(isnull(roads_buff), slope_cost, (0.01) * slope_cost)"

# generate simple network connecting all points
v.delaunay --o -l in=temperature_sites_maybe out=tri

# combine road network, with delaunay network
v.patch in=tri,roads out=tri_1
v.clean --o in=tri_1 out=tri_1_clean tool=break thresh=1

# densify line segments
v.split --o in=tri_1_clean out=tri_2 length=20

# add categories
v.category --o in=tri_2 out=tri_3 option=add type=line

# add table
v.db.addtable map=tri_3 columns="cat integer"

Compute the total cost along each line segment, solve traveling salesman problem

# better approach to assigning cost, based on cost surface:
# really slow, as it uses vect->rast conversion to compute statistics
# about 3.5 minutes for 588 line features = ~168 features / minute

# set the region to 2m, so that rasterized lines are wider
g.region res=2
v.rast.stats vect=tri_3 layer=1 rast=new_cost colprefix=s
# reset region
g.region res=1

# make network --o in=tri_3 points=temperature_sites_maybe out=net operation=connect thresh=10

# check network-- has categories?
v.category net op=report

# solve traveling salesman problem -- does not work with 3D maps
# use regular path-distance cost: --o in=net out=simple_salesman ccats=1-33

# use the total cost surface values / line segment --o in=net out=salesman acol=s_sum ccats=1-33

Optimal Route with cost surface: optimal route in black, cost surface colors: blue = low, red = high.

Working with transects

Creating a transect in GRASS: Figure 1
Figure 1: screen plotting.
Creating a transect in GRASS: Figure 2
Figure 2: new vector creation.
Creating a transect in GRASS: Figure 3
Figure 3: points along line.

Transects are often used to better understand local variation in soil properties. Planning and mapping a transect in the office can make time spent in the field more efficient. Sometimes we return from the field with a transect starting point and bearing (CCW from north) of travel. Accurate digitizing of this information can often be time-consuming when multiple transects are involved.

The command d.bearing can facilitate rapid planning, visualization, and digitization of transects given the parameters: origin=x,y, theta=bearing, radius=length_of_transect. The following examples illustrate some common methods of working with transect data. Command names are in boldface and comments are in grey. The source code for d.bearing is attached at the bottom of this page. Unarchive in the grass6/display/ directory of the GRASS61-CVS source code directory.
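To sanity-check a transect before plotting it, the endpoint implied by origin, theta, and radius can be computed by hand. The sketch below assumes a compass-style bearing measured clockwise from north (flip the sign of sin() for the CCW convention noted above); the origin, bearing, and radius are taken from the example that follows:

```shell
# Endpoint of a transect from origin (x,y), bearing theta (degrees from
# north, clockwise assumed), and radius r (meters).
awk -v x=665736 -v y=4039451 -v theta=230 -v r=1000 'BEGIN {
  pi = 3.141592653589793
  t  = theta * pi / 180                 # degrees -> radians
  printf "end: %.1f %.1f\n", x + r * sin(t), y + r * cos(t)
}'
```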

Example 1: Plot a hypothetical transect; bearing 230 degrees, 1000m length.

  • Setup the display:
  • #plot the transect origin point:
    d.vect pedons cats=91 icon=basic/box
    #make a grid for illustative purposes, with origin at the point above:
    d.grid size=100 -t -b origin=665736,4039451
    #add a quick scale bar with the mouse:
    d.barscale -m

  • Plot the transect to the screen. (Figure 1)
  • d.bearing origin=665736,4039451 theta=230 radius=1000 col=blue

Example 2: Make a permanent vector 't1' of the same transect.

  • Replot the transect, this time saving the transect line to a new vector called 't1'
  • d.bearing origin=665736,4039451 theta=230 radius=1000 col=blue output=t1

  • Setup the display again, this time plotting the new transect vector 't1'. (Figure 2)
  • d.erase
    d.grid size=100 -t -b origin=665736,4039451
    d.vect pedons cats=91 icon=basic/box
    #display the new vector 't1' along with its direction
    d.vect t1 col=blue disp=shape,dir

Example 3: Create a new vector of points every 100m along the transect 't1'.

  • Assign unique IDs to the vertices of transect vector 't1'
  • #setup a database connection for vector 't1' to MySQL
    v.db.connect map=t1 driver=mysql database="host=localhost,dbname=test" table=t1 key=cat
    #create a new table in MySQL to store information about vector 't1'
    echo "create table t1 (cat int not null , primary key (cat) )" | db.execute
    #upload a unique ID of each vertex to the new table for 't1'
    v.to.db map=t1 option=cat

  • Extract a set of points to 't1_points' every 100m along transect vector 't1'.
  • #extract a new point vector along 't1', interpolating a new point every 100m, saving to 't1_points'
    v.to.points -i in=t1 out=t1_points dmax=110
    #the new vector 't1_points' now has two tables associated with it:

    t1_points_1: layer 1
    | cat  |
    |    1 |
    |    2 |

    t1_points_2: layer 2
    | cat  | lcat | along            |
    |    1 |    1 |                0 |
    |    2 |    1 | 100.000000000004 |
    |    3 |    1 | 200.000000000007 |
    |    4 |    1 | 300.000000000011 |
    |    5 |    1 | 400.000000000015 |
    |    6 |    1 | 500.000000000019 |
    |    7 |    1 | 600.000000000022 |
    |    8 |    1 | 700.000000000026 |
    |    9 |    1 |  800.00000000003 |
    |   10 |    1 | 900.000000000034 |
    |   11 |    1 | 1000.00000000004 |

  • Display our new point vector 't1_points', labeling the new ID numbers stored in layer 2. (Figure 3)
  • d.vect t1_points icon=basic/point size=10 fcol=red col=red disp=shape,cat llayer=2
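The vertex interpolation performed in Example 3 amounts to linear interpolation along the line at a fixed spacing. A sketch of the idea outside GRASS (Python; the function name is mine, not the v.to.points implementation):

```python
import math

def points_along(p0, p1, spacing):
    """Interpolate points every `spacing` map units along the segment
    p0 -> p1, including both endpoints."""
    (x0, y0), (x1, y1) = p0, p1
    length = math.hypot(x1 - x0, y1 - y0)
    n = int(length // spacing)
    pts = []
    for i in range(n + 1):
        f = (i * spacing) / length
        pts.append((x0 + f * (x1 - x0), y0 + f * (y1 - y0)))
    if length % spacing:
        # keep the true endpoint when the length is not a multiple of spacing
        pts.append((x1, y1))
    return pts

# a 1000m transect sampled every 100m yields 11 points,
# matching cats 1..11 in the table above
pts = points_along((0, 0), (0, 1000), 100)
print(len(pts))
```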

Raster Operations

Geologic-Scale Erosion


Initial Landform

10 Iterations

100 iterations of mass removal based on preferential flow of water, as calculated by r.topidx in GRASS. Notice how the landform is cut down most in the stream channels and least at the ridges. The basins between ridges appear to "fill" with sediment near the 50th iteration, as the entire landform is lowered to sea level (0m). The absolute change in elevation is visible in the elevation profiles below. Example GRASS commands are below. Here is a link to a movie containing all 100 iterations.

50 Iterations
100 Iterations

Note that this is *not* a direct measure of sediment removal, merely one means of visualizing the qualitative effects of sediment removal. Actual sediment flow values could possibly be calculated with r.terraflow and some expert knowledge of the area. In addition, sediment transport models such as RUSLE, USPED, and SIMWE will provide quantitative erosion AND deposition estimates. Details here:

Geologic Erosion: Elevation Profiles

Elevation profiles for selected time steps.

#compute the topographic index once
r.topidx in=elev_meters out=ti
#lower the surface by x * ti at each step, clamped at sea level
for x in `seq 0 1 100`
do
  r.mapcalc "e = if( float(elev_meters - float($x*ti)) > 0.0, float(elev_meters - float($x*ti)), 0.0 )"
  r.shaded.relief map=e shade=s1
  d.rast s1
  d.barscale at=0,0
  d.out.png out=e_$x res=1
done
#assemble the frames into a movie
convert `ls *.png | sort -n -k2 -t"_"` movie.mpg

#make an elevation profile
r.profile in=e100 profile=663451.38153808,4038227.40942676,663719.16510544,4041226.58538121 > e100.profile
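The r.mapcalc expression in the loop above just subtracts x times the topographic index from the elevation, clamping at sea level. A minimal numpy sketch of that single step (illustrative only; array and function names are mine):

```python
import numpy as np

def erode(elev, ti, x):
    """One step from the loop above: lower the surface by x times the
    topographic index, clamped at sea level (0m)."""
    return np.maximum(elev - x * ti, 0.0)

# toy 1-D "hillslope": ridge on the left, channel on the right
elev = np.array([100.0, 80.0, 40.0, 10.0])
ti = np.array([0.1, 0.5, 2.0, 4.0])   # channels accumulate more flow
print(erode(elev, ti, 10))
```

As in the movie frames, cells with a high wetness index (channels) are cut down fastest, and once a cell reaches 0m it stays there.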

Mapping Wifi Networks with Kismet, GDAL, and GRASS

Some simple examples of how to extend data collected from kismet along with gpsd. The program gpsmap, bundled with kismet, can create excellent maps of wifi networks, albeit not in a georeferenced format. With some ideas from Matt Perry, a quick read through some python docs, and a little testing, I put together a simple python script (kismet2shp) which will convert Kismet XML logfiles into a shapefile (see attached file at bottom of page for the source). Conversion from kismet logfile to KML was demonstrated in the above article by Matt. Simple usage of kismet2shp is summarized:

 # range polygons
./kismet2shp -r -s --file=Kismet-Sep-03-2006-1.xml --outfile=range.shp
# network centers (points)
./kismet2shp -s --file=Kismet-Sep-03-2006-1.xml --outfile=points.shp  

Output can either be in the default latlong WGS84 GCS, or in some user-defined projected coordinate system (see source code). Example images from gpsmap and from QGIS demonstrate similarities. The important difference is that by converting Kismet data into a shapefile, it is possible to bring this information into a GIS. A simple example below demonstrates how one can compute an estimated signal power index in GRASS, similar to the interpolated power option in gpsmap. The power density index is calculated as the summation of [ 1/(range^2) ] for each network center.
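The index itself is simple to state in code: at any location, sum 1/d² over all network centers within their estimated range. A Python sketch of the same calculation (names are mine; the GRASS version below does this per-cell with r.cost and r.mapcalc):

```python
import math

def power_index(point, centers):
    """Signal power index at `point`: the sum of 1/d^2 over all network
    centers within their estimated range. d = 0 is skipped to avoid
    division by zero at the exact network center."""
    total = 0.0
    for cx, cy, rng in centers:
        d = math.hypot(point[0] - cx, point[1] - cy)
        if 0 < d <= rng:
            total += 1.0 / d ** 2
    return total

# two networks 50m apart, each with an estimated 100m range
print(power_index((10, 0), [(0, 0, 100), (50, 0, 100)]))
```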

gpsmap range: network range as visualized with gpsmapgpsmap range: network range as visualized with gpsmap

gpsmap power: interpolated power density from gpsmapgpsmap power: interpolated power density from gpsmap

qgis range: network range shapefiles visualized in QGISqgis range: network range shapefiles visualized in QGIS

GRASS power density: simple index of wifi signal power computed in GRASSGRASS power density: simple index of wifi signal power computed in GRASS

Make simple maps with gpsmap

 #interpolated power map
gpsmap -S 2 -e -n 1 -t -l ssid  -k -p -Q 20 -q 2 -P 30 -z 2 -d 1000,1000 \
-o map1.png Kismet-Sep-03-2006-1.gps
#range map
gpsmap -e -n 1 -t -l ssid  -k -r -R 15 -q 2 -d 1000,1000 \
-o map1.png Kismet-Sep-03-2006-1.gps

Estimate signal power density in GRASS

 #prep work:
#import the point shapefile created by kismet2shp
v.in.ogr dsn=. layer=points out=wp
#setup a reasonable resolution
g.region vect=wp res=10 -a
#init a constant value surface for distance calculation
r.mapcalc "z = 1"
#init power level map
r.mapcalc "a = 0.000"
#create a list of networks
cats=`v.db.select -c map=wp columns=cat`
#loop thru list of networks
for i in $cats
do
  v.extract in=wp out=wp_i list=$i --o > /dev/null
  #get this network's estimated range
  range_i=`v.db.select -c map=wp_i columns=range`
  #compute distance map from network center out to its estimated range
  r.cost -k in=z out=c_i start_points=wp_i max_cost=$range_i --o > /dev/null
  #convert distance from net center to power index
  r.mapcalc "c_i_f = if( isnull(c_i) != 1, 1/(float(c_i)^2), null())" > /dev/null
  #add to the previous power estimation
  r.mapcalc "a = if( isnull(c_i_f) != 1, a + c_i_f, a)" > /dev/null
done
#remove the weakest power levels
r.mapcalc "a = if( a < 0.5, null(), a)" > /dev/null
#fix colors
r.colors map=a color=gyr

A mobile wifi mapping setup might include:

Planned work:

  • Modifications to kismet-earth to improve merging of multiple .gps files
  • enhanced wifi network center estimation, and power interpolation with GIS tools

Simple Comparison of Two Least-Cost Path Approaches

Least Cost Path Comparison Map: r.walk in blue, r.cost in orange

A simple comparison between the least-cost paths computed by r.walk and r.cost. Default parameters, unity travel friction, and λ = 1 were used with r.walk. Slope percent was used as the accumulated cost for r.cost. The test region is similar to that used in a previous navigation example.


  1. r.walk
    • Shorter total path (6.03 km)
    • Tendency to stay mid-slope, and cross large drainage at the gentlest possible position
  2. r.cost
    • Longer total path (7.12 km)
    • Tendency to follow drainages upstream, which is not always possible in steep terrain
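Under the hood, both modules perform a least-cost search over the raster grid; the core idea can be sketched as Dijkstra's algorithm with a pluggable per-cell cost. A toy illustration (not the actual r.cost/r.walk implementation; names are mine):

```python
import heapq

def least_cost(cost, start, end):
    """Dijkstra over a 2-D cost grid (4-connected); returns the
    accumulated cost of the cheapest start -> end path."""
    rows, cols = len(cost), len(cost[0])
    dist = {start: cost[start[0]][start[1]]}
    pq = [(dist[start], start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == end:
            return d
        if d > dist.get((r, c), float('inf')):
            continue          # stale queue entry
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float('inf')):
                    dist[(nr, nc)] = nd
                    heapq.heappush(pq, (nd, (nr, nc)))
    return None

# slope-percent style "friction": crossing the steep middle row
# is more expensive than detouring around it
slope = [[1, 1, 1],
         [9, 9, 1],
         [1, 1, 1]]
print(least_cost(slope, (2, 0), (0, 0)))
```

With a slope-based cost the cheapest route detours around the steep cells (total cost 7) rather than climbing straight over them (cost 11), which is exactly the drainage-following tendency seen with r.cost above.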


Prep work (see attached files at the bottom of the page to re-create)

# generate cost "friction" maps
# unity friction for r.walk
r.mapcalc "friction = 1.0"
# use slope percent as the friction for r.cost
r.slope.aspect elev=elev slope=slope format=percent

Compute the least-cost path via two methods

# compute cumulative cost surfaces
r.walk -k elev=elev friction=friction out=walk.cost start_points=start stop_points=end lambda=1
# Peak cost value: 14030.894105
r.cost -k in=slope out=cost start_points=start stop_points=end
# Peak cost value: 11557.934493

# compute shortest path from start to end points
r.drain in=walk.cost out=walk.drain vector_points=end
r.drain in=cost out=cost.drain vector_points=end

Vectorize least-cost paths for further analysis

r.to.vect -s in=walk.drain out=path_1 feature=line
r.to.vect -s in=cost.drain out=path_2 feature=line

# add DB column for length calculation
v.db.addcol path_1 columns="len double"
v.db.addcol path_2 columns="len double"

# how long are the two paths?
v.to.db path_1 option=length column=len units=me
# 6.03 km
v.to.db path_2 option=length column=len units=me
# 7.12 km

Simple map

d.rast shade
d.vect start icon=basic/circle fcol=green size=15
d.vect end icon=basic/circle fcol=red size=15
d.vect path_1 width=2 col=blue
d.vect path_2 width=2 col=orange

Sample the elevation raster at a regular interval

# add a vertex every 100m along each line
# this will create two tables for each
# path_1_pts_1   <-- original atts
# path_1_pts_2   <-- original cat + distance along the original line
v.to.points -i in=path_1 out=path_1_pts dmax=100
v.to.points -i in=path_2 out=path_2_pts dmax=100

# add column for elevation values
v.db.addcol map=path_1_pts layer=2 columns="elev double"
v.db.addcol map=path_2_pts layer=2 columns="elev double"

# upload elev values for profile graph
v.what.rast vector=path_1_pts raster=elev layer=2 column=elev
v.what.rast vector=path_2_pts raster=elev layer=2 column=elev

# dump profiles to CSV files
db.select table=path_1_pts_2 fs="," > p1.csv
db.select table=path_2_pts_2 fs="," > p2.csv

Least Cost Path Comparison Profile

Plot an elevation profile for each path

## startup R

## load data from previous steps
p1 <- read.csv('p1.csv')
p2 <- read.csv('p2.csv')

## the two lines are not guaranteed to be digitized in the same direction
## here p1 is correct, p2 is reversed

## make a nicer plot
## library(Cairo)
## Cairo(type='png', bg='white', width=800, height=400)

plot(elev ~ rev(along), data=p2, type='l', col='orange', lwd=2, xlab='Travel Distance (meters)', ylab='Elevation Change (meters)')
lines(elev ~ along, data=p1, col='RoyalBlue', lwd=2)
legend(0, 3000, legend=c('r.walk', 'r.cost'), col=c('RoyalBlue', 'orange'), lwd=2)

Using R and r.mapcalc (GRASS) to Estimate Mean Topographic Curvature

Recently I was re-reading a paper on predictive soil mapping (Park et al, 2001), and considered testing one of their proposed terrain attributes in GRASS. The attribute, originally described by Blaszczynski (1997), is the distance-weighted mean difference in elevation applied to an n-by-n window of cells:

Equation 4 from (Park et al, 2001)

where n is the number of cells within an (odd-number dimension) square window excluding the central cell, z is the elevation at the central cell, z_{i} is the elevation at one of the surrounding cells i, d_{i} is the horizontal distance between the central cell and surrounding cell i. I wasn't able to get a quick answer using r.neighbors or r.mfilter, so I cooked up a simple R function to produce a solution using r.mapcalc. The results are compared with the source DEM below; concave regions are blue-ish, convex regions are red-ish. The magnitude and range are almost identical to mean curvature derived from, with a Pearson's correlation coefficient of 0.99. I think that it would be of general interest to add functionality to r.neighbors so that it could perform distance-weighted versions of commonly used focal functions.
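Reconstructing the equation from the description above (my notation), the statistic for a window with n surrounding cells is:

```latex
C = \frac{1}{n} \sum_{i=1}^{n} \frac{z - z_{i}}{d_{i}}
```

For a 5x5 window, n = 24 (25 cells minus the central cell), which matches the final division in the generated r.mapcalc expression below.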

Elevation surface (left) and resulting mean curvature estimate (right)Elevation surface (left) and resulting mean curvature estimate (right)


  • Blaszczynski, J. Landform characterization with geographical information systems Photogrammetric Engineering and Remote Sensing, 1997, 63, 183-191
  • Park, S.; McSweeney, K. & Lowery, B. Identification of the spatial distribution of soils using a process-based terrain characterization Geoderma, 2001, 103, 249-272

Implementation and example usage in R

## assumes square pixels
## i.e. EWres = NSres
f <- function(, rast.out, file=paste(rast.out, '.mapcalc', sep=''), n=5, res=1)
        {
        # the matrix must be an odd number of rows / cols
        if(n %% 2 == 0)
                stop('n must be an odd number')
        # locate the center of the n-by-n window
        ctr <- median(1:n)
        # generate offsets from the center of the window: e.g. -2..2 for n=5
        the.offset <- (1:n) - ctr
        # convert into matrices of row- and column-wise offsets
        # according to GRASS's r.mapcalc neighborhood syntax
        row.offset.mat <- matrix(rep(the.offset, n), nrow=n, byrow=FALSE)
        col.offset.mat <- matrix(rep(the.offset, n), nrow=n, byrow=TRUE)
        # compute linear distance from center
        # we are assuming the EW and NS resolution is constant
        dist.mat <- sqrt((row.offset.mat * res)^2 + (col.offset.mat * res)^2)
        # generate one term per window cell
        e <- paste('((',, ' - ',, '[', col.offset.mat, ',', row.offset.mat, ']) / ', dist.mat, ')', sep='')
        # remove the reference to the center cell (distance 0)
        e <- e[-median(1:n^2)]
        # generate the sum of all terms
        e.sum <- paste(e, collapse=' + ')
        # add final syntax, along with the division by (number of cells - 1) <- paste(rast.out, ' = ( ', e.sum, ') / ', (n^2) - 1, '.0', sep='')
        # write out to file
        cat(, file=file)
        }

# example usage
f('rtk_elev', rast.out='Cs', n=5, res=1)

Resulting r.mapcalc input for a 5x5 window

Cs = ( ((rtk_elev - rtk_elev[-2,-2]) / 2.82842712474619) + ((rtk_elev - rtk_elev[-2,-1]) / 2.23606797749979) + ((rtk_elev - rtk_elev[-2,0]) / 2) + ((rtk_elev - rtk_elev[-2,1]) / 2.23606797749979) + ((rtk_elev - rtk_elev[-2,2]) / 2.82842712474619) + ((rtk_elev - rtk_elev[-1,-2]) / 2.23606797749979) + ((rtk_elev - rtk_elev[-1,-1]) / 1.4142135623731) + ((rtk_elev - rtk_elev[-1,0]) / 1) + ((rtk_elev - rtk_elev[-1,1]) / 1.4142135623731) + ((rtk_elev - rtk_elev[-1,2]) / 2.23606797749979) + ((rtk_elev - rtk_elev[0,-2]) / 2) + ((rtk_elev - rtk_elev[0,-1]) / 1) + ((rtk_elev - rtk_elev[0,1]) / 1) + ((rtk_elev - rtk_elev[0,2]) / 2) + ((rtk_elev - rtk_elev[1,-2]) / 2.23606797749979) + ((rtk_elev - rtk_elev[1,-1]) / 1.4142135623731) + ((rtk_elev - rtk_elev[1,0]) / 1) + ((rtk_elev - rtk_elev[1,1]) / 1.4142135623731) + ((rtk_elev - rtk_elev[1,2]) / 2.23606797749979) + ((rtk_elev - rtk_elev[2,-2]) / 2.82842712474619) + ((rtk_elev - rtk_elev[2,-1]) / 2.23606797749979) + ((rtk_elev - rtk_elev[2,0]) / 2) + ((rtk_elev - rtk_elev[2,1]) / 2.23606797749979) + ((rtk_elev - rtk_elev[2,2]) / 2.82842712474619)) / 24.0
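As a sanity check outside GRASS, the same distance-weighted statistic can be computed directly with numpy; this is an illustrative sketch, not the r.mapcalc implementation, and the names are mine:

```python
import numpy as np

def mean_curvature(elev, n=5, res=1.0):
    """Blaszczynski's distance-weighted mean elevation difference,
    evaluated over an n-by-n window at each interior cell."""
    assert n % 2 == 1, 'n must be odd'
    h = n // 2
    # distance from the window center to every window cell
    off = np.arange(-h, h + 1) * res
    d = np.sqrt(off[:, None] ** 2 + off[None, :] ** 2)
    out = np.full(elev.shape, np.nan)
    for r in range(h, elev.shape[0] - h):
        for c in range(h, elev.shape[1] - h):
            w = elev[r - h:r + h + 1, c - h:c + h + 1]
            # center cell contributes 0 (distance replaced by inf)
            diff = (elev[r, c] - w) / np.where(d == 0, np.inf, d)
            out[r, c] = diff.sum() / (n ** 2 - 1)
    return out

# a cone-shaped surface: the apex is convex (positive curvature)
x = np.arange(9.0)
elev = -np.abs(x[:, None] - 4) - np.abs(x[None, :] - 4)
print(mean_curvature(elev, n=3)[4, 4] > 0)
```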

Visual Comparison of 2 Raster Images

Comparison of 2 raster images
Figure 1

The visual comparison of raster data can be complicated by images that have non-NULL values at each grid cell. Transparency can be used to "see" both images at once, but the results can be misleading. An alternative method suggested in the GRASS Book (Neteler and Mitasova, 2003) uses a couple of simple raster algebra tricks to make a "checkerboard" image composite of both input raster files. Figure 1 illustrates a simple comparison between an IKONOS false infra-red image and a classified vegetation image via this checkerboard approach.
Example Code

#create checker pattern with some raster algebra-fu
r.mapcalc " v=if( (sin(5*row()) + sin(5*col())) > 0.0, null(), veg_11 )"
r.mapcalc " i=if( (sin(5*row()) + sin(5*col())) > 0.0, ikonos.rgb, null() )"
#assign the original colors to the new files:
r.colors map=v rast=veg_11
r.colors map=i rast=ikonos.rgb
#patch them together with r.patch, or view with:
d.rast v
d.rast -o i
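The checkerboard test used in the r.mapcalc calls above can be sketched with numpy. Note that r.mapcalc's sin() works in degrees, which the sketch mimics (the function name is mine):

```python
import numpy as np

def checker_mask(rows, cols):
    """True where the first image shows through, following the
    r.mapcalc test above (sin in degrees, 1-based rows/cols)."""
    r = np.arange(1, rows + 1)[:, None]
    c = np.arange(1, cols + 1)[None, :]
    return np.sin(np.radians(5 * r)) + np.sin(np.radians(5 * c)) > 0.0

# alternating 36-cell blocks: roughly half the cells pass the test
m = checker_mask(72, 72)
print(0.4 < m.mean() < 0.6)
```

One image is shown where the mask is True and the other where it is False, so together the two r.mapcalc outputs tile the whole region.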

Neteler, M. & Mitasova, H. Open Source GIS A GRASS GIS Approach. Kluwer Academic Publishers, 2003.

Watershed Basin Delineation

Multiple approaches to the automatic delineation of watershed boundaries, along with the processing associated with the CalWater 2.2.1 dataset. Note that the output from each algorithm is slightly different, and that CalWater does not always represent physiographically correct boundaries. The original CalWater 2.2.1 dataset and a shapefile conversion are available for download. See the attached DEM clip for the examples presented below. Matt Perry has written up a nice tutorial on identifying impervious surfaces within the context of watershed delineation.

Watershed basin delineation by the following approaches:

  1. ArcInfo basin() GRID function
     multiscale approach suitable for small to medium input grids; raster output.

  2. GRASS r.watershed command
     minimum basin size must be defined; suitable for small to medium input grids; raster output.

  3. GRASS r.terraflow command
     multiscale approach suitable for massive input grids, fast; raster output.


  • none of these algorithms can deal with hierarchical watershed decomposition
  • removal of small vector features does not ensure perfect basin delineation
  • raw basin output from these algorithms is best further processed: some ideas here

watershed 1: 3D comparison of the four methods. calwater (red), basin (black), r.terraflow (white).
watershed 2: 3D comparison of the four methods. calwater (red), basin (black), r.terraflow (white).

Pre-processing in GRASS (note that these steps can be done in any GIS)

# projection settings: +proj=aea +x_0=0.0 +y_0=0 +lon_0=-96 +lat_0=40.0 +lat_1=20 +lat_2=60.0 +datum=NAD83
# set the region and cut out a sample section of the DEM
g.region res=90 -ap
r.mapcalc "a = ca_90m_dem_meters"
#generate a shaded relief map for viz:
r.shaded.relief map=a shade=a.shade

ArcInfo basin()

# start ArcInfo and import our DEM called 'a_grid'
# note that 'a_grid' was exported from GRASS via: r.out.arc in=a out=a_grid
asciigrid a_grid a float
basins = basin(flowdirection(a))

GRASS r.watershed

# run r.watershed, saving basins in 'w.basins'
# note that a threshold of 200 sets the minimum basin size [update this]
r.watershed --o elev=a threshold=200 basin=w.basins

GRASS r.terraflow

# run r.terraflow, saving basins in 'a.swater'
r.terraflow elev=a filled=a.filled direction=a.dir swatershed=a.swater accumulation=a.acc
#set cell values of 0 to NULL, these are regions where calculation was not possible
r.null map=a.swater setnull=0

Post-processing of generated basins in GRASS (note that this isn't a true merging of sub-basins, just a rough idea of what it might look like!)

# vectorize watershed basins from two of the methods:
# r.terraflow
r.to.vect -s in=a.swater out=a_watersheds feature=area
# ArcInfo basin()
r.to.vect -s in=arc.basins out=arc_basins feature=area
# remove basins smaller than approx. 200ac.
# r.terraflow
v.clean --o in=a_watersheds out=b_watersheds tool=rmarea thresh=809371
# ArcInfo basin()
v.clean --o in=arc_basins out=arc_basins_clean tool=rmarea thresh=809371

CalWater (note that this comes as an e00 file, and must be converted)

# get from
# note that this is an e00 file and needs to be converted
# converted with ArcInfo
import AUTO calw221.e00 calwater
# converted with AVCEimport
~/src/avce00-1.3.0/avcimport calw221.e00 calwater_avc
# ArcMap simple conversion to shapefile and re-projection possible via standard tools
# converted to shapefile polygon
# conversion to shapefile with attributes intact not as simple...
# projection to our spatial ref system via ogr2ogr
# note that this shapfile was created by ArcMap
ogr2ogr -t_srs '+proj=aea +x_0=0.0 +y_0=0 +lon_0=-96 +lat_0=40.0 +lat_1=20 +lat_2=60.0 +datum=NAD83' calwater_aea.shp calwater\ polygon.shp

Visual comparison in GRASS

# cut out a small piece of the CalWater vector for analysis
v.in.region out=temp_region
v.overlay ainput=calwater binput=temp_region out=calwater_clip operator=and
g.remove vect=temp_region
## compare the methods:
d.his i=a.shade h=w.basins # r.watershed
d.vect calwater_clip type=boundary col=red width=2 # Calwater
d.vect arc_basins_clean type=boundary col=black width=1 # ArcInfo basin()
d.vect b_watersheds type=boundary col=white width=1 # r.terraflow

Aggregation of CalWater Data

#aggregation of CalWater data: see metadata:
Relevant columms:
      43  HRC           2     2     C     -    Hydrologic Region Code (DWR)
      45  HBPA          2     2     C     -    Hydrologic Basin Planning Area (RWQCB)
      47  RBU           5     5     I     -    Concatenates HR,RB,HU into a single integer
      52  RBUA          6     6     I     -    Concatenates HR,RB,HU,HA
      58  RBUAS         7     7     I     -    Concatenates HR,RB,HU,HA,HSA
      65  RBUASP        9     9     I     -    Concatenates HR,RB,HU,HA,HSA,SPWS
      74  RBUASPW      11    11     I     -    Concatenates HR,RB,HU,HA,HSA,SPWS,PWS
      85  HR            2     2     I     -    Hydrologic Region (1->10)(DWR)
      87  RB            1     1     I     -    Regional Water Qual. Cont. Board (1->9)(RWQCB)
      88  HU            2     2     I     -    Hydrologic Unit (00->~80)(SWRCB)
      90  HA            1     1     I     -    Hydrologic Area (0->9)(SWRCB)
      91  HSA           1     1     I     -    Hydrologic Sub-Area (0->9)(SWRCB)
      92  SPWS          2     2     I     -    Super Planning Watershed (00->~30)(CDF)
      94  PWS           2     2     I     -    Planning Watershed (00->~13)(CDF)
# polygons can be dissolved via ArcMap / PostGIS / v.to.rast use=attr -> r.to.vect

regarding watershed decomposition and limitation C/O Helena Mitasova


r.terraflow has not been designed to output landscape decomposition into
watersheds at a given scale/size but its result is used as an input for
hierarchical watershed decomposition - see here
(and the papers by Arge et al. and Verdin and Verdin)

However, the method used there can still leave you with small internal
subwatersheds (r.watershed will have that too) that cannot be lumped into
any bigger watershed.


Working with Landsat Data

Landsat Example: bands 3, 2, 1
Fig.1 Bands 3,2,1
Landsat Example: bands 4, 3, 2
Fig.2 Bands 4,3,2
Landsat Example: bands 5, 3, 2
Fig.3 Bands 5,3,2
Landsat Example: blend with hillshade
Fig.4 Blend with hillshade
Landsat Example: bands 3, 2, 1 with equalized colors
Fig.5 Histogram equalized 321


Data Sources in order of ease of use

  1. University of Maryland's Global Land Cover Facility
  2. CaSiL Landsat Archive for CA
  3. USGS Global Viewer: free ortho-rectified imagery

Sample GRASS commands

#display a true color composite image from Landsat TM data:
d.rgb r=landsat_band_3 g=landsat_band_2 b=landsat_band_1
#fuse bands into a single image:
r.composite r=landsat_band_3 g=landsat_band_2 b=landsat_band_1 out=landsat_bands_321
#blend landsat composite image with DEM data
#note that blending will require some tweaking: good results obtained with:
r.blend second=shaded_relief first=landsat_321 out=landsat_shade perc=6
#re-composite the blended results
r.composite r=landsat_shade.r g=landsat_shade.g b=landsat_shade.b out=landsat_shade.rgb
#create histogram equalized inputs (Fig 5)
r.colors map=landsat_band_1 color=grey.eq
r.colors map=landsat_band_2 color=grey.eq
r.colors map=landsat_band_3 color=grey.eq
r.composite r=landsat_band_3 g=landsat_band_2 b=landsat_band_1 out=landsat_eq_321
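The grey.eq color table performs histogram equalization, which can be sketched in a few lines of numpy: map each cell value to its empirical cumulative frequency (an illustration, not the r.colors implementation; names are mine, and ties are broken arbitrarily):

```python
import numpy as np

def equalize(band):
    """Rescale values by their empirical CDF, spreading the
    histogram over the full 0-255 range."""
    flat = band.ravel()
    order = flat.argsort().argsort()      # rank of each cell value
    cdf = (order + 1) / flat.size         # empirical CDF in (0, 1]
    return np.round(cdf * 255).astype(np.uint8).reshape(band.shape)

# a low-contrast band clustered around 100-110 stretches to cover 0-255
band = np.array([[100, 101], [105, 110]])
print(equalize(band))
```

This is why the equalized 321 composite (Fig 5) shows far more contrast than the raw composite: the narrow range of raw DN values is stretched across the whole display range.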

Simple comparison between the default output from r.composite on landsat bands 3,2,1 and the enhanced output from i.landsat.rgb.

Landsat Example: bands 3, 2, 1 standard r.composite
r.composite default
Landsat Example: bands 3, 2, 1 composite with i.landsat.rgb
i.landsat.rgb enhanced

LiDAR Data Processing Ideas

Quick copy and paste from the GDAL mailing list. Details to follow.

Markus Neteler 
Today 10:54:38 am
On Thu, Dec 15, 2005 at 08:31:14PM -0700, GeoWiz wrote:
> I was wondering if there are any good open source projects going on that 
> involve processing lidar data.
> Thanks,
> Steve 


you may check GRASS 5.4, it contains the software from Prof Maria Brovelli,
Italy. See the src.contrib/POLIMICO/ directory.

Here the README details:

LIDAR data filtering

Command execution order:

1. s.bspline.reg
2. s.edgedetection
3. s.growing

s.bspline.reg/  : bilinear or bicubic spline interpolation, least squares approach
s.correction/   : DTM correction: error removal (building remainder, vegetation etc)
s.edgedetection/: detection of building edges, vegetation edges, etc.
s.growing/      : filling of previously computed edges

Additional software     : fast sites to raster conversion (replaces

Related Publications:
- GRASS Users conference 2002, Trento, Italy

- Brovelli, Maria A, Cannata, Massimiliano & Longoni, Ulisse M (2004)
  LIDAR Data Filtering and DTM Interpolation Within GRASS.
  Transactions in GIS 8 (2), 155-174.

Note that 5.4-CVS contains some updates which will appear in the
next 5.4.x release of GRASS.

Hope this helps


Advanced Attribute Management with MySQL

Attribute management, aggregation, and editing in GRASS can be greatly improved by storing all attribute data in a relational database such as MySQL. By default, GRASS stores the attributes associated with a vector dataset in a DBF table. This style of table can be easily modified by most spreadsheet applications, and is conveniently stored in a single self-contained file. However, the driver for accessing data stored in a DBF table has limited SQL capabilities. While this is enough for simple attribute management, aggregation and complicated table joins are not possible. For a general overview of GRASS attribute management see the official GRASS Wiki Page.

unixODBC or iODBC may also be used to connect GRASS to MySQL. See this page on how to setup an ODBC-style DSN for MySQL.

To set the default attribute storage system for a given mapset, use the command db.connect. The command v.db.connect can be used to configure the attribute storage system for a given vector file. Note that a valid account must first be set up in MySQL. For example, if a user wanted to store all attribute data for a new mapset in the MySQL database called 'test', the following commands would be used:

db.login user=user-name password=my-password driver=mysql database=host=localhost,dbname=test

db.connect driver=mysql database=host=localhost,dbname=test

If an existing mapset has attribute data stored in DBF files, a new connection to a MySQL database is still possible. However, it is best to keep all attribute for a given mapset in a single type of database.

Cartographic Output via GMT

The production of high quality, printable maps is an important component of most geographic analysis. While GRASS has not traditionally been suited for the production of printed materials, commands like (and, more recently, a helpful wrapper) make it possible to output high quality Postscript maps. However, the rather limited syntax and capabilities of the approach often result in a map that must be finalized in a DTP (desktop publishing) application such as Inkscape, Scribus, or The Gimp. The map composer functionality found in QGIS may be a good alternative for map production in the near future, but the current version is lacking in terms of flexibility, stability, and the ability to export data in formats such as EPS or PDF. With such limited options for the production of high quality printed materials when using an entirely open source workflow, numerous people have cobbled together the necessary glue required to interface GRASS to the various programs included with the Generic Mapping Tools package.