California FEMA layer

Tara Athan

Mar 18, 2010, 9:19:55 PM
to aries-...@googlegroups.com
I was just checking up on Mark's California floodplain layer that we
added to Geoserver for practice the other day. It turns out it is the
original Q3 shapefile from FEMA, including polygons for lots of other
things such as COBRA, quad areas and so on, not just floodplain zones.

Looking at the Puget floodplain layer, someone has processed it to have
only two multipolygons, one for the A zone (100-yr) and one for the
X500 zone (500-yr), and a single attribute field with these zone codes.
The XML element is set up as Boolean, with no hasValueAttribute field,
and the model expects 0/1:
(defmodel floodplains 'floodService:Floodplains
  "Presence of a floodplain in given context"
  (classification (ranking 'floodService:Floodplains)
    0 'floodService:NotInFloodplain
    1 'floodService:InFloodplain))

So it looks to me like it ignores the attribute table altogether and
lumps the 100- and 500-yr floodplains together (?!)

So clearly the California layer won't do as is.

Looks to me like it would be handy to have an ArcObjects (or GRASS)
script to process the Q3 shapefiles into the form needed, so you get a
uniform product from one region to another. I don't think casual GIS
users will be able to figure this out unless they have a detailed set of
instructions.
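A script like that would mostly be a reclassification pass over the Q3 attribute table. As a rough sketch (in Python rather than ArcObjects/GRASS; the `ZONE` field name and the code lists are assumptions about typical Q3 attribute tables and should be checked against the actual shapefile):

```python
# Sketch of the zone reclassification a Q3-prep script would need.
# The field name "ZONE" and these code lists are assumptions based on
# typical FEMA Q3 tables -- verify against the actual shapefile.

ONE_HUNDRED_YEAR = {"A", "AE", "AH", "AO", "A99", "V", "VE"}  # 100-yr zones
FIVE_HUNDRED_YEAR = {"X500", "B"}                             # 500-yr zones

def classify_zone(zone_code):
    """Map a raw Q3 zone code to the two-class scheme used for Puget Sound."""
    code = zone_code.strip().upper()
    if code in ONE_HUNDRED_YEAR:
        return "A"
    if code in FIVE_HUNDRED_YEAR:
        return "X500"
    return None  # COBRA areas, quad boundaries, zone X, etc. get dropped

# Example: filter (zone, polygon-id) records down to floodplain polygons only
records = [("AE", 1), ("X500", 2), ("X", 3), ("ANI", 4)]
kept = [(classify_zone(z), gid) for z, gid in records if classify_zone(z)]
```

The kept features would then be dissolved into one multipolygon per class to match the Puget layout.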

Tara

--
Tara Athan
Owner, Athan Ecological Reconciliation Services
tara_athan at alt2is.com
707-272-2115 (cell, preferred)
707-485-1198 (office)
249 W. Gobbi St. #A
Ukiah, CA 95482

Kenneth Joseph Bagstad

Mar 18, 2010, 10:13:20 PM
to aries-...@googlegroups.com, Tara Athan
Yes, that's absolutely right Tara. I think there are features in
Geoserver to select by attribute, but those may not have been
developed at the time when we needed the floodplain data. So the
options are to (1) go with the as-is data and select by attribute (if
that's available in Geoserver) or (2) consistently prep layers as I did
for Puget Sound. If we go with #2 then obviously a script would be
really nice for consistency's sake and to eventually develop a layer
for the whole U.S.

Ken

--
Ken Bagstad, Ph.D.
Postdoctoral Associate
Project Economist, BLM-USGS Ecosystem Services Valuation Pilot
Gund Institute for Ecological Economics
University of Vermont
617 Main Street
Burlington, VT 05405
(802) 656-4094

Ferdinando Villa

Mar 18, 2010, 11:10:48 PM
to aries-...@googlegroups.com, Kenneth Joseph Bagstad, Tara Athan
We can define filters in the datasource (specified in the XML), and
that would take care of it - except I haven't implemented anything like
that yet. I'll look into it: the underlying middleware (geotools)
fully supports selection by attribute, and the specs on the XML side
would be transmitted to geoserver by the API. So there's no need to
program it into geoserver itself, as long as I find a quick way to
specify filters in the semantic annotation.
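To make the idea concrete, a filter spec in the annotation might look something like this (the element name and its placement are hypothetical - nothing like it exists yet - but the filter text itself is plain CQL, which geotools can already parse):

```xml
<!-- Hypothetical element: the name is invented for illustration;
     only the CQL filter text is standard. -->
<geospace:hasFilter>ZONE = 'A' OR ZONE = 'X500'</geospace:hasFilter>
```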

Thanks for catching the duality in the Puget data, Tara - I had no
idea. Otherwise, having multiple files with different attributes for
the same concept is not a problem as long as the semantics are right -
the system will select the proper one and apply the necessary
transformations. Still, for something as simple as floodplains, a
single vector layer would be the ideal situation....

f

Kenneth Joseph Bagstad

Mar 18, 2010, 11:19:51 PM
to aries-...@googlegroups.com, Ferdinando Villa, aries-...@googlegroups.com, Tara Athan
Yup. For many of the national-scale datasets that are currently tiled
and scattered across different places, it would be great to eventually
have a single layer (a few nationwide layers, like floodplains,
especially lend themselves to this). Perhaps with a small army of
work-study GIS students, like the Spatial Analysis Lab maintains?

Ken

Tara Athan

Mar 24, 2010, 3:37:15 AM
to aries-...@googlegroups.com
I cobbled together a totally awful Java program (attached) to parse the
information out of common.xml and create the keyword phrase for
geoserver. These are printed out as pairs of lines: the
workspace:layerName and then the keyword phrase.

Before I go through all the layers in geoserver and paste in the
appropriate keywords, I thought I would check:
1) is this really the syntax you want?
2) is this all the information that would be needed to reconstruct the
xml file from geoserver metadata?
(I noticed on coverages that I had to determine the XRangeMax and
YRangeMax by other means, but perhaps there is a way to get this info
from a query to Geoserver? The Geoserver documentation pages on WCS
are all blank.)
3) do you have any objections to my inserting these keywords?

My purpose is to get the existing geoserver layers "harmonized" with the
xml file, so that when new users, such as myself and the students in your
class, insert new layers, we are more likely to add the appropriate
keyword phrase - there will be lots of good examples to draw from. And
secondarily, to have the geoserver in good shape for the day when the
xml file will be generated automatically.

The output of the java program is attached.

Tara

PS. I used the javax.xml.parsers.DocumentBuilder, but it doesn't seem to
have a lot of functionality. Have you found an XML parser that is better
than this one?
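For comparison, this kind of extraction is much lighter in Python's stdlib ElementTree than in javax.xml's DocumentBuilder. A minimal sketch (the element and attribute names below are placeholders, not the real common.xml schema):

```python
# Sketch of the workspace:layerName / keyword extraction with
# xml.etree.ElementTree. The <layer> structure here is a placeholder,
# NOT the real common.xml schema -- it just shows the parse pattern.
import xml.etree.ElementTree as ET

sample = """<layers>
  <layer workspace="puget" name="floodplains">
    <keywords>floodService:Floodplains</keywords>
  </layer>
</layers>"""

root = ET.fromstring(sample)
pairs = []
for layer in root.iter("layer"):
    wsname = "%s:%s" % (layer.get("workspace"), layer.get("name"))
    pairs.append((wsname, layer.findtext("keywords")))
# pairs -> [("puget:floodplains", "floodService:Floodplains")]
```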

xml2geoserver.out
xml2geoserver.java

Ferdinando Villa

Mar 24, 2010, 10:03:31 AM
to aries-...@googlegroups.com
Hi Tara, thanks, great stuff. I had never thought of going the other way.
The output is correct. When you feel that you're finished with it, I may
actually take your code and make it automatically update geoserver through
its REST interface (or, if you want to investigate that, it may take less
time to code that in than to put in all the keywords by hand).

In terms of reading XML, WCS and WFS, there are a lot of classes to
simplify that in thinklab - you may want to look at XMLDocument.java
(simplified read/write) and XML.java (functional-style XML creation). I
know you know how to use the find command in linux :)

WFS is handled well by geotools through their feature interface, but WCS
isn't, so there is code in geospace to read WCS capabilities, too. All the
info about x/y ranges, coordinates and projections is of course there. The
code that reads that stuff is in WCSDatasource.java, and it uses the
XMLDocument class, so you can use it as an example. It actually contains an
addOpalDescriptor function that builds the XML from a coverage read from
WCS.

The WCSToOPAL.java and WFSToOPAL.java classes implement thinklab commands to
go W{F|C}S -> xml observation specs. The WFSToOPAL command has an annotate()
function that does that interactively. Working with this stuff outside of
thinklab may be a challenge, but implementing thinklab commands is fairly
simple using Java annotations (see WFSToOpal for an example).

I made lots of changes in the codebase, and ARIES now runs off a real
spatial database instead of a catalog initialized from XML (the db is
initialized asynchronously and on request from the xml files - basically
exporting an XML kbox to a postgis one; if you're curious about how that's
done, look in the aries.administration plugin for the load-data.clj file,
and it's a pretty big box you'll be opening). So if you try to run anything
at your end and it doesn't work, that's why. I also implemented a
sophisticated gazetteer that can be initialized with shapefiles - if you
have any locations (polygons only) that you would like to show up when you
type their names in ARIES, please send them to me. It can be seen now in
the online application.

Enough confusing info for now?

Ciao ferd

---
Ferdinando Villa, Ph.D.
Research Professor, Ecoinformatics Collaboratory
Gund Institute for Ecological Economics, University of Vermont
http://ecoinformatics.uvm.edu

Tara Athan

Mar 27, 2010, 4:06:50 PM
to aries-...@googlegroups.com
Ferdinando Villa wrote:
> Hi Tara, thanks, great stuff. I have never thought of going the other way.
> The output is correct. When you feel that you're finished with it I may
> actually take your code and make it automatically update geoserver through
> its REST interface (or if you want to investigate that, it may take less
> time to code that in than to put in all kws by hand).
>
I tried looking into the REST interface, but without much success.
Following the Geoserver documentation, I tried a GET url:
http://ecoinformatics.uvm.edu/geoserver/web/layers.html
along with several other variations, but I just get 404 errors.

Is the Geoserver REST extension installed? If so, what am I missing?

Tara

Ferdinando Villa

Mar 27, 2010, 4:25:30 PM
to aries-...@googlegroups.com, Tara Athan
Oops, no, it wasn't. It is now - sorry.

Apparently you can do lots through it, but I'm unsure if you can modify
keywords. Anyway, see if you can get a few requests in - it looks like a
powerful means for programmatic configuration anyway.

Ferdinando

Tara Athan

Mar 27, 2010, 5:25:59 PM
to Ferdinando Villa, aries-...@googlegroups.com
OK, GET is working. Also, the URL has to look like:

http://ecoinformatics.uvm.edu/geoserver/rest/layers.html

Not sure about keywords, we'll see...
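For reference, a minimal GET against the REST interface needs HTTP basic auth; a stdlib-only Python sketch (the admin credentials are placeholders for whatever the server is configured with):

```python
# Minimal GET against GeoServer's REST interface using only the stdlib.
# Credentials are placeholders -- substitute the server's real ones.
import base64
import urllib.request

BASE = "http://ecoinformatics.uvm.edu/geoserver/rest"

def rest_url(resource, fmt="xml"):
    """Build a REST URL such as .../rest/layers.xml"""
    return "%s/%s.%s" % (BASE, resource, fmt)

def rest_get(resource, user="admin", password="geoserver"):
    req = urllib.request.Request(rest_url(resource))
    token = base64.b64encode(("%s:%s" % (user, password)).encode()).decode()
    req.add_header("Authorization", "Basic " + token)
    with urllib.request.urlopen(req) as resp:  # requires network access
        return resp.read().decode()

# rest_get("layers") would return the XML listing of all layers
```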
Tara

Tara Athan

Mar 29, 2010, 3:07:41 PM
to aries-...@googlegroups.com
With the common.xml file as it is, I will have to walk through the xml
returned by the REST interface for the whole geoserver structure to
obtain the store name associated with each layer. It would be nice if we
could add an element to common.xml like so:

<geospace:hasRestUrl>http://ecoinformatics.uvm.edu/geoserver/rest/workspaces/wsname/datastores/dsname/featuretypes/fname</geospace:hasRestUrl>
(and similarly for coverages)

so we could use this to GET just the xml for one layer at a time.

Would there be any problem with that (assuming I do the work of putting
it in)?
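Since the value would just follow GeoServer's documented REST path layout, it could even be generated rather than typed in by hand; a sketch (the helper names are illustrative):

```python
# Build the per-layer REST URLs such an element would hold; the path
# templates follow GeoServer's REST layout for feature types and coverages.
BASE = "http://ecoinformatics.uvm.edu/geoserver/rest"

def featuretype_url(workspace, datastore, featuretype):
    return "%s/workspaces/%s/datastores/%s/featuretypes/%s" % (
        BASE, workspace, datastore, featuretype)

def coverage_url(workspace, coveragestore, coverage):
    return "%s/workspaces/%s/coveragestores/%s/coverages/%s" % (
        BASE, workspace, coveragestore, coverage)
```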

Tara

Ferdinando Villa

Mar 29, 2010, 3:43:14 PM
to aries-...@googlegroups.com
Hey Tara - no, it would be fine, but the tags map directly to ontologies,
so I'd put it in the "metadata" namespace, which is only for ad-hoc
annotations. Something like <metadata:hasRestUrl> is OK - I'll add the
property to the metadata ontology.

Thanks f
