Looking at the Puget floodplain layer, someone has processed it to have
only two multipolygons, one for the A zone (100-yr) and one for the
X500 zone (500-yr), and a single attribute field with these zone codes.
The XML element is set up as Boolean, with no hasValueAttribute field,
and the model is expecting 0/1:
(defmodel floodplains 'floodService:Floodplains
"Presence of a floodplain in given context"
(classification (ranking 'floodService:Floodplains)
0 'floodService:NotInFloodplain
1 'floodService:InFloodplain))
So it looks to me like it ignores the attribute table altogether and
lumps the 100- and 500-yr floodplains together (?!)
So clearly the California layer won't do as is.
Looks to me like it would be handy to have an ArcObjects (or GRASS)
script to process the Q3 shapefiles into the form needed, so you get a
uniform product from one region to another. I don't think casual GIS
users will be able to figure this out unless they have a detailed set of
instructions.
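Something like this is what I have in mind for the classification step -
a rough Java/GeoTools sketch only (the "ZONE" field name and the set of
A-zone codes are my guesses, not checked against the actual Q3 spec, and
writing out the reclassified shapefile is omitted):

import java.io.File;
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;
import org.geotools.data.shapefile.ShapefileDataStore;
import org.geotools.data.simple.SimpleFeatureIterator;
import org.opengis.feature.simple.SimpleFeature;

public class Q3Reclass {
    public static void main(String[] args) throws Exception {
        // Zone codes treated as the 100-yr (A) floodplain; everything
        // else, including X500 (500-yr), maps to 0. ASSUMED codes -
        // check against the actual Q3 attribute values.
        Set<String> zoneA =
            new HashSet<String>(Arrays.asList("A", "AE", "AH", "AO"));
        ShapefileDataStore store =
            new ShapefileDataStore(new File(args[0]).toURI().toURL());
        SimpleFeatureIterator it =
            store.getFeatureSource().getFeatures().features();
        try {
            while (it.hasNext()) {
                SimpleFeature f = it.next();
                // ASSUMED attribute name; the Q3 field may differ.
                String zone = String.valueOf(f.getAttribute("ZONE"));
                int presence = zoneA.contains(zone) ? 1 : 0; // the 0/1 the model expects
                System.out.println(f.getID() + " zone=" + zone + " -> " + presence);
            }
        } finally {
            it.close();
        }
    }
}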
Tara
--
Tara Athan
Owner, Athan Ecological Reconciliation Services
tara_athan at alt2is.com
707-272-2115 (cell, preferred)
707-485-1198 (office)
249 W. Gobbi St. #A
Ukiah, CA 95482
Ken
--
Ken Bagstad, Ph.D.
Postdoctoral Associate
Project Economist, BLM-USGS Ecosystem Services Valuation Pilot
Gund Institute for Ecological Economics
University of Vermont
617 Main Street
Burlington, VT 05405
(802) 656-4094
Thanks for catching the duality in the Puget data, Tara - I had no
idea. Otherwise, having multiple files with different attributes for
the same concept is not a problem as long as the semantics is right -
the system will select the proper one and apply the necessary
transformations. Obviously, for something as simple as floodplains, a
single vector layer would be the ideal situation....
f
Ken
Before I go through all the layers in geoserver and paste in the
appropriate keywords, I thought I would check:
1) Is this really the syntax you want?
2) Is this all the information that would be needed to reconstruct the
xml file from geoserver metadata?
(I noticed on coverages that I had to determine the XRangeMax and
YRangeMax by other means, but perhaps there is a way to get this info
from a query to Geoserver? The Geoserver documentation pages on WCS
are all blank.)
3) Do you have any objections to my inserting these keywords?
My purpose is to get the existing geoserver layers "harmonized" with the
xml file so that when new users, such as myself and the students in your
class, insert new layers, we are more likely to add the appropriate
keyword phrase if there are lots of good examples to draw from. And
secondarily, to have the geoserver in good shape for the day when the
xml file will be generated automatically.
The output of the java program is attached.
Tara
PS. I used the javax.xml.parsers.DocumentBuilder, but it doesn't seem to
have a lot of functionality. Have you found an XML parser that is better
than this one?
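(For what it's worth, the pattern I've been using is roughly this -
bare DocumentBuilder plus XPath, which helps a little; the file name
is just a placeholder.)

import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathConstants;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;

public class ParseExample {
    public static void main(String[] args) throws Exception {
        // DocumentBuilder gives you the raw DOM; XPath makes
        // querying it bearable.
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().parse("layers.xml"); // placeholder
        XPath xpath = XPathFactory.newInstance().newXPath();
        NodeList keywords = (NodeList) xpath.evaluate(
                "//keyword", doc, XPathConstants.NODESET);
        for (int i = 0; i < keywords.getLength(); i++) {
            System.out.println(keywords.item(i).getTextContent());
        }
    }
}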
In terms of reading XML, WCS, and WFS, there are a lot of classes to
simplify that in thinklab - you may want to look at XMLDocument.java
(simplified read/write) and XML.java (functional-style XML creation). I
know you know how to use the find command in linux :)
WFS is handled well by geotools through their feature interface, but WCS
isn't, so there is code in geospace to read WCS capabilities, too. All
the info about x/y ranges, coordinates, and projections is of course
there. The code that reads that stuff is in WCSDatasource.java, and it
uses the XMLDocument class, so you can use it as an example. It actually
contains an addOpalDescriptor function that builds the XML from a
coverage read from WCS.
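To the earlier question about XRangeMax/YRangeMax: a DescribeCoverage
request will give you those, so no need to determine them by other
means. Roughly like this (a sketch only - the element names follow WCS
1.0, and the server URL and coverage name are placeholders):

import java.net.URL;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;

public class WCSRanges {
    public static void main(String[] args) throws Exception {
        // DescribeCoverage returns a gml:GridEnvelope whose "high"
        // corner holds the max grid indices (the X/YRangeMax, in effect).
        String url = "http://localhost:8080/geoserver/wcs?service=WCS"
                + "&version=1.0.0&request=DescribeCoverage"
                + "&coverage=puget:floodplains"; // placeholders
        DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
        dbf.setNamespaceAware(true); // needed for the gml: lookup below
        Document doc = dbf.newDocumentBuilder()
                .parse(new URL(url).openStream());
        NodeList high = doc.getElementsByTagNameNS(
                "http://www.opengis.net/gml", "high");
        if (high.getLength() > 0) {
            System.out.println("grid high corner: "
                    + high.item(0).getTextContent());
        }
    }
}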
The WCSToOPAL.java and WFSToOPAL.java classes implement thinklab commands to
go W{F|C}S -> xml observation specs. The WFSToOPAL command has an annotate()
function that does that interactively. Working with this stuff outside of
thinklab may be a challenge, but implementing thinklab commands is fairly
simple using Java annotations (see WFSToOpal for an example).
I made lots of changes in the codebase, and ARIES now runs off a real
spatial database instead of a catalog initialized from XML (the db is
initialized asynchronously and on request from the xml files -
basically exporting an XML kbox to a postgis one; if you're curious
about how that's done, look in the aries.administration plugin for the
load-data.clj file, though it's a pretty big box you'll be opening). So
if you try to run anything at your end and it doesn't work, that's why.
I also implemented a sophisticated gazetteer that can be initialized
with shapefiles - if you have any locations (polygons only) that you
would like to show up when you type their names in ARIES, please send
them to me. It can be seen now in the online application.
Enough confusing info for now?
Ciao ferd
---
Ferdinando Villa, Ph.D.
Research Professor, Ecoinformatics Collaboratory
Gund Institute for Ecological Economics, University of Vermont
http://ecoinformatics.uvm.edu
Is the Geoserver REST extension installed? If so, what am I missing?
Tara
Apparently you can do lots through it, but I'm unsure whether you can
modify keywords. See if you can get a few requests in - it looks like a
powerful means for programmatic configuration anyway.
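E.g., something along these lines should fetch the config XML for a
single feature type, keywords included (a sketch - the workspace,
datastore, and layer names are placeholders, and a PUT of the edited
XML back to the same URL is, as far as I know, how you would change
the keywords):

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class RestGet {
    public static void main(String[] args) throws Exception {
        // GET the feature-type config from the REST API;
        // workspace/datastore/layer names below are placeholders.
        URL url = new URL("http://localhost:8080/geoserver/rest"
                + "/workspaces/puget/datastores/main"
                + "/featuretypes/floodplains.xml");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        String auth = java.util.Base64.getEncoder().encodeToString(
                "admin:geoserver".getBytes()); // default credentials
        conn.setRequestProperty("Authorization", "Basic " + auth);
        BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream()));
        for (String line; (line = in.readLine()) != null; ) {
            System.out.println(line);
        }
        in.close();
    }
}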
Ferdinando
Quoting Tara Athan <tara...@gmail.com>:
So we could use this to GET just the XML for one layer at a time.
Would there be any problem with that (assuming I do the work of putting
it in)?
Tara
Thanks f