Hi,

I'm trying to determine all the possible intersections in sets of polygons. The approach I'm using is to load each set of polygon perimeters into a PostGIS topology and extract all the faces as geometries. This results in some new polygons lying between the input ones, which is the nature of the technique; that is not the problem. I then count the number of original polygons that each new one overlaps, which should range from zero upward.

My problem is that I'm getting new polygons, derived from the boundaries of the old ones and actually lying between them, which report a non-zero (but very small, around 1e-15) area of overlap with an old polygon, even though all they have in common is the shared boundary line, which should have zero area. So some of my new polygons supposedly overlap an old polygon when in fact they do not (or should not).

My solution at the moment is to reject (treat as zero area) any overlap smaller than 1e-15, so the overlap count represents the "real" data. It works for my purposes, but is not ideal.

Is this a bug? An inevitable result of near-zero values and finite precision? Is there a better way of dealing with this issue?

Thanks,
Brent Wood
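The tolerance workaround described above can be sketched outside the database; the function names and the 1e-12 threshold here are illustrative, not part of PostGIS, and the right cutoff depends on your coordinate magnitudes:

```python
def shoelace_area(coords):
    """Unsigned polygon area via the shoelace formula (planar coordinates)."""
    area = 0.0
    n = len(coords)
    for i in range(n):
        x1, y1 = coords[i]
        x2, y2 = coords[(i + 1) % n]
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0

# Tolerance below which an overlap area is treated as floating-point noise.
EPSILON = 1e-12

def real_overlap(area, eps=EPSILON):
    """Treat near-zero overlap areas as no overlap at all."""
    return area > eps

# A genuine overlap passes the filter...
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
assert real_overlap(shoelace_area(square))

# ...while an area of 1e-15 (a shared boundary reported as a sliver) does not.
assert not real_overlap(1e-15)
```

In SQL the same filter is typically spelled as a predicate like `ST_Area(ST_Intersection(a.geom, b.geom)) > 1e-12`, with the threshold being the same judgement call.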
_______________________________________________
postgis-devel mailing list
postgi...@lists.osgeo.org
http://lists.osgeo.org/cgi-bin/mailman/listinfo/postgis-devel
Yes, true.. but on a thorough read you might notice that the gdal_retile.py experiment was largely ineffective.
If you click on the link at the top to the *next post*,
you will find the one that really worked well.. in fact, we used that second approach in production for months, to great effect.
The trick on one machine was to split the work by some constant, and then make a psycopg2 connection for each "bucket."
This worked very well..
Since then I have experimented only a tiny bit with Spark from the Berkeley AMPLab for distributing the workload over a Hadoop file system, but that world has no GEOS (yet)
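The split-into-buckets trick described above might look roughly like this; the worker body is hypothetical (in the real setup each worker would open its own psycopg2 connection and run its share of the tiling work), but the bucketing itself is just a constant-size partition:

```python
from multiprocessing import Pool

def make_buckets(ids, bucket_size):
    """Split the work into fixed-size buckets, one per worker/connection."""
    return [ids[i:i + bucket_size] for i in range(0, len(ids), bucket_size)]

def process_bucket(bucket):
    # Hypothetical worker: in the real setup this is where each process
    # would open its own psycopg2 connection, e.g.
    #     conn = psycopg2.connect(dsn)
    # and run the queries for its bucket of ids.
    # Here we just return the bucket size as a stand-in result.
    return len(bucket)

if __name__ == "__main__":
    ids = list(range(100))           # e.g. tile ids to process
    buckets = make_buckets(ids, 25)  # the "constant" split
    with Pool(len(buckets)) as pool:
        results = pool.map(process_bucket, buckets)
    assert results == [25, 25, 25, 25]
```

One connection per bucket keeps the workers independent, so the parallelism is limited only by how many connections the database will tolerate.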
--
Brian M Hamlin
OSGeo California Chapter
Hi,
I have a dataset with start and end points that we are representing as linestrings. Many records do not have endpoints, so we are taking the median bearing for comparable start points and creating an arbitrary endpoint.
While it is easy to get a bearing for a known endpoint with ST_Azimuth(), is there a simple way in PostGIS to get the point at a defined distance and bearing? I've perused the docs and nothing jumped out as an obvious solution.
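For geography data, PostGIS's `ST_Project(geog, distance, azimuth)` does exactly this on the spheroid. For planar coordinates it is plain trigonometry; a sketch (note that ST_Azimuth measures clockwise from north, so `sin` goes with x and `cos` with y):

```python
import math

def point_at_bearing(x, y, azimuth, distance):
    """Destination point on a plane, given azimuth in radians
    measured clockwise from north (the ST_Azimuth convention)."""
    return (x + distance * math.sin(azimuth),
            y + distance * math.cos(azimuth))

# Due north by 10 units: x unchanged, y + 10.
px, py = point_at_bearing(0.0, 0.0, 0.0, 10.0)
assert math.isclose(px, 0.0, abs_tol=1e-9) and math.isclose(py, 10.0)

# Due east (azimuth pi/2): x + 10, y unchanged (up to rounding).
ex, ey = point_at_bearing(0.0, 0.0, math.pi / 2, 10.0)
assert math.isclose(ex, 10.0) and math.isclose(ey, 0.0, abs_tol=1e-9)
```

The planar SQL equivalent is the same trigonometry, e.g. `ST_Translate(pt, d * sin(az), d * cos(az))`.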