All or part of objects invalid (again)


Rob Large

Jan 14, 2013, 9:16:50 AM1/14/13
to mapi...@googlegroups.com
I am aware that this is an issue which has been much discussed on here, but I have searched and cannot find the answer I need.

Currently engaged in a major data cleaning operation, I have been plagued by invalid objects and the error message "All or part of objects invalid". I have employed a number of different strategies to track down the invalid items, with varied success, so I am hoping to code a solution which can identify such objects and flag them for correction. So far the only way I have found to identify them is to test each object in turn, performing a geographical operation which triggers the error, which I then trap. As the dataset is very large, this has become very time consuming.
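For what it's worth, the brute-force trapping loop I'm using looks roughly like this (a sketch only: "MyTable" and its "Flagged" column are made-up names, and Buffer is just one geoprocessing call that forces MapInfo to evaluate the geometry):

Include "MAPBASIC.DEF"

Dim i, nRows As Integer
Dim testObj As Object

nRows = TableInfo("MyTable", TAB_INFO_NROWS)
OnError Goto BadObject
For i = 1 To nRows
    Fetch Rec i From MyTable
    ' any geoprocessing operation that triggers error 1448 will do
    testObj = Buffer(MyTable.obj, 12, 20, "m")
Next
End Program

BadObject:
    ' flag the row whose object tripped the error, then carry on
    Update MyTable Set Flagged = "T" Where RowID = i
    Resume Next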

What I am seeking is a definitive definition of what actually constitutes an invalid object, so I can search for the specific problem rather than trying to mimic the symptom as it were. MapInfo clearly knows what an invalid object is, but I have yet to find any decent definition.

I have come across references to using Select * Where Str$(obj) <> Region, but does this work with multi-part objects?

Bill Thoen

Jan 14, 2013, 1:05:31 PM1/14/13
to mapi...@googlegroups.com
On 01/14/2013 07:16 AM, Rob Large wrote:
> What I am seeking is a definitive definition of what actually
> constitutes an invalid object, so I can search for the specific
> problem rather than trying to mimic the symptom as it were. MapInfo
> clearly knows what an invalid object is, but I have yet to find any
> decent definition.
>
> I have come across references to using Select * Where Str$(obj) <>
> Region, but does this work with multi-part objects?

Have you tried:

SELECT * FROM table WHERE obj

This tests false for each record that contains an invalid object, and the
record is skipped; the SELECT statement therefore returns all the good
records. Alternatively, you can select all the bad records using
'NOT obj'.

However, even though I've never picked up a bad record that tested good
using this method, I'm skeptical that it's 100% accurate. There's a lot
of room for pathological corner cases in a MapInfo map feature and I
wouldn't be surprised if you found one.

I'm not sure how it behaves with multipart objects. I don't have any
evidence to suspect that it doesn't work, but you could always run it
through the disaggregating routine and test the individual parts, if you
do encounter problems.

Another way to rid yourself of these annoying broken geometries is to
just not accept AutoCAD files or charge your data supplier $1 or $10 or
whatever for each one you hit that wastes your time ;-) If you could
make that stick and you stay in the business, you would grow very rich.
Data cleaning is often one of the biggest costs of any GIS job, and it's
often underestimated. Especially when dealing with data generated by
other systems that have different opinions about what's important.

- Bill Thoen

Robert Crossley

Jan 14, 2013, 4:38:16 PM1/14/13
to mapi...@googlegroups.com
I agree with Bill in terms of data cleaning. I have a customised program
for editing farm and paddock data specifically, so I have built up some
tests that I use before adding data back to the corporate databases.
Recently I received data from 4 different Arc sites, and not one passed
through the error checking without requiring fixing up.

I have found records that have no objects; I get rid of them by running
Select * from table where Not Obj and deleting the resulting records.

The clean functions in MapInfo are not too bad at finding errors, so use
these. (Objects > Clean)

After you get rid of the ones that MI thinks have no object or a bad
object, you can also catch empty objects with:

Select * from Table Where Int(ObjectInfo(Obj,21)) = 0

Then there are the multi-polygon regions where one polygon is invalid. I
put this test in a loop, where I test each polygon (up to the max number of
polygons in the table). You could also do this manually.

MaxPolyNumber is found by:

Select Max(Int(ObjectInfo(Obj, 21))) From Table

For polynumber = 1 To MaxPolyNumber
    Select * From Table Where Int(ObjectInfo(Obj, 21)) >= polynumber Into SelPolys
    Select * From SelPolys Where Int(ObjectInfo(Obj, 21 + polynumber)) < 4
Next

Regards
Rob.
--
You received this message because you are subscribed to the Google Groups
"MapInfo-L" group.To post a message to this group, send email to
mapi...@googlegroups.com To unsubscribe from this group, go to:
http://groups.google.com/group/mapinfo-l/subscribe?hl=en
For more options, information and links to MapInfo resources (searching
archives, feature requests, to visit our Wiki, visit the Welcome page at
http://groups.google.com/group/mapinfo-l?hl=en

Rob Large

Jan 15, 2013, 9:49:42 AM1/15/13
to mapi...@googlegroups.com
Thanks to you both for your time, however:

Bill, surely your method selects records where there is no object associated, not where there is an object but it is functionally invalid; I have already tried this to no avail. I agree with your sentiment about data suppliers; unfortunately most of this dataset was produced in-house. The invalid objects I have managed to locate are mostly old and date from a time before we had access to polygonised base mapping data, so they were hand digitised against raster base maps. Subsequently they have been modified by an unknowable number of processes. If I had a way of finding them I would be able to re-create them from the original sources (maybe).

Rob, it is indeed through use of Objects Clean that I am finding them, but this is problematic, since the routine does not indicate which objects are invalid, only that some are. Also it will only reveal invalid objects in cases where the object geography will be changed by the clean process (i.e. there is something to clean). We also have an issue with insufficient memory, so I am having to write code to subset the data geographically, test with Clean and flag the areas where invalid objects occur. This is a very tedious process and I am spending a lot of time watching a progress bar, before I can even begin to pin down what the precise problem is.

I admit I haven't tested for empty objects yet, but I have run a routine which checks each polygon in all compound objects and this has found no polygons with less than 4 nodes in the dataset.

All of which is why I phrased my question as I did. I want to know precisely what properties of a given object cause MI to regard it as invalid.


Bill Thoen

Jan 15, 2013, 3:49:54 PM1/15/13
to mapi...@googlegroups.com
On 01/15/2013 07:49 AM, Rob Large wrote:
> Thanks to you both for your time, however:
>
> Bill, surely your method selects records where there is no object
> associated, not where there is an object but it is funtionally
> invalid, I have already tried this to no avail. I agree with your
> sentiment about data suppliers, unfortunately most of this dataset was
> produced in-house. The invalid objects I have managed to locate are
> mostly old and date from a time before we had access to polygonised
> base mapping data, so were hand digitised against raster base maps.
> Subsequently they have been modified by an an unknowable amount of
> processes. If I had a way of finding them I would be able to re-create
> them from the original sources (maybe).
Try dumping the table to a MIF/MID file and see what you get. I'd be
curious to know what drops out too. But I bet a lot of those old invalid
features can't be reproduced with the more modern versions of MapInfo,
because I think there's been some attention paid to preventing garbage
from getting past the data intake "pipe". You could try creating some
invalid objects in a MID/MIF, such as regions with no area or points
with Pline geometries, and see, but I don't think these bad geometries
would be allowed to pass into a TAB file nowadays.

I vaguely remember using the logical test "if obj" on tables and
catching the invalid objects as well as the null ones, but I could be
mistaken about that (it was quite a while ago). However, I think you
could still use the idea by writing an expression that forces MapInfo to
evaluate the object. For example, suppose your table is supposed to
contain only points, with an x range from -109 to -102 for valid points.
If you try something like this:

SELECT * FROM mytable WHERE CentroidX(obj) BETWEEN -109 AND -102

you force an evaluation of the object, which should trip if the object
is physically malformed (if it doesn't actually stop dead with an error),
and you also catch the ones that are invalid because they are out of range.

Years ago, this same subject came up on MapInfo-L, and I think Jacques
Paris did some directed research on the problem and wrote up his findings
on broken/invalid features and what they looked like. The patterns
were nearly all cases of polygon self-intersection, or multi-segment
polylines not actually joined into one object and with nodes apparently
out of sequence.

Andy Harfoot

Jan 16, 2013, 8:07:59 AM1/16/13
to mapi...@googlegroups.com
Hi Rob,

In my experience, the objects in a table are rarely invalid themselves, but instead the offending object is the product of a geoprocessing operation such as an intersect, and the error is thrown when such an operation is run. This makes the capture and exploration of what constitutes an invalid object tricky.

In the past I have written a number of MapBasic routines that undertake processing on an object by object basis so that the "All or part of objects invalid" error can be trapped and flagged without preventing the remainder of the dataset from being processed. Generally I have resolved these situations by manually creating the result. I've had a quick look at some of these and seem to have found at least one situation that is reproducible:

One type of object that raises a 1448 "All or part of objects invalid" error when a split or clean operation is run on it can be defined as follows:

A polygon whose start and end node is coincident with the start and end node of a hole within it.

I have attached a MIF file that, when imported, will create a polygon containing a hole with the start and end nodes of the exterior and interior polygons coinciding. Interestingly, merely viewing these polygons in MI 11.5 with the 'Show Nodes' option enabled seems to result in sporadic crashes.

Of course, there's no guarantee that this is the only situation that generates this effect, but it is one that can be tested for at least, and may provide pointers to a more general property of these troublesome polygons.
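To make the case concrete, a region of this shape can be sketched directly in MIF (the coordinates here are invented for illustration; note that the outer square and the triangular hole both start and end at the same node, 0 0):

Version 300
Charset "WindowsLatin1"
Delimiter ","
CoordSys NonEarth Units "m" Bounds (-100, -100) (100, 100)
Columns 1
  ID Integer
Data

Region 2
  5
0 0
10 0
10 10
0 10
0 0
  4
0 0
3 1
1 3
0 0

(The accompanying MID file needs a single line, e.g. "1", to match the one column.)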

Cheers,

Andy




-- 
Andy Harfoot

GeoData Institute
University of Southampton
Southampton
SO17 1BJ

Tel:  +44 (0)23 8059 2719
Fax:  +44 (0)23 8059 2849

www.geodata.soton.ac.uk
Error1448_HoleExample.mif

Rob Large

Jan 16, 2013, 9:36:03 AM1/16/13
to mapi...@googlegroups.com
Thanks again for your time Bill

Neither Not obj nor evaluation using CentroidX has proved useful in selecting the invalid objects. However, having managed to identify one of the invalid objects in a backup of the dataset prior to cleaning, I tried your suggestion of exporting to MIF and re-importing. This seems to have cleaned the object, in the process causing the number of polygons comprising the object to drop from 185 to 102 and leaving the object apparently valid.

Comparing the two objects (pre- and post- export/import) there are visible differences, several holes having been filled in by the cleaning process (although nowhere near 80 holes have gone), but disaggregating the original object to isolate one of these hole polygons did not reveal a problem with the geometry of the hole.

The structure of the MIF file makes it very hard to analyse what changes have occurred, since the object is too complex and the node count of each of the component polygons changes slightly during import/export. I could spend a lot of time analysing the data, but would probably be none the wiser as to what the problems are. If I manage to find a simpler example I may try again.

All of which leaves me frustrated. Why did the designers not see fit to include a function which could identify and select such invalid geometries? The importer is clearly capable of this, but its functionality is not exposed to us.

Thanks anyway

Eric Blasenheim

Jan 16, 2013, 9:53:29 AM1/16/13
to mapi...@googlegroups.com
Andy,
 
There is much of what you say here that is essentially true but I think I can expand a bit.
 
The error definitely comes from the object processing code and is basically just a warning, although technically we don't have warnings in MapBasic. The reason I say "warning" is that the overall process does not stop, but the error does get set. This makes it difficult to trap appropriately, as you say.
 
The problem is that we don't know for sure that the geometry is invalid, only that the geoprocessing found a situation it could not handle. And depending on the operation, it could be the result of a previous calculation, so the original object may not have been bad; perhaps its neighbour was, or an intermediate result in a large set of data. Generally there is some bad polygon involved. And the situation you have sent is just one of a number of possibilities.
 
As to crashing with display nodes, there is not really any relationship between the geoprocessing code and the display. We don't check polygons on rendering. The only case I can think of is if you use the older method of the "Clip Region" feature, which uses geoprocessing to clip each feature to the specified clip areas. The default options for this feature have, for many years, just used rendering clipping via Windows, but the older one still exists. The other case is the labelling option of "Partial Objects", which uses geoprocessing to produce the part of a geometry in the map view.
But if you can reproduce this please send the information along. 
Also someone asked if Select * Where Str$(obj) <> Region works with multi-part objects and the answer is definitely yes, although you do have to put quotes on the "Region". You can also use where  Int(ObjectInfo(obj, 1)) =  7  (7 is type region)
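Spelled out (mytable is a placeholder name), the two equivalent tests for finding non-region objects are:

Select * From mytable Where Str$(obj) <> "Region"
Select * From mytable Where Int(ObjectInfo(obj, 1)) <> 7

Both conditions are evaluated per row against the whole object, so a multi-part region still counts as a single "Region".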
 
 
Eric Blasenheim
Pitney Bowes Software

Eric Blasenheim

Jan 16, 2013, 10:09:39 AM1/16/13
to mapi...@googlegroups.com
Rob
 
The Import from MIF does not do object correctness checking. What it does is what we always do when creating a new object or editing an object. Since the MIF format does not specify what is a hole and what is not, the exact same code gets called to analyse the raw data and decide whether something is entirely inside another polygon (and thus a hole) or not (which makes it an island). Polygons that overlap are islands, even though they are dirty.
I don't think the code throws away polygons with too few nodes. If the total region has 0 nodes, it will not create the object in the resulting file.
So I don't know why, other than a corrupted MAP file, an export/import would fix this.
 
Eric Blasenheim
Pitney Bowes Software
 
The exporting code also has no fancy logic; it is merely a dump.

Andy Harfoot

Jan 16, 2013, 9:51:54 AM1/16/13
to mapi...@googlegroups.com
Hi Rob,

Could you share the original object, suitably anonymised if necessary?


Cheers,

Andy


Rob Large

Jan 16, 2013, 11:04:42 AM1/16/13
to mapi...@googlegroups.com
My pleasure

Attached is the offending object, which consistently causes the error when using Clean to remove gaps smaller than 0.25 hectares. I have tried Check Regions and there are apparently no self-intersections, but the error is caused even by checking for gaps and/or overlaps (even if the max gap area is set to zero). The error is almost instantaneous, so it looks like some kind of checking is taking place. I was a bit surprised by this, as I would have thought that Objects Check Regions Gaps 0 "hectare" would be equivalent to doing nothing at all, or at least should not generate any data.

I would not rule out the possibility of a corrupted MAP file, but with a polygon of this complexity I don't have the time or the tools to investigate further. The export and import seems to have fixed the problem for now, without causing unacceptable changes to the geometry as far as I can tell, but I expect to uncover more...

Thanks

I have noted another problem related to the same processing.

I now have a layer containing regions covering an area of maybe 100,000 hectares, without gaps or overlaps, representing the whole of a project area on which I am working. Initially I digitised an area larger than I needed and then used Erase Outside along with a single polygon of the project boundary to remove any excess data. I then created a blank polygon mask to go around the outside by creating a large rectangle and erasing out the middle using the same boundary polygon.

I would expect, since both were cut using the same polygon (one inside, one out), that there would be a good match between these, but when I combine the dataset and the mask in a single layer Check Regions consistently creates numerous sliver polygons around the edges, suggesting the fit is not perfect. I tried using Clean to close the gaps, but this generates numerous self intersections instead (even though the Clean dialog claims that self-intersections are always removed).

Any comments
BadPoly.zip

Andy Harfoot

Jan 16, 2013, 11:06:18 AM1/16/13
to mapi...@googlegroups.com
Hi Eric,

I accept that the error/warning could only arise in combination with other geometries as part of a process, and as such would be impossible to predict. However, it would be useful to be able to identify all polygons, like the example, that will cause the warning to be raised regardless of the context. These could then be fixed as part of a data cleaning process, reducing the likelihood of error/warning messages subsequently. Is it possible to share the other possibilities that you mention so that these can be checked for by users?

Unfortunately I don't think I can consistently reproduce the crash. The only thing I can say is that whilst viewing a couple of polygons of the type described, MapInfo (11.5.2) crashed five times, and in all cases the 'Show nodes' option was enabled for the polygon. I wasn't using the clip region controls, nor did I have any labels displayed. It may easily have been my machine 'hiccupping'!

Cheers,

Andy



Rob Large

Jan 16, 2013, 11:14:47 AM1/16/13
to mapi...@googlegroups.com
A further thought occurs to me, which might be useful. After several attempts to clean this polygon I finally disaggregated it (leaving the holes in place), removed each of the large polygons (four or five of them, I think) from the selection, and deleted what was left. This fixed the problem, suggesting that the problem does not relate to a hole, but to one or more small outlying polygons external to the main area.

Rob Large

Jan 16, 2013, 11:35:50 AM1/16/13
to mapi...@googlegroups.com
I have been looking at the example object you posted Andy and I would be interested to understand precisely what it is that causes the error to occur. I imagine that it will be some kind of divide by zero error or similar overflow, in which case why has it not been trapped and circumvented?

Clearly the coincidence of start and end nodes between two nested polygons such as this is an unlikely event, but it is far from impossible with real world data. I can imagine a number of scenarios in which it could occur. Again, why has this very special case not been trapped and circumvented?

I don't very often find myself needing to do this kind of data cleaning, but whenever I do I find the inadequacy of the native tools in MI and the glib opacity of the documentation rather frustrating.

Søren Breddam

Jan 16, 2013, 4:51:48 PM1/16/13
to mapi...@googlegroups.com

Hi Rob,

 

I tried to disaggregate your data:

Objects Disaggregate Into Table BadPoly Data Parcel_ID=Parcel_ID

Then I tried to query your data:

Select Area(obj, "sq m") from BadPoly where Area(obj, "sq m") < 1 into a

Then I tried to delete the small areas:

Delete From a

Then I tried to check your data with Objects Check:

Objects Check From BadPoly Into Table BadPoly Overlap Pen (1,2,0) Brush (2,16776960,0)

Then all the data seem OK.

 

 

Søren Breddam
Teknik og Miljø
sor...@stevns.dk
Hovedgaden 46
4652 Hårlev
Mobil: +45 28 95 30 34
Telefon: +45 56 57 51 71
www.stevns.dk

Søren Breddam

Jan 16, 2013, 4:57:14 PM1/16/13
to mapi...@googlegroups.com

Well, I also inspected your data bounds. They are wide :-)

You could tighten your bounds and increase your precision if you optimize bounds from:

Bounds (-7845061.1011, -15524202.1641) (8645061.1011, 4470074.53373)

To

Bounds (401368.121128, 143499.457963) (414643.6721, 156026.932011)

 

Hth

Søren

Bill Thoen

Jan 16, 2013, 5:03:53 PM1/16/13
to mapi...@googlegroups.com
If you really want to *know* what's in these bad features, there is another option. Depends how interested / annoyed / frustrated you are, and whether you can or want to write a program, but you could download the open source MITAB.dll library, write a short program to open a rogue map feature object directly off the disk, and take it apart bit by bit, right down to the bare metal.

The dll library is available free from this site: http://mitab.maptools.org/

Of course, that wouldn't tell you anything about how MapInfo is reading these data sets or how and when it processes them after they're read from disk (and it sounds like there are some issues there too). But the option is there if you want it.

- Bill Thoen

Andy Harfoot

Jan 17, 2013, 4:33:17 AM1/17/13
to mapi...@googlegroups.com
Yes, the default bounds for British National Grid are unnecessarily large; however, MapInfo includes an alternative option using bounds of (0,0) (2000000,2000000) to give 1mm precision in the stored coordinates. This is unlikely to have any bearing on whether the polygon is bad or not, but it's good practice to use these over the default. You can get the 1mm bounds by choosing the 'British National Grid (1mm accuracy)' coordinate system.
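For example, when saving a copy of a table you can spell the tighter bounds out in the CoordSys clause yourself (a sketch; MyTable is a placeholder, and the projection parameters are the usual British National Grid values, so check them against your own MAPINFOW.PRJ before relying on them):

Commit Table MyTable As "MyTable_1mm.tab"
    CoordSys Earth Projection 8, 79, "m", -2, 49, 0.9996012717, 400000, -100000
    Bounds (0, 0) (2000000, 2000000)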

Andy





Andy Harfoot

Jan 17, 2013, 5:11:11 AM1/17/13
to mapi...@googlegroups.com
I disaggregated the large polygon (retaining holes) into 21 parts, and then ran a sequence of Check Objects with different combinations of the 21 parts selected. The 1448 Invalid error is raised if all the parts are checked, but if you remove one of the small parts - a triangle inside a hole - the error is not raised when the remaining parts are checked together. Interestingly, this triangle touches the largest part of the disaggregated polygon at its start/end point - I'm beginning to see a pattern here!

I have attached the Exploded parts and offending part as separate tab files.

This seems to fix the problem, as do a number of methods suggested by others; however, I believe the other approaches are solving the issue by coincidence, as the offending part happens to have a small area in this case, rather than through an understanding of what causes the problem.

Interestingly, if you recombine the remaining 20 fragments once the offending one is removed, the resulting polygon has its start/end nodes in different locations anyway! This leads to two conclusions:

- If you disaggregate and recombine the original polygon without altering it in any way, this may fix the problem, as the start/end nodes move (this needs more investigation, but in this case it appears to shift the problem elsewhere in the polygon; I need to check exactly where and why)

- Even if the bad component of the polygon is fixed, subsequent processing may recreate the problem by causing the start/end node of a polygon to move to a 'bad' position.
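In MapBasic, the disaggregate-and-recombine experiment can be sketched like this (table and column names are placeholders; Objects Disaggregate operates on the current selection, and the bad part would be deleted from Parts before recombining):

' Explode the selected region into its component polygons
Objects Disaggregate Into Table Parts Data ID=ID

' ...inspect/delete the offending part, then rebuild one object:
Dim merged As Object
Fetch First From Parts
merged = Parts.obj
Fetch Next From Parts
Do While Not EOT(Parts)
    merged = Combine(merged, Parts.obj)
    Fetch Next From Parts
Loop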

As a footnote, MapInfo has crashed four times whilst looking at this polygon this morning!


Cheers,

Andy



Andy Harfoot

Jan 17, 2013, 5:23:47 AM1/17/13
to mapi...@googlegroups.com
Hi Rob,

This is due to the fact that the clipped digitised polygons have more nodes at the edge where they meet the project area boundary than the erased rectangle does, reflecting the additional subdivisions of the project area into your digitised regions. Wherever there is an extra node, that version of the boundary will be offset from the erased rectangle version, generating an overlap or sliver. This is a result of the finite precision to which coordinates can be stored in a digital environment. ArcGIS allows a coordinate tolerance to be specified that lets this problem be ignored, but MapInfo doesn't have the same facility.

One solution would be to reconstruct your area of interest boundary by combining all the clipped digitised polygons (thus preserving the additional nodes), and then use the result to erase the large rectangle.
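As a sketch of that approach in MapBasic (table names are placeholders, and I'm assuming the Combine() and Erase() object functions here rather than the menu commands):

Dim boundary, mask As Object

' Merge all the clipped polygons back into one boundary object
Select * From ClippedRegions Into Sel
Fetch First From Sel
boundary = Sel.obj
Fetch Next From Sel
Do While Not EOT(Sel)
    boundary = Combine(boundary, Sel.obj)
    Fetch Next From Sel
Loop

' Erase the rebuilt boundary from the large rectangle to form the mask
Fetch First From BigRect
mask = Erase(BigRect.obj, boundary)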

Andy




Andy Harfoot

Jan 17, 2013, 5:25:15 AM1/17/13
to mapi...@googlegroups.com
And here's the attachment!
MI-L_BadPoly.zip