Using SQL Server tables in MapInfo Professional


Matt

Jan 4, 2012, 6:59:03 AM1/4/12
to MapInfo-L
Hi all,

We are currently exploring the idea of storing some or all of our
spatial data in a SQL Server 2008 spatial database.

Obviously we will need to view and edit this data in MapInfo but I
would be interested to hear from anyone who has any advice in terms of
best practice when serving the data corporately via MapInfo Pro.

For the most part, our spatial data is maintained by our specialist
officers in the GIS team so we would have to give some consideration
to the security and editing permissions which the other MI Pro users
would have.

As such, is it best to do away altogether with ‘live’ or ‘linked’ tab
files, which enable MI users to view the data held in the SQL Server
database? The former can run very slowly, especially when looking at
a view in the database, and the latter need to be refreshed in order
to keep the data up to date. Would we be better off creating a set of
flat tab files for corporate users on an hourly/nightly/weekly basis
in order to keep the data current and consistent? How would that be
done efficiently?

I mentioned that we might be using views from the database. This is
to leverage the spatial information in a more intelligent way than has
been done here traditionally. Here are a couple of examples of what
we’d like to do:

We could, for instance, set up a view which takes address information
for missed bin collections from one table (non-spatial - potentially
in another database), links it to our (spatial) address dataset using
the address ID and then aggregates them up spatially to an
administrative area level. Users could then open this tab/view
directly in MapInfo to see a count of missed collections by ward.
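A view along those lines might be sketched in T-SQL as below. Every table and column name (Wards, AddressPoints, MissedCollections, AddressID and so on) is a hypothetical stand-in for the real schema, and the cross-database reference assumes the collections table lives in another database on the same server:

```sql
-- Hedged sketch: all object/column names are placeholders.
-- SQL Server 2008 won't allow a geometry column in GROUP BY, so the
-- count is aggregated in a derived table and joined back to the wards.
CREATE VIEW dbo.vw_MissedBinsByWard
AS
SELECT
    w.MI_PRINX,                          -- key column MapInfo Pro recognises
    w.WardName,
    w.Geom,                              -- ward polygon
    ISNULL(c.MissedCount, 0) AS MissedCount
FROM dbo.Wards AS w
LEFT JOIN (
    SELECT w2.WardCode, COUNT(*) AS MissedCount
    FROM dbo.Wards AS w2
    JOIN dbo.AddressPoints AS a
        ON a.Geom.STIntersects(w2.Geom) = 1      -- spatial aggregation
    JOIN OtherDb.dbo.MissedCollections AS m
        ON m.AddressID = a.AddressID             -- non-spatial link via address ID
    GROUP BY w2.WardCode
) AS c ON c.WardCode = w.WardCode;
```

Opening such a view in MapInfo would then show one row per ward carrying its missed-collection count.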

Furthermore, given that so many of our datasets are tied to existing
geographies (address points or administrative/statistical boundaries),
I’d like to start maintaining a dataset in a non-spatial SQL Server
table and linking this information to the ‘main’ spatial tables to
create views. If, for argument’s sake, an address point was in the
wrong place or a parish boundary changed, we would only need to adjust
the ‘main’ spatial table once rather than having to change each and
every table which was based on that geography.
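That pattern can be sketched as follows (names again hypothetical): keep the geometry once in a ‘main’ table and expose each themed dataset as a plain attribute table joined through a view:

```sql
-- Hedged sketch with placeholder names: dbo.Parishes holds the
-- geometry once; dbo.ParishRecycling is an ordinary non-spatial
-- table keyed on ParishCode. The view marries the two for MapInfo.
CREATE VIEW dbo.vw_ParishRecycling
AS
SELECT
    p.MI_PRINX,
    p.ParishCode,
    p.ParishName,
    p.Geom,                  -- boundary maintained in one place only
    r.RecyclingRate
FROM dbo.Parishes AS p
JOIN dbo.ParishRecycling AS r
    ON r.ParishCode = p.ParishCode;
```

If a parish boundary then changes, only dbo.Parishes needs editing and every dependent view picks up the change automatically.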

Has anybody done anything like the above proposals and would they like
to offer any advice on best practice or techniques for achieving this?

Many thanks,

Matt

e.j.h.polle

Jan 4, 2012, 7:25:48 AM1/4/12
to MapInfo-L
Hi Matt,


If you have not already done so, I would recommend that you sign up
for the MUGUKI (MapInfo User Group UK and Ireland) Conference in
London on 26th January 2012.

During this conference a workshop will be held (10.00 - 13.00) on
MapInfo and SQL Server Spatial. Peter Horsbøll Møller will come over
from Denmark to run this workshop.

During this 3 hour session you might get a first answer to your
questions.

More information here: http://www.muguk.com/

HTH,


Egge-Jan

Lars I. Nielsen (GisPro)

Jan 4, 2012, 9:15:57 AM1/4/12
to mapi...@googlegroups.com
Hi Matt,

First of all, I would forget all about "live" access unless it's absolutely necessary. The performance is usually horrendous.

I would recommend only working with locally cached, linked tables with a subset of the server data, using some sort of selection criteria.

You may also want to consider utilizing some sort of go-between utility to fetch data as temporary local tables. This would enable handling security sensibly and intelligently, e.g. by unlinking tables for users without write access.

And design your server tables with an auto-numbered integer column named MI_PRINX as the primary key. Pro recognizes this name as a primary key.
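A minimal table definition following that advice might look like the sketch below; column names other than MI_PRINX are illustrative, and the spatial-index bounding box assumes British National Grid extents:

```sql
CREATE TABLE dbo.AddressPoints (
    MI_PRINX  INT IDENTITY(1,1) NOT NULL PRIMARY KEY,  -- auto-numbered key Pro recognises
    AddressID INT           NOT NULL,
    Street    NVARCHAR(100) NULL,
    Geom      GEOMETRY      NOT NULL
);

-- A planar spatial index needs an explicit bounding box:
CREATE SPATIAL INDEX six_AddressPoints_Geom
ON dbo.AddressPoints (Geom)
USING GEOMETRY_GRID
WITH (BOUNDING_BOX = (0, 0, 700000, 1300000));
```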

And when creating views: unless you want the views to be updatable (using triggers etc.), excluding or renaming the MI_PRINX key column will render the fetched datasets read-only in Pro.

All for now, but SQL 2008 is definitely the way to proceed, imho.

Best regards / Med venlig hilsen
Lars I. Nielsen
GIS & DB Integrator
GisPro

robert crossley

Jan 4, 2012, 1:41:00 PM1/4/12
to mapi...@googlegroups.com
Matt,

All of what you say is very possible with SQL Server / MapInfo. We've been
using SQL with MapInfo for almost 10 years in a few corporate environments,
(albeit with Spatialware until the last 3 years or so). Definitely the way
to go.

Views using more than one table are a very efficient way of making sense of
corporate data structures, and are easy to implement. Further, they are not
editable unless you put triggers on them to push edits back to the base
tables, so even if users have editing permissions on the database, they can't
edit through views. If you join them to data carrying a tag that can be used
to filter rows for particular users, you can open linked tables that filter
the data so each user only sees what they need. Remember that you can create
views that include spatial joins as well.
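One way to sketch that per-user filtering in T-SQL (table names hypothetical; SUSER_SNAME() returns the current login, which assumes each user connects with their own Windows or SQL Server login):

```sql
-- Hedged sketch: dbo.Assets rows carry a TeamCode tag, and
-- dbo.UserTeams maps SQL Server logins to teams.
CREATE VIEW dbo.vw_MyTeamAssets
AS
SELECT a.MI_PRINX, a.AssetName, a.Geom
FROM dbo.Assets AS a
JOIN dbo.UserTeams AS u
    ON  u.TeamCode  = a.TeamCode
    AND u.LoginName = SUSER_SNAME();   -- current login filters the rows
```

Every user opens the same view, but each sees only their team's rows, with no client-side configuration.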

Linked tables are quite efficient in refreshing the data, so perhaps look at
those as a way of maintaining the currency of the data. Not sure how that
would work if all users are using the same tables, but we have it so each
user has a local linked table, and refreshes that as needed either manually
or programmatically.

It definitely helps if you have some creative database administrators,
although it will take time to get them to understand spatial. There are a
few people that are on this list that have a fair bit of experience in the
area, and they have helped a lot over the years.

Rob.



Peter Horsbøll Møller

Jan 5, 2012, 2:22:19 AM1/5/12
to mapi...@googlegroups.com
Robert,
It's my experience that you can edit some of the data in a view:
not the parts that have been joined onto the base table, but the data
that comes from the base table itself can be edited from within
MapInfo Pro (and probably from other applications too).

It does, however, require that the primary index column is named MI_PRINX.

Peter Horsbøll Møller
Pitney Bowes Business Insight - MapInfo


2012/1/4 robert crossley <rob...@wotzhere.com>

Peter Horsbøll Møller

Jan 5, 2012, 2:27:14 AM1/5/12
to mapi...@googlegroups.com
Lars,
Have you played a bit with the enhancements we have made to Live with Cache in the last couple of versions?
In 9.5.2 we changed the way we manage and maintain the cache and made it more sticky.
In 10.5.2 we added support for MARS connections to SQL Server, which really speeds up access to the database, especially for Live access.

But I agree, you really need to consider your needs.
To me it's not a choice of using one or the other, but using the one that fits the purpose.
In some cases Live with Cache makes sense, in other cases Linked is the best solution.

Linked has been my favorite for a long time, but I've started to lean more towards Live with Cache.

Peter Horsbøll Møller
Pitney Bowes Business Insight - MapInfo


2012/1/4 Lars I. Nielsen (GisPro) <L...@gispro.dk>

Matt

Jan 5, 2012, 5:30:08 AM1/5/12
to MapInfo-L
Thank you all for your replies.

I just wanted to ask about the references to ‘local’ linked tables.
Corporately, our data is currently made available to MapInfo users
from a server (‘the V: drive’), and is in flat .tab format.

As I understand it, if I set these .tab files up to be linked tables
on the V: drive, they need to be refreshed (fetching the data from the
SQL Server database), to create a more-or-less flat .tab file (with
all the usual .map, .dat, .id files). Is this what everyone means
when referring to ‘local’ linked tables? Or do you mean that each
officer has a linked table .tab file on their local machine?

If it’s the latter, I don’t understand how that could work as an
enterprise solution and if it’s the former I’m still a little unsure
as to how we would keep the data ‘refreshed’.

I don’t want to have to explain to desktop users that they need to
remember to refresh a dataset every time they open it if they want it
to be current.

Ideally I’d like there to be one corporate .tab file for each
dataset. Of course some datasets are completely static which means we
don’t really have to worry about refreshing, but some change every now
and again and others will change constantly by pulling data from other
systems or just by their nature. If I try to refresh one of these
ever changing .tab files while a dozen people have got it open, what
happens?

Matt

Lars I. Nielsen (GisPro)

Jan 6, 2012, 1:19:45 AM1/6/12
to mapi...@googlegroups.com
Hi Matt,

Local linked tables are indeed locally cached tables with server linkage.

They work just like a normal file-based TAB, but have additional metadata to enable refreshing from the server, which is why they're dubbed "linked".

So yes, linked tables need to be refreshed on a regular basis, otherwise you're working with outdated information.

If that's a no-go, you might want to take your chances with "live" tables, which have no local cache but save and fetch data from the server on the fly, perhaps with just a small automatically managed cache ("live" with cache).


Best regards / Med venlig hilsen
Lars I. Nielsen
GIS & DB Integrator
GisPro

Matt

Jan 11, 2012, 5:06:57 AM1/11/12
to MapInfo-L
Thanks Lars,

Does anyone have any experience of how to keep linked tables up to date?

As I’ve already mentioned, the performance of live tables appears to
be less than ideal so linked tables would appear to be the way to go.
However, we could potentially end up with several hundred ‘flat’
datasets in SQL Server and any number of view-driven tab files.

If I have a view driven tab file which, say, displays the number of
properties in each administrative area (a simple spatial join between
the two datasets with a count in SQL Server), how do I keep this
refreshed? A live table for this sort of dataset would be far too
slow and the address dataset is dynamic to the point that the numbers
would change every day as new houses are built, old houses are
demolished and large houses are converted to apartments.

Obviously I could write a simple workspace which gets fired every
night to open the table, refresh it and close it. That sort of update
frequency would probably be sufficient for our requirements. But what
if I’ve got ten of these view-driven tab files? What if I’ve got
100? Or 500? How do you ensure that the data remains current in them
all?

Cheers

Peter Horsbøll Møller

Jan 12, 2012, 8:05:02 AM1/12/12
to mapi...@googlegroups.com
Matt,
There are a number of ways of doing so.

A basic workspace is one way - and very easy to set up. As you mention yourself, it might however be harder to maintain as the number of linked tables starts to grow.

Another way could be to create a VBS script that finds tables on your share and writes a workspace to refresh these. I have seen such an example. That does however require that you only pick up linked tables.

A third suggestion would be to write a tool (in MapBasic, for instance) that searches a certain folder (and subfolders if you like), opens each of the tables found and refreshes it, if it is a linked table.
I have such a tool in beta and I'm going to show it at the MUGUKI in London in two weeks.

None of the solutions above is completely bulletproof. The best one is probably the third. But even that can be stopped by a table that for some reason doesn't open, or that requires a password to be entered when trying to open it.
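The third approach might be sketched in MapBasic roughly as below. This is an untested outline, not Peter's actual tool: the statement and constant names (Server Refresh Linked Table, TAB_INFO_TYPE, TAB_TYPE_LINKED) should be checked against the MapBasic Reference for your version, and the list-file path is made up. It reads full TAB paths from a text file rather than scanning folders, since MapBasic has no built-in directory walker.

```
' Untested sketch: refresh every linked table listed (one full
' TAB path per line) in a plain-text control file.
Include "mapbasic.def"

Sub Main
    Dim sPath As String

    Open File "V:\Corporate\linked_tables.txt" For Input As #1
    Do While Not EOF(1)
        Line Input #1, sPath
        If sPath <> "" And FileExists(sPath) Then
            Open Table sPath As worktab
            ' Only refresh genuine linked tables
            If TableInfo(worktab, TAB_INFO_TYPE) = TAB_TYPE_LINKED Then
                Server Refresh Linked Table worktab
            End If
            Close Table worktab
        End If
    Loop
    Close File #1
End Sub
```

Compiled to an MBX and run from a scheduled task overnight, something like this scales to hundreds of linked tables by simply adding lines to the control file.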

But I'm sure these might give you some ideas.

Peter Horsbøll Møller
Pitney Bowes Business Insight - MapInfo


2012/1/11 Matt <matt_o...@yahoo.co.uk>