A perhaps even more spectacular solution is to use the billions of URIs
already out there to identify things. May I nudge your attention towards
DBpedia, for instance, which is Wikipedia turned into machine-readable
data for the very purpose you describe, with the added benefit that when
you look up a URI for a thing, you get mountains of machine-readable
data back about that thing (including all the Wikipedia info), plus
links to Wikipedia and interlinks between datasets (like GeoNames and
many, many more).
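To make "look up a URI, get data back" concrete, here is a minimal sketch of how such a lookup is typically requested: you dereference the resource's HTTP URI with an Accept header asking for an RDF serialisation (Turtle here) rather than HTML, and the server's content negotiation does the rest. The resource name "Berlin" is just an illustrative example; this builds the request without actually fetching it.

```python
from urllib.request import Request

def build_lookup_request(resource: str) -> Request:
    """Build an HTTP request for the RDF description of a DBpedia resource.

    The Accept header asks for Turtle (machine-readable RDF) instead of
    the human-readable HTML page for the same thing.
    """
    uri = f"https://dbpedia.org/resource/{resource}"
    return Request(uri, headers={"Accept": "text/turtle"})

req = build_lookup_request("Berlin")
print(req.full_url)              # https://dbpedia.org/resource/Berlin
print(req.get_header("Accept"))  # text/turtle
```

Passing the request to `urllib.request.urlopen` would perform the actual lookup; the same URI serves both people (HTML) and machines (RDF), which is the whole trick.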
I really have to point out that this is reinventing the wheel. It's
called "linked data", invented by Tim Berners-Lee (you may recognise the
name). It has been proven, deployed en masse, and already adopted by
major governments and institutions around the world; it is the
definition of Web 3.0 / the web of data; it is rolled out in RDFa all
over the place, including in the next Drupal; it has a huge following;
OpenGraph is linked data (yes, Open Graph); and, as mentioned, there are
URIs for virtually everything on the planet, existing and ready to use,
plus a huge interlinked, distributed web of linked data that now runs
into many billions of bits of information.
Linked Data:
- Use URIs as names for things
- Use HTTP URIs so that people can look up those names.
- When someone looks up a URI, provide useful information, using the
standards (RDF, SPARQL)
- Include links to other URIs, so that they can discover more things.
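The four principles above can be seen in the data itself. Below is a sketch of the kind of statements a URI lookup returns, written as simple subject–predicate–object triples in N-Triples form: a human-readable label, and an owl:sameAs link out to GeoNames, which is exactly the "include links to other URIs" step. The two triples are illustrative, not a live DBpedia response, and the parser handles only this simple one-statement-per-line shape.

```python
# Illustrative triples, in the spirit of what DBpedia returns for a resource.
sample = """\
<http://dbpedia.org/resource/Berlin> <http://www.w3.org/2000/01/rdf-schema#label> "Berlin" .
<http://dbpedia.org/resource/Berlin> <http://www.w3.org/2002/07/owl#sameAs> <http://sws.geonames.org/2950159/> .
"""

def parse_ntriples(text):
    """Split simple one-line N-Triples statements into (s, p, o) tuples."""
    triples = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        s, p, o = line.rstrip(" .").split(" ", 2)
        # Strip the <> around URIs and the quotes around literals.
        triples.append((s.strip("<>"), p.strip("<>"), o.strip('"<>')))
    return triples

for s, p, o in parse_ntriples(sample):
    print(p, "->", o)
```

Following the sameAs object URI would land you in the GeoNames dataset, where the same dance repeats: look up the URI, get data, discover more links.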
Sound familiar?
Sincerely, guys: jump in. What you are describing is already a reality,
ready for you whenever you want; every topic thus far on openlike has
pretty much pointed to, or described, linked data.
Best,
Nathan