
Mapping the Web


Brian Combs

Feb 22, 1994, 6:23:16 PM
Ok, imagine that one day while surfing through the Web, you find
some really cool Web Server. Like an idiot, you don't think to
put it in your Hot List. Two days later you have no recollection
of how you got there. You know where the company, university, or
whatever is physically located, but you have no idea what the URL
is or where the link was that took you there. The letters "S.O.L."
come to mind...

It doesn't have to be that way, however. It would be very simple
to create a geographical hierarchy of Web servers if we all got
together and did it. Anyone who has used Gopher has an idea what
this would look like (heck, it could even include gopher and/or
ftp servers; we could map the whole Internet! Ok, I'll stop dream-
ing).

I have already begun the process of doing this by creating a list
of Austin, Texas-based Web servers. The URL is:

http://www.quadralay.com/www/Austin/WebServers.html

I would be willing to create a list of pointers to lists of Web
servers in other cities in Texas, and I imagine I could handle
doing a United States list, but I don't really want to do any
others (I want this to be a group project).

I would also be willing to help coordinate things.

I'm not suggesting that this should replace subject-based lists,
but rather that it should complement them.

I would love to hear your thoughts and comments about this either
in this newsgroup or via e-mail. I will, however, be in L.A. at
Documation '94 on Wednesday and Thursday, so unless you catch me
before I leave tonight (Tuesday), I won't be able to respond until
I get back to the office on Friday.


Brian Combs
Co-founder, Austin WWW User's Group
co...@quadralay.com

--
**********************************************************************
* Brian Combs * Tel: 512-346-9199 Fax: 512-794-9997 *
* Quadralay Corporation * FTP Address: ftp.quadralay.com *
* combs @ quadralay.com * WWW Server: www.quadralay.com *
**********************************************************************
"Software Development Tools That Work The Way You Do!"


Jon Zeeff

Feb 22, 1994, 9:47:28 PM
Is anyone doing a Mosaic service where one could search all the <TITLE>s
of all known http servers for specified keywords?

N F Drakos

Feb 23, 1994, 4:07:57 AM
Jon Zeeff (ze...@zip.eecs.umich.edu) wrote:
: Is anyone doing a Mosaic service where one could search all the <TITLE>s
: of all known http servers for specified keywords?

Yes. It is called the "JumpStation" at
http://www.stir.ac.uk/jsbin/js


Nikos.

--
Nikos Drakos
Computer Based Learning Unit ni...@cbl.leeds.ac.uk
University of Leeds http://cbl.leeds.ac.uk/nikos/personal.html

Dennis Barnden

Feb 23, 1994, 4:17:41 AM
In article <2keg40$5...@zip.eecs.umich.edu>, ze...@zip.eecs.umich.edu (Jon
Zeeff) wrote:

> Is anyone doing a Mosaic service where one could search all the <TITLE>s
> of all known http servers for specified keywords?

Is this not what The Jump Station does?

http://www.stir.ac.uk/jsbin/js


--
Dennis Barnden dbar...@uniwa.uwa.edu.au
"Ray was not dead, he had only gone to the dentist." G.Keillor. "WLT"

Brandon S. Plewe

Feb 23, 1994, 10:34:35 AM

>It doesn't have to be that way, however. It would be very simply
>to create a geographical hierarchy of Web servers if we all got
>together and did it. Anyone who has used Gopher has an idea what
>this would look like (heck, it could even include gopher and/or
>ftp servers; we could map the whole Internet! Ok, I'll stop dream-
>ing).

I think it's a great idea; in fact so do all the rest of the people
who have been contributing to just such a system for the past 3 months!
:-) I believe the folks in Norway have to be credited with the
original idea. The local-level indexes, distributed around the
world, are catalogued in my Virtual Tourist service:
http://wings.buffalo.edu/world

We are still nowhere near completion, however, so we can use all the
help we can get (i.e. a map/list of Texas sites). Make a map/list
of your state or country, and let me know! ($100 to whoever has the
guts to try California :-)

So far there are no standards for the maps, so you're welcome to
experiment. I would like to see the maps be of high quality (in
accuracy or artistry), though.

CERN has also tried to keep a geographically-sorted list of WWW sites
since the beginning. However, it is woefully out of date.

I agree that this is not as generally useful as subject-oriented indexes
for finding information, but still good for a comprehensive catalog.

One suggestion for contributors--make sure your e-mail address is on there
for people to submit additions/corrections.

Brandon Plewe
Assistant Coordinator, Campus-Wide Information Services
SUNY/Buffalo
pl...@acsu.buffalo.edu

Mr Jonathon Fletcher

Feb 23, 1994, 3:34:10 PM
In article <dbarnden-2...@mac5.dentistry.uwa.edu.au>, you write:
|> Is anyone doing a Mosaic service where one could search all the <TITLE>s
|> of all known http servers for specified keywords?
|
|Is this not what The Jump Station does?
|
| http://www.stir.ac.uk/jsbin/js
|

Yes, this is what the JumpStation does. It does not contain ALL the
servers, nor is it ever likely to - the Web is too big. However, for a
first attempt, it appears to be usable.

-Jon

PS: A note on robots, because they are so topical at the moment. The
current JumpStation Robot has been retired. It will not run again.
Version II is being written as fast as I can type. It will take note
of the Guidelines for Robots (see:

http://web.nexor.co.uk/mak/doc/robots/robots.html

for more detail), in addition to a couple of my own. I hope to have it
running again sometime in the next couple of weeks.

--
Jonathon Fletcher, Information Services, Stirling University. (7273 int.)
j.fle...@stirling.ac.uk (X400: "/S=jf1/O=stirling/PRMD=uk.ac/C=gb/")
WWW Home Page: http://www.stir.ac.uk/~jf1

Jeffrey R. Harrow

Feb 23, 1994, 8:00:15 PM
What is REALLY needed (IMHO), is a graphical mapping tool that
automatically keeps the topology current, allows users to navigate at
will, and provides multiple dimensions --- such as geographic, subject
matter, etc...

Since we're dreaming...

Jeff

Gary R Wright

Feb 23, 1994, 11:04:17 PM
In article <CLooL...@acsu.buffalo.edu>,
Brandon S. Plewe <pl...@acsu.buffalo.edu> wrote:
>>It doesn't have to be that way, however. It would be very simply
>>to create a geographical hierarchy of Web servers if we all got
>>together and did it. Anyone who has used Gopher has an idea what
>>this would look like (heck, it could even include gopher and/or
>>ftp servers; we could map the whole Internet! Ok, I'll stop dream-
>>ing).

How about creating a standard html document that included basic
server information including latitude and longitude? The documents
could be collected by a robot and converted into a usable map.
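
To make the suggestion concrete -- purely as an illustration, since the
file name, the comment field names, and the decimal-degree coordinates
below are assumptions rather than any agreed standard -- such a
server-information page could be produced by something as small as:

#!/usr/local/bin/perl
# Hypothetical sketch: write a small "server information" page that a
# map-building robot could fetch and parse.  The file name, field names
# and coordinates are placeholders, not a standard.
$host = "www.example.edu";     # placeholder host name
$lat  =  42.35;                # placeholder latitude  (degrees north)
$long = -71.06;                # placeholder longitude (degrees east)
open(OUT, ">server-info.html") || die "server-info.html: $!";
print OUT <<"EOF";
<TITLE>Server information for $host</TITLE>
<!-- LATITUDE: $lat -->
<!-- LONGITUDE: $long -->
<H1>$host</H1>
This machine is physically located at latitude $lat, longitude $long.
EOF
close(OUT);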

How about a standard naming system for html documents? Just as you
expect to find programs in /usr/bin on a Unix system, there should
be a similar convention for http servers. I'm not claiming the entire
namespace has to be standardized, but just a part of it.

Nelson Minar

Feb 24, 1994, 3:31:44 AM
Why map the web geographically? One of the nicest things about it is
that I'm not at all aware of where, physically, the documents I'm
fetching are from (at least, until I try squeezing something through a
satellite link. Bleah.) The only advantage of mapping by geography is
that it's fairly easy to determine.

Much better would be to map the web conceptually. Various people are
trying to do this, with varying degrees of success. Note that
"physical location" is one conceptual grouping, especially if you're
looking for map servers or local area libraries.

I'd like to see a graph of the Web, or maybe just Web sites. If you
know all the links in the Web already (presumably, by roaming the web)
there is software to make a picture out of it.

__ http://www.reed.edu/~nelson/
nel...@reed.edu \/ Nothing is true; everything is permitted

Jon Zeeff

Feb 24, 1994, 10:37:37 AM
I'm not sure what relevance physical location has in this virtual world, but
I'd be willing to include latitude and longitude in home pages if there
were a standard for it.

Gary R Wright

Feb 24, 1994, 11:00:18 AM
In article <2kholg$c...@scratchy.reed.edu>,
Nelson Minar <nel...@reed.edu> wrote:
>Why map the web geographically? One of the nicest things about it is
>that I'm not at all aware of where, physically, the documents I'm
>fetching are from (at least, until I try squeezing something through a
>satellite link. Bleah.) The only advantage of mapping by geography is
>that it's fairly easy to determine.

One of the weaknesses with gopher, www, nested menus, etc. is that the
user has no way of constructing a visual or spatial model of the
gopherspace, the web, or the menus. Humans are very good at
manipulating visual/spatial information, and almost everyone already has
a workable geographical model in their heads!

For example, when I go to the local library to look at the books about
the Internet, I find the books by remembering that they are upstairs
and against the far left wall. I don't remember that to find the
books I enter the building, turn left to the reference section, go up one
flight of stairs, straight to the next flight of stairs, up those stairs,
around the railing, down the walkway and down the aisle to the books on
the left.

I have a "map" of the library in my head. A blind person must remember
all those details--that demonstrates the fantastic visualization
power of the brain--it doesn't argue for us to be satisfied with our
current tools.

Other (non-geographic) models may also be appropriate, but they would
have to be learned from scratch. Maybe we need to start "designing" a
map of cyberspace--just like early explorers mapping an unknown
territory. The problem with cyberspace is that the relationships are
so fluid that it is hard to map them in any static manner.

Marc VanHeyningen

Feb 24, 1994, 11:23:43 AM
Thus said ze...@zip.eecs.umich.edu (Jon Zeeff):

>I'm not sure what relevance physical location is in this virtual world, but
>I'd be willing to include latitude and longitude in home pages if there
>were a standard for it.

This also would permit cheesy location-oriented features (e.g. the
spinning globe in Mosaic could have a little blinking light indicating
the location you're connected to.) More stuff to impress the easily
impressed. :-)
--
Marc VanHeyningen mvan...@cs.indiana.edu MIME, RIPEM & HTTP spoken here

Johnson M J

Feb 24, 1994, 1:23:27 PM
My own idea for mapping the web is that everyone makes their own little
map, and then you add together everyone's maps. The two problems are
a) how do you express your map, and b) how do you scale up to _everyone's_
maps?

I am trying out my solution to the first problem - a subject space. The
subject space is URLs stored by subjects (keywords) - so you can view it
as a keyword-searchable database, but when a URL is stored under two
subjects you can consider those subjects to be related and indeed to form
a composite subject - a simple semantic net? Subjects can also have child
subjects - so you can have a directed graph or tree view. This means that
people can make whatever sort of maps they like, and existing maps can be
used too. How it can scale is what I need to experiment with.

But I hear you asking, why will anyone want to make their own map?
Well, I'll soon finish an applet that replaces the Mosaic hotlist with an
easy-to-use interface to my subject space. The pitch is: is your hotlist
long and cumbersome? I'm sure you get the idea.

I got the idea by looking at how to combine searching and browsing, and at
what the motivation is for people to spend their time cataloging the
Internet (as I think automatic methods just aren't good enough!)

If you are interested, mail me and I'll keep you informed of progress. If anyone wants to help by evaluating the hotlist replacement, or if you just want to get your hands on it before everyone else, mail me soon.
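
As a very rough sketch of the subject-space idea described above (the
subjects and the storage layout below are illustrative assumptions, not
the author's actual implementation; the URLs are simply ones mentioned
elsewhere in this thread):

#!/usr/local/bin/perl
# Hypothetical sketch of a "subject space": URLs stored under subject
# keywords; two subjects that share a URL are treated as related.
%space = (
  'austin',  'http://www.quadralay.com/www/Austin/WebServers.html',
  'maps',    'http://wings.buffalo.edu/world'
           . ' http://www.quadralay.com/www/Austin/WebServers.html',
  'indexes', 'http://www.stir.ac.uk/jsbin/js http://wings.buffalo.edu/world',
);

# Keyword search: list the URLs stored under one subject.
print "maps: ", $space{'maps'}, "\n";

# Related subjects: any two subjects whose URL lists overlap.
foreach $s1 (keys %space) {
  foreach $s2 (keys %space) {
    next if $s1 ge $s2;                      # look at each pair once
    foreach $url (split(' ', $space{$s1})) {
      if (index(" $space{$s2} ", " $url ") >= 0) {
        print "related: $s1 <-> $s2 (share $url)\n";
        last;
      }
    }
  }
}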

---


Mark Johnson (m.j.j...@qmw.ac.uk) __o
Department of Electronic Engineering \<,
Queen Mary and Westfield College, U. of London _______________()/ ()___

Paul J. J. Harrington

Feb 24, 1994, 3:12:05 PM
In <CLpEs...@mv.mv.com> har...@mv.mv.com (Jeffrey R. Harrow) writes:

>What is REALLY needed (IMHO), is a graphical mapping tool that
>automatically keeps the topology current, allows users to navigate at
>will, and provides multiple dimensions --- such as geographic, subject
>matter, etc...

http://www.dsg.cs.tcd.ie:1969/afc_draft.html

is a _very_ rough start at automated maps. It does not work very well at
the moment since I have not put in any work on scaling and other
essentials. Also, you need sound, zoom in/out, etc.

I have an external viewer which can do a bit more but I still have to
build the graphs via the AT&T mailserver so interactive response is
not all that great.

>Since we're dreaming...

dream ... and hack!

>Jeff
pjjH

--
Paul Harrington, phrr...@dsg.cs.tcd.ie, phrr...@gallimaufry.ie +353 88 599673
Dept. Computer Sci., Trinity College, University of Dublin, Dublin 2, Ireland.

Miles O'Neal

Feb 24, 1994, 11:03:29 PM
gwr...@world.std.com (Gary R Wright) writes:
>One of the weaknesses with gopher, www, nested menus, etc. is that the
>user has no way of constructing a visual or spacial model of the
>gopherspace, the web, or the menus. Humans are very good at
>manipulating visual/spacial information and almost everyone already has
>a workable geographical model in their heads already!

I think the problem is more that everyone has their own model, and
most people's model is totally ad hoc, so that it might as well
be random.

>Other (non-geographic) models may also be appropriate but they would
>have to be learned from scratch. Maybe we need to start "designing" a
>map of cyberspace--just like early explorers mappping an unknown
>terriotory. The problem with cyberspace is that the relationships are
>so fluid it is hard to map them in any static manner.

Or, the problem is that there are no standards. I can
think of several topologies - they simply need some
sort of visualization paradigm, as it were, to go along
with them.

For instance, the information can be thought of as a
hierarchy - say, a tree. I know that a certain bird,
say, EXUG, lives near the top, in the X section. You,
on the other hand, might know that it lives in the lower
westhand corner of the infocube, in the user group warren.
Someone else might see it as being in the social services
part of the government buildings downtown.

Once we have the tools to maintain the database, we need
the tools to let each one visualize as they desire. Granted,
a few common paradigms will emerge and (or be developed to)
really drive things for the masses. After all, for your
average PC user, everyone having their own paradigm is
probably worse than the current state of affairs (which is
totally higgledy-piggledy, in the words of Michael Binkley).

Eventually, tho, I think the ideal software would take care
of the details for us, and let each of us see things as we
wished. In Cyberspace, I can hand you a flower, while you
receive a banknote from me, and Milo sees me giving you a
Nutshell handbook. The software does it.

Yes, this is a long way off from your average newsgroup
being read in your average newsreader, but a smidgen
closer in Mosaic.

-Miles

Brian Behlendorf

Feb 24, 1994, 11:18:43 PM
>In article <CLooL...@acsu.buffalo.edu>,
>Brandon S. Plewe <pl...@acsu.buffalo.edu> wrote:
>>>It doesn't have to be that way, however. It would be very simply
>>>to create a geographical hierarchy of Web servers if we all got
>>>together and did it. Anyone who has used Gopher has an idea what
>>>this would look like (heck, it could even include gopher and/or
>>>ftp servers; we could map the whole Internet! Ok, I'll stop dream-
>>>ing).

What would make this really easy would be if the various gopher and
web servers could implement the global coordinates of the machine they're
running on as global variables that could be queried. Web-walkers
could extract these, and people wishing to build maps of areas wouldn't
have to try and "co-ordinate" anything, just query a bunch of machines.

Brian

Jamie Zawinski

Feb 25, 1994, 6:47:56 PM
In comp.infosystems.www Greg O'Rear <jgo.s...@mhs.unc.edu> wrote:

>
> ze...@zip.eecs.umich.edu (Jon Zeeff) says:
>> I'd be willing to include latitude and longitude in home pages
>
> Don't forget altitude...let's make this a 3-D map. :-)

That's computable given lat/long.

(Check out xearth on ftp.x.org. It's cool.)

-- Jamie

Miles O'Neal

Feb 26, 1994, 1:08:07 AM
Cato.A...@ntdh.no (antonsen) writes:
>
>What a great idea! If everybody could include a <LOCATION ????> in their html-document,
>then the www-client could extract that information and show the location on a map!!!!
>Of course it had to be defined as an standard...

Easy enough - use what the uucp maps use. Just make
it right! 8^)

How about <!--GEOPOSITION: posyition_in_uu_format--> ?
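
As a sketch of how a map-building robot might then pull the position back
out of a fetched page (purely for illustration -- a plain decimal-degrees
"lat long" payload is assumed here rather than the real uucp map format):

#!/usr/local/bin/perl
# Hypothetical sketch: scan HTML on stdin for a GEOPOSITION comment of
# the form proposed above and print the coordinates it carries.  The
# decimal-degrees payload is an assumption made for this example.
while (<>) {
  if (/<!--\s*GEOPOSITION:\s*([-\d.]+)\s+([-\d.]+)\s*-->/) {
    ($lat, $long) = ($1, $2);
    print "latitude $lat, longitude $long\n";
    last;
  }
}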

-Miles

James Eric Tilton

Feb 26, 1994, 1:17:25 PM
In article <CLpEs...@mv.mv.com>, Jeffrey R. Harrow <har...@mv.mv.com> wrote:
>What is REALLY needed (IMHO), is a graphical mapping tool that
>automatically keeps the topology current, allows users to navigate at
>will, and provides multiple dimensions --- such as geographic, subject
>matter, etc...

Ok, I've got a simpler solution. This is something that occurred to me
way back in July, when I first saw a beta of Mosaic (and went, "Wow!").

What I would love to have is a map not of the Web, but of where I've
been. Kind of like the auto-mapping features of an Infocom-esque
game? I'd like a map that describes the nodes I've been to in this
session, and -- this is the kicker -- indicates where there are nodes
I haven't yet explored. You could tie this into the Mosaic global
history, and there you'd be.

This way, instead of simply the linear mapping of "history", you've got
something more akin to the threading of Usenet. You could quickly visualize
how much of the current eddy of webspace you've seen, or hop between different
paths of exploration. (Sure, you could do this by cloning windows, but
this'd be better for the visually oriented :).
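
A minimal sketch of the bookkeeping this would need, assuming (purely for
illustration) a session log of "page linked-url" pairs rather than
Mosaic's actual global history format:

#!/usr/local/bin/perl
# Hypothetical sketch: read "page linked-url" pairs, one per line, and
# report links that were seen on some page but never themselves visited.
while (<>) {
  ($page, $link) = split;
  $visited{$page} = 1;      # we were on this page
  $seen{$link} = 1;         # we saw a link pointing here
}
foreach $url (sort keys %seen) {
  print "unexplored: $url\n" unless $visited{$url};
}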

-et

/ (James) Eric Tilton, Student AND Student Liaison, WITS \
\ Class of '95 - CS/Hist -- Internet - jti...@willamette.edu /
<a href="http://www.willamette.edu/~jtilton/">ObHyPlan!</a>, chock fulla
<a href="http://www.willamette.edu/~jtilton/whatsnew.html">Fun Stuff!</a>


Chris Beaumont

Feb 26, 1994, 2:39:18 AM
In article <1994Feb24....@dsg.cs.tcd.ie> phrr...@dsg.cs.tcd.ie (Paul J. J. Harrington) writes:
>In <CLpEs...@mv.mv.com> har...@mv.mv.com (Jeffrey R. Harrow) writes:
>
>>What is REALLY needed (IMHO), is a graphical mapping tool that
>>automatically keeps the topology current, allows users to navigate at
>>will, and provides multiple dimensions --- such as geographic, subject
>>matter, etc... ^^^^^^^
> ^^^^^^
>
I have a friend who is working on a secretive program (an external viewer
for Mosaic that uses 3D hardware..) that may do subject-travel.. He claims
that ultimately it will allow exactly this.. with various 'styles' for
different data-types.. (as well as an infinitely extensible virtual
web-space that can be represented visually..) He's already worked with
object-oriented VR operating systems quite a bit, so I'm taking this
seriously!
Chris.

Greg O'Rear

Feb 25, 1994, 6:03:32 AM
ze...@zip.eecs.umich.edu (Jon Zeeff) says:
>I'd be willing to include latitude and longitude in home pages

Don't forget altitude...let's make this a 3-D map. :-)

Brian Combs

Feb 25, 1994, 10:38:14 AM
In article <2kholg$c...@scratchy.reed.edu>,
Nelson Minar <nel...@reed.edu> wrote:
>Why map the web geographically? One of the nicest things about it is
>that I'm not at all aware of where, physically, the documents I'm
>fetching are from (at least, until I try squeezing something through a
>satellite link. Bleah.) The only advantage of mapping by geography is
>that it's fairly easy to determine.
>
>Much better would be to map the web conceptually. Various people are
>trying to do this, with varying degrees of success. Note that
>"physical location" is one conceptual grouping, especially if you're
>looking for map servers or local area libraries.

The downside of this is that you might remember where a Web server
was (say, U.C. Berkeley, for example), but you might not remember how
you got there or what category you found it under. As the Web gets
bigger and bigger, this will become more and more of a problem.

What I am suggesting is not a replacement, but rather an enhancement
to current topic-based lists.

>I'd like to see a graph of the Web, or maybe just Web sites. If you
>know all the links in the Web already (presumably, by roaming the web)
>there is software to make a picture out of it.

Hmmm... That could be interesting.


Brian Combs
AWWWUG

Steinar Bang

Feb 25, 1994, 3:38:31 AM
>>>>> "MVH" == Marc VanHeyningen <mvan...@cs.indiana.edu> writes:

MVH> This also would permit cheesy location-oriented features (e.g.
MVH> the spinning globe in Mosaic could have a little blinking light
MVH> indicating the location you're connected to.) More stuff to
MVH> impress the easily impressed. :-)

Hey! *Hey*! Don't y'all go on stealing my ideas y'hear! I've already
suggested that...:-)

Fled Fairlane

Feb 27, 1994, 11:18:44 AM

Why not make it an internet service with a well-known port number- this would
allow other programs than those that use HTML to retrieve the lat/long
of the place in question. Seems it would be trivial to then write a gateway
for the Web to read this information.

Could do a lot of neat stuff with something like this...
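
As a sketch of how small such a service could be (everything here is an
assumption: the port would have to be agreed on, and the reply format is
made up), a host could run something like this out of inetd, so that
whatever connects to the chosen port simply gets the coordinates back:

#!/usr/local/bin/perl
# Hypothetical location service, meant to be started from inetd: print
# the host's (made-up) coordinates to the connecting client and exit.
$| = 1;
print "LAT 42.35 LONG -71.06 ALT 30m\r\n";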

=O.| fled
=O

antonsen

Feb 25, 1994, 3:10:06 AM

What a great idea! If everybody could include a <LOCATION ????> in their
html-document, then the www-client could extract that information and
show the location on a map!!!!

Of course, it would have to be defined as a standard...

Cato Antonsen
System operator


Christopher Davis

Feb 28, 1994, 7:17:10 PM
FF> == Fled Fairlane <fl...@watarts.uwaterloo.ca>

FF> Why not make it an internet service with a well-known port number-
FF> this would allow other programs than those that use HTML to retrieve
FF> the lat/long of the place in question. Seems it would be trivial to
FF> then write a gateway for the Web to read this information.

There is work going on to define a DNS record for lat/long/alt
information. The DNS is the proper place for information about hosts,
after all. (This way you can define a location for a machine that can't
run a "lat/long service", like a PC.)
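
For concreteness, such an entry would sit in the zone file alongside a
host's other records. A sketch of what it might look like (the host name
and coordinates are made up, and the record name and syntax shown are
assumptions -- the real format was still being defined at the time):

; hypothetical zone-file entry carrying a host's position
; (latitude deg min sec, longitude deg min sec, then altitude)
www.example.edu.   IN   LOC   42 21 54 N   71 06 18 W   30m

A client or a map-building robot could then get the location with an
ordinary DNS lookup instead of fetching and parsing a page.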
--
* Christopher Davis * <c...@kei.com> * (was <c...@eff.org>) * MIME * [CKD1] *
"It's 106 ms to Chicago, we've got a full disk of GIFs, half a meg of
hypertext, it's dark, and we're wearing sunglasses." "Click it."

TORSTEN EYMANN

Mar 1, 1994, 6:50:06 AM
Try
http://www.cs.tu-berlin.de/germ/server_liste.html
for a list of German servers, sorted by city.

Stewart Clamen

Mar 2, 1994, 12:01:12 AM
In article <JWZ.94Fe...@thalidomide.lucid.com> j...@lucid.com (Jamie Zawinski) writes:

In comp.infosystems.www Greg O'Rear <jgo.s...@mhs.unc.edu> wrote:
>
> ze...@zip.eecs.umich.edu (Jon Zeeff) says:
>> I'd be willing to include latitude and longitude in home pages
>
> Don't forget altitude...let's make this a 3-D map. :-)

That's computable given lat/long.

What? Doesn't building storey matter?

:-)

--
Stewart M. Clamen Internet: cla...@cs.cmu.edu
School of Computer Science UUCP: uunet!"cla...@cs.cmu.edu"
Carnegie Mellon University Phone: +1 412 268 2145
5000 Forbes Avenue Fax: +1 412 681 5739
Pittsburgh, PA 15213-3891, USA (I accept MIME,HTML,Hyperbole,PGP)

Gary Benson

Mar 2, 1994, 10:29:37 AM


Oh come on. Given LAT and LONG, shouldn't the concrete reference syntax give
the altitude?

--
Gary Benson-_-_-_-_...@tc.fluke.com_-_-_-_-_-_-_-_-_-_-_-_-_-_-

It would be thought a hard government that should tax its people
one tenth part. -Benjamin Franklin

Gary Benson

Mar 2, 1994, 10:35:42 AM
In article <jgo.systems...@mhs.unc.edu> jgo.s...@mhs.unc.edu (Greg O'Rear) writes:

Oh come on. Given LAT and LONG, shouldn't the concrete reference syntax
provide the altitude?

Bjoern Stabell

Mar 2, 1994, 12:34:03 PM
In article <CM1n1...@tc.fluke.COM>, i...@tc.fluke.COM (Gary Benson) writes:

] In article <jgo.systems...@mhs.unc.edu> jgo.s...@mhs.unc.edu (Greg O'Rear) writes:
] >ze...@zip.eecs.umich.edu (Jon Zeeff) says:
] >>I'd be willing to include latitude and longitude in home pages
] >
] >Don't forget altitude...let's make this a 3-D map. :-)
]
] Oh come on. Given LAT and LONG, shouldn't the concrete reference syntax give
] the altitude?

Well, to a certain degree, but it would be very hard to extract
this information. Also, what about the case where there are
perhaps 5 WWW servers in a place like the World Trade Centre? :)


Bye,
--
Bjørn Stabell
(bjo...@staff.cs.uit.no)

Jarle Brinchmann

Mar 2, 1994, 12:37:55 PM

In article <CM1nB...@tc.fluke.COM>, i...@tc.fluke.COM (Gary Benson) writes:
|>In article <jgo.systems...@mhs.unc.edu> jgo.s...@mhs.unc.edu (Greg O'Rear) writes:
|>>ze...@zip.eecs.umich.edu (Jon Zeeff) says:
|>>>I'd be willing to include latitude and longitude in home pages
|>>
|>>Don't forget altitude...let's make this a 3-D map. :-)
|>
|>Oh come on. Give LAT and LONG, shouldn't the concrete reference syntax
|>provide the altitude?

No, latitude and longitude suffice to specify a location on the 2-D
earth (i.e. on the imagined spherical earth); if you want to report how
high above sea level you are, you have to specify the altitude too.

Jarle.

---------------------------------------------------------------------
Nuke the Whales ! | Jarle Brinchmann,
| Email: Jarle.Br...@astro.uio.no
International Krill Union. | or : jar...@astro.uio.no

William J Rehm

Mar 3, 1994, 1:15:11 PM
On 24 Feb 94 16:00:18 GMT, Gary R Wright wrote:
: One of the weaknesses with gopher, www, nested menus, etc. is that the
: user has no way of constructing a visual or spacial model of the
: gopherspace, the web, or the menus. Humans are very good at
: manipulating visual/spacial information and almost everyone already has
: a workable geographical model in their heads already!

: For example, when I go to the local library to look at the books about
: the Internet. I find the books by remembering that they are upstairs
: and against the far left wall. I don't remember that to find the
: books I enter the building turn left to the reference section, up one
: flight of stairs, straight to next flight of stairs, up those stairs,
: around the railing, down the walkway and down the aisle to the books on
: the left.

: Other (non-geographic) models may also be appropriate but they would
: have to be learned from scratch. Maybe we need to start "designing" a
: map of cyberspace--just like early explorers mappping an unknown
: terriotory. The problem with cyberspace is that the relationships are
: so fluid it is hard to map them in any static manner.

Actually, early maps worked exactly the way you describe the "long way
around". Cartographers didn't have the perspective required to produce
maps like we use today.

I don't really see any method to accomplish a meaningful mapping of the
Web. It's too dynamic. An alternative approach is to develop some set of
"blazes" that robots/agents could use to build "trails" for their owners.
Something analogous to the Dewey system or the new one (I forget its
name) that libraries use. Each URL would have these blazes/keys
associated with it, allowing agents applying boolean constraints to
search out the right paths.

Just thinking "out loud". Feel free to point out any problems with this
approach, obvious or not.

Bill --
_______oOOo________oOOo_______________________________________________
| Bill Rehm
|"Somedays, it's all I can do just to hold on." wjr...@pitt.edu

Brian Combs

Mar 3, 1994, 3:33:23 PM
In article <16...@blue.cis.pitt.edu>, William J Rehm <wjr...@pitt.edu> wrote:
>
>Actually, early maps worked exactly the way you describe the "long way
>around". Cartographers didn't have the perspective required to produce
>maps like we use today.

And early artists painted on cave walls. Should all artists do that
now? ;-)

>I don't really see any method to accomplish a meaningful mapping of the
>Web. It's too dynamic.

I disagree. I have already set up a list of Austin-based Web, Gopher
and FTP servers. The URL is:

http://www.quadralay.com/www/Austin/InfoServers.html

From what I can tell, the Web and Gopher lists are fairly complete
at this moment. Granted, there are *lots* of FTP servers that aren't
on my list, but I've asked people on the austin.general newsgroup to
send me theirs.

How much time did this take? Not much. I wrote four HTML pages and
once in a while I add new info to it.

I'm trying to persuade people in other Texas cities to make similar
lists and then I will make a page with pointers to those lists.

The trick is that the effort of building and maintaining these lists
is spread out over numerous people. No one person has to do very
much.

>An alternative approach is to develop some set of
>"blazes" that robots/agents could use to build "trails" for their owners.
>Something analogous to the Dewey system or the new one ( I forget it's
>name) that libraries use. Each URL would have these blazes/keys
>associated with it, allowing agents applying boolean constraints to
>search out the right paths.

I'm not too much of a programmer, but wouldn't that entail redoing
the way that servers are currently addressed?

>Just thinking "out loud". Feel free to point out any problems with this
>approach, obvious or not.

Me too.

Brian Combs
Co-founder, Austin WWW User's Group

--
**********************************************************************
* Brian Combs * Tel: 512-346-9199 Fax: 512-346-8990 *

Paula Burch

Mar 8, 1994, 1:37:44 PM

Jeffrey R. Harrow <har...@mv.mv.com> wrote:
>What is REALLY needed (IMHO), is a graphical mapping tool that
>automatically keeps the topology current, allows users to navigate at
>will, and provides multiple dimensions --- such as geographic, subject
>matter, etc...

I have a simpler question. Does anyone have a tool that will construct
a map of local WWW files? It would be handy to be able to construct a
map showing which files link to which other files, especially if, say,
several people are working on some WWW files and the first person was
less than perfectly logical in setting things up. I'm not talking about
anything interactive, just a snapshot of the layout at one time.

This seems simple and useful enough that surely someone's already
written it....

________________________________________________________________________
Paula E. Burch, Ph.D. Molecular Biology Computational Resource
Baylor College of Medicine internet: pbu...@bcm.tmc.edu
Houston, Texas 77030 phone: (713)798-6023 fax: (713)790-1275

Michael Richardson

Mar 8, 1994, 5:28:26 PM
In article <2liglo$e...@gazette.bcm.tmc.edu>,
Paula Burch <pbu...@cmb.bcm.tmc.edu> wrote:
>I have a simpler question. Does anyone have a tool that will construct
>a map of local WWW files? It would be handy to be able to construct a

Yes... sort of. It winds up producing a FIG file, and leaves you to
edit the file into a reasonably understood mess. (Hint... set "smart
links" to "move")

The other problem is that the FIG files that result tend to be
36"x22" (at least mine was after editing). While it is easy to produce
multipage PostScript, this tends to crash my printer. I've had success
writing PPM via ghostscript, and printing on a (colour) DeskJet 500
via GIF+CorelDraw. [There is a better way, though.]
I just tried using the laserjet2 driver (the printer does HPII
emulation too), with no luck. I've also been working on a "fig" driver
for fig2dev that would clip the figure to certain boundaries. This
doesn't quite work right.

I used "build-map" to make a list, edited that list to remove a
bunch of things I didn't want, then ran "getlocallinks" to remove
external references, and then ran "links2fig" to produce the graph.
I also threw in an adaptation of "gophertree" that I called
"gopher2fig" which does something similar for a gopher tree.

#!/bin/sh
# This is a shell archive (produced by shar 3.50)
# To extract the files from this archive, save it to a file, remove
# everything above the "!/bin/sh" line above, and type "sh file_name".
#
# made 03/08/1994 22:28 UTC by mcr@spiff
# Source directory /files2/home/ecology/mcr/CE/ConceptualMap/dist
#
# existing files will NOT be overwritten unless -c is specified
#
# This shar contains:
# length mode name
# ------ ---------- ------------------------------------------
# 701 -rwxrwxr-x build-map
# 1138 -rwxrwxr-x getlocallinks
# 6941 -rwxrwxr-x gopher2fig
# 1233 -rwxrwxr-x links2fig
#
# ============= build-map ==============
if test -f 'build-map' -a X"$1" != X"-c"; then
echo 'x - skipping build-map (File already exists)'
else
echo 'x - extracting build-map (Text)'
sed 's/^X//' << 'SHAR_EOF' > 'build-map' &&
#!/usr/local/bin/perl
X
eval 'exec /usr/local/bin/perl -S $0 ${1+"$@"}'
X if $running_under_some_shell;
X
require "find.pl";
X
sub linkize {
X local($base,$file)=@_;
X local($old,$contents);
X local(%links);
X open(FILE,"$base") || die "$file: $!";
X $old=$/; $/ = undef; $contents=<FILE>; $/=$old;
X close(FILE);
X
X $contents =~ y/\n//d; # nuke newlines
X
X while(($contents =~ s/HREF=\"([^\"]*)\"(.*)/\2/) && ($ref = $1)) {
X $ref =~ s/(.*)#.*/\1/;
X $links{$ref}++;
X }
X print "$file\t",join(' ',(keys %links)),"\n";
X
}
X
X
sub wanted {
X /^.*\.html$/ &&
X &linkize($_,"$name");
}
X
# Traverse desired filesystems
X
chop($cwd=`pwd`);
chdir('/usr/local/www');
X
&find('.');
X
exit;
X
SHAR_EOF
chmod 0775 build-map ||
echo 'restore of build-map failed'
Wc_c="`wc -c < 'build-map'`"
test 701 -eq "$Wc_c" ||
echo 'build-map: original size 701, current size' "$Wc_c"
fi
# ============= getlocallinks ==============
if test -f 'getlocallinks' -a X"$1" != X"-c"; then
echo 'x - skipping getlocallinks (File already exists)'
else
echo 'x - extracting getlocallinks (Text)'
sed 's/^X//' << 'SHAR_EOF' > 'getlocallinks' &&
#!/usr/local/bin/perl
X
sub dirname {
X local($path) = @_;
X
X $path =~ s,^(.*)/([^/]*)$,\1,;
X return($path);
}
X
while(<>) {
X ($file,@links)=split;
X
X @newlinks=();
X $file =~ s,^\./,,; # nuke leading .
X $file =~ s,/\./,,g; # nuke any weirdness
X if(!($file =~ /^\//)) {
X $file = "/$file";
X }
X
X foreach (@links) {
X s,^http://journal.biology.carleton.ca/,/,; # make local links local
X s,^file://journal.biology.carleton.ca/,/,; # turn local ftp into links
X s,^ftp://journal.biology.carleton.ca/,/,; # turn local ftp into links
X s,^gopher://journal.biology.carleton.ca/,/Gopher/,; # we have local gopher links
X
X next if /^http:/;
X next if /^file:/;
X next if /^ftp:/;
X next if /^gopher:/;
X next if /^telnet:/;
X next if /^wais:/;
X next if /^mailto:/;
X
X s,^\./,,; # nuke leading .
X s,/\./,,g; # nuke any weirdness
X
X if(!/^\//) { # canonical name already
X $dirname=&dirname($file);
X while(/^\.\.\//) { # relative path name
X $dirname=&dirname($dirname);
X s/^\.\.\///; # remove relative path name
X }
X $_="$dirname/$_";
X }
X push(@newlinks,$_);
X }
X
X print "$file\t",join(' ',@newlinks),"\n";
}
X
SHAR_EOF
chmod 0775 getlocallinks ||
echo 'restore of getlocallinks failed'
Wc_c="`wc -c < 'getlocallinks'`"
test 1138 -eq "$Wc_c" ||
echo 'getlocallinks: original size 1138, current size' "$Wc_c"
fi
# ============= gopher2fig ==============
if test -f 'gopher2fig' -a X"$1" != X"-c"; then
echo 'x - skipping gopher2fig (File already exists)'
else
echo 'x - extracting gopher2fig (Text)'
sed 's/^X//' << 'SHAR_EOF' > 'gopher2fig' &&
#!/usr/local/bin/perl
# gophertree v1.0
#
# $Id: gophertree,v 1.3 1993/12/24 01:10:03 morrison Exp morrison $
#
# $Log: gophertree,v $
# Revision 1.3 1993/12/24 01:10:03 morrison
# added -h option to suppress host column in listing;
# added sleep() calls at two places
#
# Revision 1.2 1993/12/24 00:57:44 morrison
# initial version
#
#
# Prints pretty indented listings of a Gopher menu tree
# Copyright (C) 1992, Trustees of Michigan State University
#
# Modifications:
# Original author unknown
# 07/07/92 Boone Major conversion from gopherls
# 08/14/92 Boone Fixes:
# added code to allow command line limit on recursion
# depth
# quit indenting after 15 levels to avoid filling the
# title field with spaces
# changed to use IP address instead of hostname when
# checking for off-host links; this should make
# aliased machine names (e.g. gopher.someschool.edu)
# work much better
# Enhancements:
# Added option to list only directories
# Changed limit on number of items listed to apply to
# all types except directories, instead of just
# files; still no per-type limits though
# Changed command line processing to use Getopts,
# allowing better option processing
# End Modifications
X
require "getopts.pl";
X
sub dokill
{
X kill 9,$child if $child;
}
X
sub Opengopher
{
X $sockaddr++;
X local($them,$port) = @_;
X $them = 'localhost' unless $them;
X
X $AF_INET = 2;
X $SOCK_STREAM = 1;
X
X $SIG{'INT'} = 'dokill';
X
X $sockaddr = 'S n a4 x8';
X
X chop($hostname = `hostname`);
X
X ($name,$aliases,$proto) = getprotobyname('tcp');
X ($name,$aliases,$port) = getservbyname($port,'tcp')
X unless $port =~ /^\d+$/;;
X ($name,$aliases,$type,$len,$thisaddr) = gethostbyname($hostname);
X ($name,$aliases,$type,$len,$thataddr) = gethostbyname($them);
X
X $this = pack($sockaddr, $AF_INET, $sockaddr, $thisaddr);
X $that = pack($sockaddr, $AF_INET, $port, $thataddr);
X
X sleep(2);
X
X # Make the socket filehandle.
X socket(S, $AF_INET, $SOCK_STREAM, $proto) || die $!;
X
X # Give the socket an address.
X bind(S, $this) || die $!;
X
X # Call up the server.
X connect(S,$that) || die $!;
X
X # Set socket to be command buffered.
X select(S); $| = 1; select(STDOUT);
X
}
X
sub GetList
{
X local($CurrentHost, $Port, $Path, $indent,$origin_x,$origin_y) = @_;
X local(@dirx, $Name, $Obj, $fname, $ftype, $fhost, %i, $truncated);
X
X &Opengopher($CurrentHost, $Port);
X print S "$Path\n";
X @dirx = <S>;
X close(S);
X $truncated = 0;
X foreach (@dirx)
X {
X last if /^\./;
X chop; chop;
X ($ObjName, $Path, $CurrentHost, $Port) = split('\t', $_);
X $Name = substr($ObjName, 1);
X $Obj = substr($ObjName, 0, 1);
X $fname = $indent . $Name;
X $ftype = "";
X $ftype = "File" if ($Obj eq "0");
X $ftype = "Dir" if ($Obj eq "1");
X $ftype = "Phone" if ($Obj eq "2");
X $ftype = "Error" if ($Obj eq "3");
X $ftype = "MacHqx" if ($Obj eq "4");
X $ftype = "PcHqx" if ($Obj eq "5");
X $ftype = "Uue" if ($Obj eq "6");
X $ftype = "Index" if ($Obj eq "7");
X $ftype = "Telnet" if ($Obj eq "8");
X $ftype = "Bin" if ($Obj eq "9");
X $ftype = "File" if ($Obj eq "R");
X $ftype = "TN3270" if ($Obj eq "T");
X $ftype = "File" if ($Obj eq "e");
X $ftype = "Ftp" if ($Obj eq "f");
X $ftype = "HTML" if ($Obj eq "h");
X $ftype = "Info" if ($Obj eq "i");
X $ftype = "Mail" if ($Obj eq "m");
X $ftype = "Sound" if ($Obj eq "s");
X $ftype = "Index" if ($Obj eq "w");
X $fhost = $CurrentHost;
X
X $writeme = 1;
X if ((! $opt_d) && ($Obj ne "1") && ($i{$Obj} > $breaklong))
X {
X $writeme = 0;
X $truncated = 1;
X }
X if ($Obj eq "i") { $writeme = 0; }
X if ($opt_d && ($Obj ne "1")) { $writeme = 0; }
X if ($writeme) {
X # First, generate a box
X $cur_x = 10 + 90*$indent;
X
X $wide_x = 6 * length($Name) + 10; # arrived at empiricly
X $right_x=$cur_x + $wide_x;
X $bot_y=$cur_y + 20;
X $middle_x=int($cur_x+$wide_x/2);
X $middle_y=$cur_y+15;
X
X print OUTPUT "6 $cur_x $cur_y $right_x $bot_y\n";
X print OUTPUT "2 2 0 1 -1 0 0 0 0.000 0 0 0\n";
X print OUTPUT "\t$right_x $bot_y $right_x $cur_y $cur_x $cur_y $cur_x $bot_y $right_x $bot_y 9999 9999\n";
X print OUTPUT "4 1 0 12 0 -1 0 0.00000 4 15 49 $middle_x $middle_y $Name\n";
X print OUTPUT "-6\n";
X
X # and then a line to the box
X print OUTPUT "2 1 0 1 -1 0 0 0 0.000 -1 1 0\n\t0 0 1.000 4.000 8.000\n\t$origin_x $origin_y $cur_x $cur_y 9999 9999\n";
X
X $cur_y=$cur_y+30;
X
X }
X
X if ($hostable{$CurrentHost} eq "")
X {
X $hostable{$CurrentHost} =
X unpack("L", (gethostbyname($CurrentHost))[4]);
X }
X
X if (($Obj eq "1") &&
X ($hostable{$CurrentHost} eq $hostable{$firsthost}) &&
X (($Port != $firstport) || ($Path != "")))
X {
X $newindent = $indent;
X $depth++;
X &GetList($CurrentHost, $Port, $Path, $newindent+1,
X ($cur_x+45<$middle_x?$cur_x+45 : $middle_x),
X $bot_y);
# sleep(2);
X $depth--;
X }
X
X $i{$Obj}++;
X }
}
X
# **************************************************************************
# * Main
# **************************************************************************
X
# Parse command line
X
X &Getopts("b:l:dr:h");
X if ($#ARGV < 1)
X {
X print "Usage: gophertree [-d -bn -ln -rn -h] host port [path]\n";
X exit(1);
X }
X
X $firsthost = $CurrentHost = $ARGV[0];
X $firstport = $Port = $ARGV[1];
X $Path = "";
X if ($#ARGV == 2)
X {
X $Path = $ARGV[2];
X }
X
# Initialize some variables
X
X
X $cur_x=10;
X $cur_y=10;
X
X ($sec,$min,$hour,$mday,$mon,$year,$wday,$yday,$isdst) = localtime(time);
X $today = $mon+1 . "/" . $mday . "/" . $year;
X
X $^ = 'TOP'; # Report header format
X $~ = 'STDOUT'; # Report body format
X $indent = ""; # Goes in front of description to indicate treeness
X $depth = 1; # How deep into the maze are we?
X $hostable{$firsthost} = unpack("L", (gethostbyname($firsthost))[4]);
X
# CHANGE--User-configurable defaults
X
X $breaklong = 15; # CHANGE--Where to break long lists
X $= = 55; # CHANGE--Lines per page
X $iadd = " "; # CHANGE--Amount to indent each new level
X $nomoreindent = 15; # CHANGE--How far down before we quit indenting
X $maxdepth = 999; # CHANGE--How deep to go before "pruning" layers
X
# Stuff command line changes into the config variables
X
X if ($opt_b) { $breaklong = $opt_b; }
X if ($opt_l) { $= = $opt_l; }
X if ($opt_r) { $maxdepth = $opt_r; }
X if ($opt_h) {
X $^ = 'LOCAL_TOP';
X $~ = 'LOCAL_STDOUT';
X }
X
# Real work
X open(OUTPUT,">raw_gopher.fig");
X print OUTPUT "#FIG 2.1\n80 2\n";
X
X &GetList($CurrentHost, $Port, $Path, $indent,0,0);
X close(OUTPUT);
X exit(0);
X
SHAR_EOF
chmod 0775 gopher2fig ||
echo 'restore of gopher2fig failed'
Wc_c="`wc -c < 'gopher2fig'`"
test 6941 -eq "$Wc_c" ||
echo 'gopher2fig: original size 6941, current size' "$Wc_c"
fi
# ============= links2fig ==============
if test -f 'links2fig' -a X"$1" != X"-c"; then
echo 'x - skipping links2fig (File already exists)'
else
echo 'x - extracting links2fig (Text)'
sed 's/^X//' << 'SHAR_EOF' > 'links2fig' &&
#!/usr/local/bin/perl
X
X
$cur_x=10;
$cur_y=400;
X
open(OUTPUT,">raw_map.fig");
print OUTPUT "#FIG 2.1\n80 2\n";
X
while(<>) {
X ($file) = split;
X
X # First, generate a box
X $wide_x = 6 * length($file) + 10; # arrived at empiricly
X $right_x=$cur_x + $wide_x;
X $bot_y=$cur_y + 20;
X $middle_x=int($cur_x+$wide_x/2);
X $middle_y=$cur_y+15;
X
X print OUTPUT "6 $cur_x $cur_y $right_x $bot_y\n";
X print OUTPUT "2 2 0 1 -1 0 0 0 0.000 0 0 0\n";
X print OUTPUT "\t$right_x $bot_y $right_x $cur_y $cur_x $cur_y $cur_x $bot_y $right_x $bot_y 9999 9999\n";
X print OUTPUT "4 1 0 12 0 -1 0 0.00000 4 15 49 $middle_x $middle_y $file\n";
X print OUTPUT "-6\n";
X
X $pos_x{$file}=$cur_x;
X $pos_y{$file}=$cur_y;
X $middle_x{$file}=$middle_x;
X $middle_y{$file}=$middle_y+5;
X $line{$file}=$_;
X
X $cur_x = $cur_x + 400;
X if($cur_x > 730) {
X $cur_x = 10;
X $cur_y = $cur_y + 30;
X }
X
}
X
foreach $f (keys %line) {
X $_ = $line{$f};
X ($file,@links)=split;
X
X foreach $dest (@links) {
X if(defined($pos_x{$file}) && defined($pos_x{$dest})) {
X print OUTPUT "2 1 0 1 -1 0 0 0 0.000 -1 1 0\n\t0 0 1.000 4.000 8.000\n\t$middle_x{$file} $middle_y{$file} $pos_x{$dest} $pos_y{$dest} 9999 9999\n";
X }
X }
}
SHAR_EOF
chmod 0775 links2fig ||
echo 'restore of links2fig failed'
Wc_c="`wc -c < 'links2fig'`"
test 1233 -eq "$Wc_c" ||
echo 'links2fig: original size 1233, current size' "$Wc_c"
fi
exit 0

--
:!mcr!: HOME: m...@sandelman.ocunix.on.ca +1 613 788 2600 3853
Michael Richardson WORK: m...@ccs.carleton.ca (Conservation Ecology)
Here is an <A HREF="http://journal.biology.carleton.ca/People/Michael_Richardson/Bio.html">HTML reference</A> to my bio.
