
WWW-theory: Hyperbolic linking


Jorn Barger

Dec 7, 2000

When hypertext has a hierarchical tree-structure, links between nodes
can be described using genealogical terms: parent, child, sibling,
grandparent, cousin, etc.

Hypertext theory usually recommends that individual nodes link to a
pretty limited set of these relatives: all the node's children, its one
parent, one next-sibling, one previous-sibling, maybe one grandparent.
(The idea of 'path' or 'breadcrumbs' is parent-grandparent-great-grandparent,
for example.)
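
To make the terms concrete, here's a rough sketch in Python (the parent
map and page names are made up for illustration):

    # Genealogical relations over a page hierarchy -- a toy sketch.
    # The parent map and page names are hypothetical.
    parent = {"joyce": None, "ulysses": "joyce", "wake": "joyce",
              "eccles": "ulysses", "telemachus": "ulysses"}

    def children(node):
        return [p for p, par in parent.items() if par == node]

    def siblings(node):
        par = parent[node]
        return [c for c in children(par) if c != node] if par else []

    def breadcrumbs(node):
        # parent, grandparent, ... back up to the root
        trail = []
        while parent[node]:
            node = parent[node]
            trail.append(node)
        return trail

    print(siblings("eccles"))     # ['telemachus']
    print(breadcrumbs("eccles"))  # ['ulysses', 'joyce']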

To take full advantage of the Web, though, strict hierarchies are too
limiting-- what you have is a web of relationships with multiple,
arbitrarily overlapping hierarchies.

For any given node, surely, the right question to ask with regard to
linking is: how _close_ is the topic of node N to the topic of my
current page?

By this standard, a given node should probably link _all_ its siblings,
and quite possibly all its cousins, because these will be thematically
close.

But more generally, we might compare this to the idea of 'hyperbolic'
magnification, as used in some computer interfaces: things that are
closer (more central) are magnified more, those farther away are
magnified less. (The Mac OS X 'dock' behaves somewhat hyperbolically.)
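
A sketch of that falloff in code, using tree distance between topics
(the 1/(1+distance) weight is one plausible choice, not a tested
formula):

    # Hyperbolic weighting by tree distance -- a sketch, not a tested
    # formula. Same hypothetical parent map as above.
    parent = {"joyce": None, "ulysses": "joyce", "wake": "joyce",
              "eccles": "ulysses", "telemachus": "ulysses"}

    def path_to_root(node):
        trail = [node]
        while parent[node] is not None:
            node = parent[node]
            trail.append(node)
        return trail

    def dist(a, b):
        # edges from a up to the nearest common ancestor, then down to b
        pa, pb = path_to_root(a), path_to_root(b)
        shared = set(pa) & set(pb)
        return min(pa.index(n) + pb.index(n) for n in shared)

    def weight(a, b):
        return 1.0 / (1 + dist(a, b))

    print(weight("eccles", "telemachus"))  # sibling, dist 2 -> 0.33...
    print(weight("eccles", "wake"))        # uncle, dist 3 -> 0.25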

The current fashion in webpage design, especially for large
magazine/news/portal sites, is to weigh down every article with hundreds
of marginal links to utterly unrelated parts of the site-- very
unhyperbolic, and I think utterly wasteful-- who ever clicks any of
those? Who even looks at them?

More and more now, happily, savvy news sites are ***adding value*** by
including a set of links to related topics-- recent news on the same
subject, pieces by the same author, etc.

This starts to look more hyperbolic, but to really execute the paradigm
adequately I think 80% of the links on a page should be to the closest
topics-- and should exhaustively link _all_ the closest possible ones.
('Closest' here should probably be read as 'most-useful-to' rather than
most-redundant-with.)
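
In code, that 80/20 split might look like this (the proportions and
candidate list are illustrative assumptions, not a tested recipe):

    # Spend ~80% of a page's link budget on the nearest topics and the
    # remainder on more distant ones. Numbers are illustrative only.
    def allocate(candidates, budget=100, near_share=0.8):
        # candidates: (topic, distance) pairs, distance as in dist() above
        ranked = sorted(candidates, key=lambda c: c[1])
        cutoff = int(budget * near_share)
        near = [topic for topic, d in ranked[:cutoff]]
        far = [topic for topic, d in ranked[cutoff:budget]]
        return near, far

    near, far = allocate([("telemachus", 2), ("wake", 3), ("pound", 6)],
                         budget=3)
    print(near)  # ['telemachus', 'wake'] -- the closest topics, linked first
    print(far)   # ['pound']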

My current practice with my James Joyce pages is to start each new page
from a template that already has 100 likely links sorted at the bottom.
(Eg: http://www.robotwisdom.com/jaj/ulysses/eccles.html )

In theory I might trim some of the more distant ones, but in practice I
don't even see much reason to-- 100 text-only links at the bottom of a
page with no TABLEs don't noticeably slow down loading, and by keeping
them consistent from page to page people can grow to rely on them.

The reductio ad absurdum here would be to include my full sitemap at the
bottom of every page-- with many hundreds of pages this would become a
problem, but the hyperbolic approach seems an elegant general way of
solving it.

--
http://www.robotwisdom.com/ "Relentlessly intelligent
yet playful, polymathic in scope of interests, minimalist
but user-friendly design." --Milwaukee Journal-Sentinel

colin

Dec 7, 2000

http://www.useit.com/alertbox/20000109.html

that's all i'm saying :)
Philip Stripling <phil_st...@cieux.zzn.com> wrote in message
news:w3qzoi8...@shell.tsoft.com...
> Jorn Barger wrote, in pertinent part:
>
> > The current fashion in webpage design, especially for large
> > magazine/news/portal sites, is to weigh down every article with hundreds
> > of marginal links to utterly unrelated parts of the site-- very
> > unhyperbolic, and I think utterly wasteful-- who ever clicks any of
> > those? Who even looks at them?
>
> They are following the theory of hypobolic linking, Jorn.
>
> --
> Philip Stripling | email to the replyto address is presumed
> Legal Assistance on the Web | spam and read later. email to philip@
> http://www.PhilipStripling.com/ | civex.com is read daily.
> Resources for small businesses, entrepreneurs, and legal professionals.

Jorn Barger

Dec 7, 2000

colin <cmc...@usa.net> wrote:
> http://www.useit.com/alertbox/20000109.html
> that's all i'm saying :)

i did a long critique of that one at the time:

http://www.deja.com/=dnc/getdoc.xp?AN=570531997

Philip Stripling

Dec 7, 2000

They are following the theory of hypobolic linking, Jorn.

Chris Hubbard

Dec 8, 2000

Jorn,
Your critique is interesting and thought-provoking, but it lacks
reference material to back up your assertions.
I've also conducted both heuristic and more traditional usability
studies over the last three years. In that time I haven't seen
anything that would directly contradict what Mr. Nielsen is saying.
Occasionally I'm offered a new interpretation, but mostly it's
consistent. The testing I've performed is based on the same work
that Nielsen credits, namely Jared Spool's. It is possible, though in
my unsubstantiated opinion highly unlikely, that the methodology has
that much influence on the results.

Again, I'll state that your opinions are interesting and
thought-provoking and could be crafted into a usability study (or
studies). But your lack of citations or raw data to support your
statements means that (for now) I have to disregard your opinions
entirely.

Chris Hubbard

Dec 8, 2000

On Thu, 7 Dec 2000 03:07:11 -0600, jo...@mcs.com (Jorn Barger) wrote:
>
>For any given node, surely, the right question to ask with regard to
>linking is: how _close_ is the topic of node N to the topic of my
>current page?

I don't agree with this statement. Linking is too user-specific. If
I've got a story to tell, then I either craft it so it can be read
along multiple paths or along a single path. If I'm presenting data
about a product, the progressive display of increasingly technical
data seems to work for a significant population, i.e., those who don't
use search.
I also don't understand the relevance of the question. I think that
regardless of how the question is answered, I will continue to link in
ways that I guess/think are most representative of the conceptual
model my primary audience is using. What I think is related isn't
necessarily what you think is related. In the absence of a universal
meta-keyword database (besides Yahoo), it's all ultimately going to
come down to a subjective call.

colin

Dec 8, 2000

Can you support any of your arguments? Link to user studies you've
carried out?

regards


c

Jorn Barger <jo...@mcs.com> wrote in message
news:1el9xt8.wau...@207-229-150-216.d.enteract.com...
> colin <cmc...@usa.net> wrote:
> > http://www.useit.com/alertbox/20000109.html
> > that's all i'm saying :)
>
> i did a long critique of that one at the time:
>
> http://www.deja.com/=dnc/getdoc.xp?AN=570531997

Aaron C

Dec 10, 2000

I hate to sound, uh, lame, but this post makes my head hurt. What are
you talking about, in semi-layman's terms?


--
--Aaron C


"Jorn Barger" <jo...@mcs.com> wrote in message

news:1el903n.pdn...@207-229-151-171.d.enteract.com...


> [Jorn's original post, quoted in full, snipped]

Jorn Barger

Dec 10, 2000

Aaron C <akidi...@home.com> wrote:
> what are you talking about in semi-layman's terms?

Paraphrase:

"When you put an article on the web, make sure to offer lots of
of links _at the end_ to other articles your readers might like.
If you don't, you're wasting a big opportunity. ('Hyperbolic'
means the closest topics get the most links.) If you don't use
TABLEs, and if you sort them neatly, even 100 'footer links'
will be fine."

John Distai

Dec 10, 2000

That was much more usable!

"Jorn Barger" <jo...@mcs.com> wrote in message

news:1elexcg.1ms...@207-229-151-176.d.enteract.com...

Michael Hoffman

Dec 10, 2000

Good thread, wish I had time to participate now. If there is one sure-thing
web navigation design I promote, it's to *have* a detailed hierarchical
sitemap page somewhere, and then to *link* to this page from every page of
the site. As a slower alternative, every page can link to the home page,
which then has a prominent link to the sitemap.

Glad to see your dedication, Jorn, and I hope to catch up with your
writings perhaps in a year.

-- Michael Hoffman
http://www.hypertextnavigation.com


colin

Dec 11, 2000

If I'm not missing something, both Jakob and Jorn advocate hierarchical
linking and not 'spoke' linkages.

But Jorn, if you're going to diss someone, please back it up with some
hard evidence - something which Nielsen has and you definitely do not.

Francis

Dec 11, 2000

"Hundreds" of links may be too much, but providing a list of links to the
top chapters not only improves navigation, it also gives a perception of the
breadth of the site (see Amazon - you are looking for a book and you see all
those Music - Gardening - Travelling tabs). It can also be used to tell you
where you are. Eliminating these because "unrelated" to the current page
would be silly, don't you think?

Francis

"Jorn Barger" <jo...@mcs.com> wrote in message news:
1el903n.pdn...@207-229-151-171.d.enteract.com...

Jorn Barger

Dec 11, 2000

colin <cmc...@usa.net> wrote:
> If I'm not missing something, both Jakob and Jorn advocate hierarchical
> linking and not 'spoke' linkages.

'spoke' locally, 'hierarchical' globally. (i.e., 'hyperbolic')

> But Jorn, if you're going to diss someone, please back it up with some
> hard evidence - something which Nielsen has and you definitely do not.

JN's 'hard evidence' fooled him into asserting for years that users
don't scroll. There really is no hard evidence in the social sciences,
imho-- it's all a matter of better or worse experimental design, and
what looks hard today will look soft when we know more.

My evidence comes from daily experiments with hundreds of readers, via
my server logs and email feedback. Nielsen's experiments involve tens of
readers and tens of experiments. QED!

colin

Dec 11, 2000

> JN's 'hard evidence' fooled him into asserting for years that users
> don't scroll. There really is no hard evidence in the social sciences,
> imho--

OUCH!


> My evidence comes from daily experiments with hundreds of readers, via
> my server logs and email feedback. Nielsen's experiments involve tens of
> readers and tens of experiments. QED!

Could you elaborate on that, J?

http://www.useit.com/alertbox/20000319.html

I love this thread!!


Darin McGrew

Dec 11, 2000

Jorn Barger <jo...@mcs.com> wrote:
> My evidence comes from daily experiments with hundreds of readers, via
> my server logs and email feedback. Nielsen's experiments involve tens of
> readers and tens of experiments. QED!

You seem to accept the common assumption that more data means better data.
That isn't necessarily so. If the data-collection mechanism biases the
data, then collecting more data points isn't going to eliminate the bias,
and isn't going to improve the quality of the data.
--
Darin McGrew, mcg...@stanfordalumni.org, http://www.rahul.net/mcgrew/
Web Design Group, da...@htmlhelp.com, http://www.htmlhelp.com/

"Nothing is so good as it seems beforehand." -- George Elliot

Jorn Barger

Dec 11, 2000

Darin McGrew <mcg...@stanfordalumni.org> wrote:
> You seem to accept the common assumption that more data means better data.
> That isn't necessarily so. If the data-collection mechanism biases the
> data, then collecting more data points isn't going to eliminate the bias,
> and isn't going to improve the quality of the data.

What bias do you think hundreds of hits daily from random web surfers
introduce? One test subject is plenty if they're articulate and give
useful feedback. Hundreds add the advantage of dozens of different
platforms and goals. And I'm not running the same experiment every
time-- every new page I create tries new things based on previous
feedback.

my hypertext design lab: http://www.robotwisdom.com/web/

Jukka Korpela

Dec 12, 2000

jo...@mcs.com (Jorn Barger) wrote:

>What bias do you think hundreds of hits daily from random web surfers
>introduce?

You seem to have difficulties with the concept of randomness, which is
_crucial_ in reliable studies based on sampling. "Random" Web surfers
aren't. Except in a "random" meaning of "random". See Statistics 101.

Followups narrowed according to normal Usenet practice.
--
Yucca, http://www.hut.fi/u/jkorpela/
Qui nescit tacere nescit et loqui

Michael Stutz

Jan 3, 2001

In article <1el903n.pdn...@207-229-151-171.d.enteract.com>,
Jorn Barger <jo...@mcs.com> wrote:

>To take full advantage of the Web, though, strict hierarchies are too
>limiting-- what you have is a web of relationships with multiple,
>arbitrarily overlapping hierarchies.

Because there's no "up" or "down" on the Web ...


>In theory I might trim some of the more distant ones, but in practice I
>don't even see much reason to-- 100 text-only links at the bottom of a
>page with no TABLEs don't noticeably slow down loading, and by keeping
>them consistent from page to page people can grow to rely on them.

The only problem with this is that it could retard searching, if the
search engine can't ignore your template -- say if I'm searching your
site for topic x, which happens to be one of those 100 links, it may
give me all of your pages with the template ...


Jorn Barger

Jan 4, 2001

Michael Stutz <m...@dsl.org> wrote:
> >In theory I might trim some of the more distant ones, but in practice I
> >don't even see much reason to-- 100 text-only links at the bottom of a
> >page with no TABLEs don't noticeably slow down loading, and by keeping
> >them consistent from page to page people can grow to rely on them.
>
> The only problem with this is that it could retard searching, if the
> search engine can't ignore your template -- say if I'm searching your
> site for topic x, which happens to be one of those 100 links, it may
> give me all of your pages with the template ...

Yeah, atomz.com introduced a handy <NOINDEX> tag for that, but I haven't
gotten around to using it.
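
Something like this, presumably-- though the exact markup Atomz expects
is an assumption here; check their docs:

    # Wrap the shared footer so a site-search engine can skip it when
    # indexing. The <NOINDEX> markup is an assumption -- some engines
    # use comment markers instead; check the engine's documentation.
    FOOTER_LINKS = [("eccles.html", "Eccles St"),
                    ("telemachus.html", "Telemachus")]

    def footer_html(links):
        anchors = "\n".join('<a href="%s">%s</a>' % (url, text)
                            for url, text in links)
        return "<NOINDEX>\n%s\n</NOINDEX>" % anchors

    print(footer_html(FOOTER_LINKS))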

(Did you do the clicktracking experiment?)

Michael Stutz

Jan 4, 2001

In article <1emomch.1x2chkm6r2kvN%jo...@mcs.com>,
Jorn Barger <jo...@mcs.com> wrote:

>(Did you do the clicktracking experiment?)

Yeah. I didn't keep it up for a full week though because the lag time
between click and pageload was too annoying. But the results have been
positive -- I got more mail re: the design than I'd *ever* gotten
before.

With this design, I wanted to cram as many links in a page as
possible, while keeping it easy to scan on the screen.

To this end, I just did another experiment this morning, counting the
number of links in the 10 most popular weblogs. Yours came out on top,
as I predicted, with 481 links (roughly 2.5x as many as Slashdot, and
still 75+ more than MetaFilter, which is good considering yours is a
one-man site). But the mean was much higher than I thought it'd be: 217.
Only 20% of them had fewer than 100 links; at 121 links, I guess I've
still got plenty of room!
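
If you want to repeat the count, a plain tally of <a href> start tags
gets you close-- a rough sketch (one way to count, not necessarily how
these numbers were produced):

    # Count the links on a page by tallying <a ... href=...> start tags.
    from html.parser import HTMLParser

    class LinkCounter(HTMLParser):
        def __init__(self):
            super().__init__()
            self.count = 0

        def handle_starttag(self, tag, attrs):
            if tag == "a" and any(name == "href" for name, _ in attrs):
                self.count += 1

    counter = LinkCounter()
    counter.feed('<a href="a.html">one</a> <a href="b.html">two</a>')
    print(counter.count)  # 2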

0 new messages