Ted Nelson and Xanadu

Friday 3 January 2003

My recent posting about the Semantic Web got me thinking about Ted Nelson. He’s the guy who actually coined the term “hypertext”, way back in the sixties. So what’s the connection? Both Xanadu (Nelson’s yet-to-be-built hypertext system) and the Semantic Web are presented as improvements on the web we have, ideals to strive toward. Both tell us to closely examine the web we have and understand its faults.

But there’s something about Ted Nelson’s insistence that the web has missed the mark that I find really sad. How can anything as wildly successful as the web be completely dismissed as having gotten it wrong? (And how ironic that Nelson’s short writings are now awkward .txt files: not only are they not hyper, they are barely even readable text.)

Nelson insists that there are fundamental hypertext facilities missing from the web: deep quotability, two-way links, and side-by-side document comparison.

Deep quotability is the ability to quote one document from another, without copying the text, and without just linking to it. Like all of Nelson’s objections to the web, I guess this is an interesting capability, but I don’t see it as fundamental to hypertext. I think it might even be wrong: when I quote someone, I make a copy of the text, and I don’t have to worry that they’ll change it out from under me. The other style of quoting might be useful, but it isn’t obviously the right choice.
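To make the distinction concrete, here’s a toy sketch in Python (not Xanadu’s actual mechanism, just an illustration of the trade-off): a copied quote is frozen at the moment you make it, while a quote held by reference is resolved against the source every time it is displayed, so it changes whenever the source does.

```python
# Toy illustration of the two quoting styles (not Xanadu's real design).

documents = {"essay.txt": "Hypertext means branching, non-sequential writing."}

def quote_by_copy(doc_id, start, end):
    """The web's usual style: snapshot the text; later edits to the
    source don't affect the quote."""
    return documents[doc_id][start:end]

def quote_by_reference(doc_id, start, end):
    """Nelson's style: keep only a pointer; resolve it at display time,
    so the quote tracks the source."""
    return ("ref", doc_id, start, end)

def render(ref):
    _, doc_id, start, end = ref
    return documents[doc_id][start:end]

copied = quote_by_copy("essay.txt", 0, 9)         # "Hypertext"
linked = quote_by_reference("essay.txt", 0, 9)

documents["essay.txt"] = "Xanadu was never finished."  # the source changes

print(copied)          # still "Hypertext"
print(render(linked))  # now "Xanadu wa" -- the quote shifted out from under us
```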

A companion idea to deep quotability is transcopyright, which I don’t fully understand, but which seems to be a new permission framework for reusing digital content.

Two-way links are an interesting idea, and they have been experimented with. XLink proposes them, though I’m not sure anyone is implementing it yet. Blogs have been approximating the two-way effect with automated links to referrers: TrackBack, Pingback, and various other back-link techniques are common these days. Compared to the purity of Nelson’s vision, these techniques are hacks, but they are here, and Xanadu is not.
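For a sense of how simple these hacks are, here’s a rough sketch of the two kinds of pings a blog tool sends when it links to another site. The endpoint URLs below are made up; real tools discover them from the target page (TrackBack’s URL from RDF embedded in the entry, Pingback’s from an X-Pingback header or a `<link rel="pingback">` element).

```python
import urllib.parse
import urllib.request
import xmlrpc.client

# --- TrackBack: a plain form-encoded POST to the target's TrackBack URL ---
def send_trackback(trackback_url, my_entry_url, title, excerpt, blog_name):
    data = urllib.parse.urlencode({
        "url": my_entry_url,      # the entry of mine that links to the target
        "title": title,
        "excerpt": excerpt,
        "blog_name": blog_name,
    }).encode("utf-8")
    with urllib.request.urlopen(trackback_url, data) as resp:
        return resp.read()        # a small XML response with an <error> code

# --- Pingback: an XML-RPC call to the server the target advertises ---
def send_pingback(pingback_server_url, source_uri, target_uri):
    server = xmlrpc.client.ServerProxy(pingback_server_url)
    return server.pingback.ping(source_uri, target_uri)

# Hypothetical URLs, just to show the shape of the calls:
# send_trackback("http://example.com/trackback/42",
#                "http://myblog.example/2003/01/xanadu",
#                "Ted Nelson and Xanadu", "My recent posting...", "my blog")
# send_pingback("http://example.com/xmlrpc",
#               "http://myblog.example/2003/01/xanadu",
#               "http://example.com/some-entry")
```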

And besides, having back-links created by automated tools rather than springing fully-formed without any intervention can be a good thing: it lets people do more intelligent back-linking. For example, dive into mark checks all referrers to be sure the referring page really has a link to him. (The Web’s Missing Links is a good short piece about these technologies.)
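That kind of check is easy to write. Here’s a sketch of the idea (the URLs are hypothetical, and a real implementation like Mark’s would be more careful about encodings, redirects, and errors): fetch the referring page, and only count it if it actually contains a link back.

```python
import urllib.request
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.hrefs.extend(v for k, v in attrs if k == "href" and v)

def referrer_really_links_here(referrer_url, my_url):
    """Fetch the referring page and confirm it actually links to my_url,
    so bogus referrers don't earn a back-link."""
    try:
        with urllib.request.urlopen(referrer_url) as resp:
            html = resp.read().decode("utf-8", errors="replace")
    except OSError:
        return False
    collector = LinkCollector()
    collector.feed(html)
    return any(href.startswith(my_url) for href in collector.hrefs)

# Hypothetical usage:
# referrer_really_links_here("http://example.com/some/entry",
#                            "http://myblog.example/2003/01/xanadu")
```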

Side-by-side document comparison I just don’t get at all. I can imagine a few people in some specialized domains might need to do this, but it just isn’t something the web is missing.

Ted Nelson is one of the great computer visionaries (tomorrow I’ll say more about that). I just wish he could find a way to bring some of his ideas to fruition, or recognize that others already have. Just sitting on the sidelines and pointing out “mistakes” is a waste of his talents.

Comments

illuminating, especially so.
