[identity profile] broin.livejournal.com 2009-08-31 01:46 pm (UTC)(link)
AND, what ideas behind Xanadu are silly? Silly from a user's POV ('I just want it to do what I want') or from a database developer's POV ('It can't be done')?

[identity profile] broin.livejournal.com 2009-08-31 04:08 pm (UTC)(link)
Oh, from an ontological point of view, it's daft. ;)

But then you'd have no problem with citations and attribution, fewer problems with copyright and plagiarism... from a strictly academic user's perspective, it'd be amazing.

But then it's almost the opposite of small pieces, loosely joined. And that I'd have to cite who said that and where and when before submitting this post would also reduce the poetry, somewhat.

A Xanadish approach would have been interesting during 1993–1996, for me, as pages got lost and were deleted a lot, and the spiders couldn't keep track. Doesn't happen so much any more.

[identity profile] nmg.livejournal.com 2009-08-31 06:14 pm (UTC)(link)
It's a massively locked down system - because they focussed on 'correctness' rather than 'utility'.

This is the standard argument that the Web 'works' because of the 404 error. While 404 was something that distinguished the Web from contemporary open hypertext systems (namely Hyper-G and Microcosm) and Xanadu, the Web succeeded more because a) the protocol and data format definitions were freely available, and b) it was initially targeted at a user community rather than being a predominantly research system (as was the case with Hyper-G and Microcosm).

So, for instance, in Xanadu, moving a page means updating all of the links that point to it.

Moving a page? No such thing. Once a document is created with a given ID, it's there in perpetuity. If you want to refer to it by a different identifier, that's what transclusion is for.
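
To make that concrete, here's a toy sketch in Python of the idea (all names invented for illustration; the real Xanadu addressing scheme, with its tumblers, is far more elaborate): documents are immutable once stored, and quoting or 're-homing' content is done by reference to spans of existing documents, never by copying or re-pointing links.

    # Toy model: immutable documents plus transclusion-by-reference.
    class DocumentStore:
        def __init__(self):
            self._docs = {}      # doc_id -> text; never mutated, never deleted
            self._next_id = 0

        def create(self, text):
            """Store a document; its ID refers to it in perpetuity."""
            doc_id = self._next_id
            self._next_id += 1
            self._docs[doc_id] = text
            return doc_id

        def transclude(self, doc_id, start, end):
            """A 'new' reference is just a span of an existing document --
            nothing is copied, so nothing ever needs re-linking."""
            return (doc_id, start, end)

        def resolve(self, span):
            doc_id, start, end = span
            return self._docs[doc_id][start:end]

    store = DocumentStore()
    original = store.create("Everything is deeply intertwingled.")
    quote = store.transclude(original, 0, 10)
    assert store.resolve(quote) == "Everything"

Because nothing is ever moved or deleted, the 404 problem simply can't arise; the cost is that the store can only grow.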

[identity profile] nmg.livejournal.com 2009-08-31 08:44 pm (UTC)(link)
Not clear. The Front End-Back End protocol (from Xanadu Green, the version described in Literary Machines) would have been pretty straightforward to implement on the client side, mainly because all of the heavy lifting was being done by the server. The Back End-Back End protocol (which was what I had to ask Ted about), which would have been key to the implementation of the servers, is a different matter.

The other key to the server side is the enfilade data structure. Ted didn't publish anything about enfilades until fairly recently (in the Udanax source release, as source code) because he believed them to be such a good idea that they were worth retaining as a trade secret. I haven't implemented enfilades, but from what I can tell, they should be no harder to implement than any other moderately complex hierarchical data structure.
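
For what it's worth, a minimal sketch of the core trick as I read it from the Udanax source: a tree in which each node caches the total 'width' of its subtree, so address-to-content lookup is O(log n) and an edit only disturbs one root-to-leaf path. The real enfilades generalise this to multiple 'widative' and 'dispative' properties; stripped down this far, it's essentially a rope.

    # Enfilade-flavoured toy: cached subtree widths give O(log n) addressing.
    class Leaf:
        def __init__(self, text):
            self.text = text
            self.width = len(text)

    class Node:
        def __init__(self, left, right):
            self.left, self.right = left, right
            self.width = left.width + right.width  # cached subtree extent

    def char_at(node, pos):
        """Descend the tree, steering by the cached widths."""
        while isinstance(node, Node):
            if pos < node.left.width:
                node = node.left
            else:
                pos -= node.left.width
                node = node.right
        return node.text[pos]

    tree = Node(Leaf("Literary "), Node(Leaf("Mach"), Leaf("ines")))
    assert char_at(tree, 9) == "M"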

[identity profile] nmg.livejournal.com 2009-08-31 08:56 pm (UTC)(link)
As an aside, I'm increasingly struck by how much easier it was to write browsers and servers for the Web of the early 1990s (pre-Netscape) than it is now.

HTTP/1.0? Doddle. HTML 2? Still pretty easy (and easier yet if you took the lazy route and didn't try to parse it as SGML first). No CSS, SVG, JavaScript/ECMAScript, Flash. By 1995, I'd written special-purpose standalone servers and simple clients. I wouldn't want to try that now.
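
For a sense of the surface area: a period-accurate 'server' really does fit on one screen. A rough Python sketch (modern language, but this is all the protocol handling HTTP/1.0 needed: one request, one response, close the connection; no keep-alive, no chunking, no TLS):

    import socket

    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("", 8080))
    srv.listen(5)

    while True:
        conn, _ = srv.accept()
        # Naive: assumes the whole request arrives in one read.
        words = conn.recv(4096).decode("latin-1").split()
        path = words[1] if len(words) > 1 else "/"
        body = f"<html><body><h1>You asked for {path}</h1></body></html>"
        conn.sendall(("HTTP/1.0 200 OK\r\n"
                      "Content-Type: text/html\r\n"
                      f"Content-Length: {len(body)}\r\n"
                      "\r\n" + body).encode("latin-1"))
        conn.close()  # one request per connection, HTTP/1.0 style

The client side is barely more: open a socket, send 'GET /path HTTP/1.0', read until the connection closes.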

[identity profile] nmg.livejournal.com 2009-08-31 10:09 pm (UTC)(link)
Don't get me started on HTML5 (and its standards process).

[identity profile] nmg.livejournal.com 2009-08-31 10:15 pm (UTC)(link)
It verges on ad hominem, so is best delivered in person.