A brief aside: if one wants to see the means presently available in MediaWiki for math rendering, here is where to look. This seems to be moot, but maybe somebody was wondering.
Okay, so I suppose we have now answered my question:
What are our perspectives and options? Is it in principle conceivable that we can migrate the source-code content of the nLab to a different wiki software? Is that technically possible?
by:
it might be technically possible to some extent, but the expected trouble and disadvantages outweigh the expected advantages.
Instead, the perspective is that we somehow need to find somebody who fixes the existing software (namely two main bugs: 1. the nLab crashes frequently, and 2. hyperlinks are handled inefficiently, in a way that makes larger pages – such as are currently used by one of the present authors and such as are expected to appear frequently in any Journal application – infeasible).
The person doing such a fix might be
Andrew Stacey – though the task seems to be nontrivial enough that it requires considerably more time and energy than Andrew can be expected to spare easily;
Jacques Distler – if we ask him;
or maybe some professional programmer who we could hire and pay for this job.
Is that right? What about the third option? Does this sound reasonable? I’d be willing to pay for this, if that’s what it would take to have the job done soon.
There are definitely other suboptimal things about MediaWiki besides its math syntax. I recall that when we were discussing adding redirects, some of us wanted to do it the MediaWiki way, where the redirectING page says where to redirect itself to, but Jacques said no, that’s braindead: you want the target page to specify what names redirect to it. After using the redirects in Instiki that Jacques implemented for a while, I have long since been converted 100% to his point of view, and I would regard it as a great step backwards to have to do redirects “the MediaWiki way.”
On the other hand, mediawiki does have some other nice features that are lacking (so far) in Instiki, like a “template language” which has (IMHO) a horrendous syntax but nevertheless enables you to do some quite nice things, and also a bundle of “semantic wiki” extensions that also let you do some very nice things.
Before asking him, are we completely sure that the number of wikilinks is the principal reason for the slowness of reloading large cohesive-topos-like pages? Maybe we need to do a few tests first.
Zoran, there are lots of possibilities. One thing might be that when Instiki checks a link against a page, it does a read on the database. So each link involves a new database query, rather than loading a list of all the pages in one go and saving it in memory. I can imagine that if there are only a few links then doing a db query for each is quicker than loading in the whole lot, but at some point this tips and it becomes a drag.
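The pattern being guessed at here, one query per link versus one bulk read, would look roughly like this. This is only a sketch against an illustrative schema; Instiki's real tables, columns, and query paths may well differ:

```python
import sqlite3

# Illustrative schema; Instiki's real tables and columns differ.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE pages (name TEXT PRIMARY KEY)")
conn.executemany("INSERT INTO pages VALUES (?)",
                 [("topos",), ("sheaf",), ("category",)])

def existing_links_per_query(conn, links):
    """One database round trip per wikilink: fine for a handful of links,
    a drag on a page containing thousands of them."""
    return [name for name in links
            if conn.execute("SELECT 1 FROM pages WHERE name = ?",
                            (name,)).fetchone()]

def existing_links_bulk(conn, links):
    """Read the full page list once, then test membership in memory."""
    all_pages = {row[0] for row in conn.execute("SELECT name FROM pages")}
    return [name for name in links if name in all_pages]

links = ["topos", "cohesive topos", "sheaf"]
```

Both functions return the same answer; the difference is only in how many round trips to the database they make, which is exactly the tipping-point effect described above.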
The point is I don’t know! I haven’t spent too much time under the bonnet (aka hood) of Instiki. So I’m the wrong person to ask about this. With Instiki, we know exactly who the right person is: Jacques. We also know that he takes us seriously. With MediaWiki, we wouldn’t know who to ask, we’d have to do all this sort of thing “in house”. Now, while I feel happy hacking the forum software, that’s not on the front line. I feel much more nervous hacking the wiki software - I’d rather leave that to the expert(s).
So I suggest that we send a “feature request” to Jacques, outlining the problem that saving pages with a large number of wikilinks takes an inordinate length of time and asking if he has any ideas on how to speed up the process a little. I’d far rather do that than launch in to the murky waters of migration.
@ Andrew #41 and @ Urs #40:
The checking of links etc. after saving is needed, in my understanding, just in order to regenerate the page for display. And 5000 entries is effectively far more than 5000, given the huge number of page aliases.
I could imagine that usually the difference between the old page and the new page is just one paragraph, so why not use this fact when checking links: if I work on a large page incrementally, the system could re-check just the changed links. Of course, the rest of the system could have changed in the meantime, so this may appear incorrect, but I think something similar sometimes happens with the current system as well. I change a page by adding a redirect or something, and occasionally another page with a link to the now-changed location does not appear correct on reloading; it has happened to me even after changing it (but such incorrect behaviour gets corrected very soon when it happens).
I could imagine an easier hack, where one introduces a fast-save option, so that when saving, all nLab links are presented as existing in the immediate redisplay, without checking the database. (I could imagine an even more radical option: save without loading the page back at all, just getting a confirmation that the page was received, which is useful when running for a bus, but less useful in general.) For incremental work, one could do a save with the usual (checked-links) redisplay just once per 10 or 50 updates, depending on working style.
In any case, checking everything every time when doing incremental work may be suboptimal.
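The incremental idea could be sketched like this: only links that appear in the new source but not in the old one need a fresh database check. (A sketch only; `extract_links` here uses a naive `[[...]]` pattern, not Instiki's actual parser, and ignores the stale-link problem mentioned above.)

```python
import re

def extract_links(source):
    """Naive wikilink extraction: the text inside each [[...]],
    stopping at a '|' if the link has display text."""
    return re.findall(r'\[\[([^\]|]+)', source)

def links_to_recheck(old_source, new_source):
    """Only links added in this edit need a fresh existence check."""
    old = set(extract_links(old_source))
    new = set(extract_links(new_source))
    return new - old
```

For example, editing a page from `See [[topos]] and [[sheaf]].` to a version that also mentions `[[cohesive topos]]` would leave only the one new link to verify.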
But the pages that take a long time to save are those with an enormous number of wikilinks in them. When you save such a page on your web-book, it doesn’t have to cross-check each of those links against 5000 entries in a database. Similarly when you send it by email. I would be amazed if doing the same on MediaWiki was any faster.
That said, as I’ve said before, there may be ways to speed that up. But that’s something you should take up with Jacques, not me.
When you talk about “scaling”, the usual meaning of that is number of users and pages, not the lengths of individual pages!
Wikipedia doesn’t scale because it’s based on MediaWiki, it scales because they have a vast array of servers with load balancing and a load of other stuff.
Is that really relevant to the scaling problems that we have?
A page that is getting a little longer takes minutes to save on the nLab, or at some point fails to save entirely. If I save the same size of document on my web-book, even if I send it around by email, it takes an unnoticeable amount of time. So if this fails to work on our present server, I think it is a problem with the software, and no size of array of servers will help with it.
I mean, the nLab has grown a little larger than a personal notepad, but it is still tiny. The number of users is tiny. The number of page requests is probably tiny. So if we have scaling problems, it is not the kind of scaling that Wikipedia has to deal with, I’d think.
To have MathML, you need full XHTML compliance – at least at present; when HTML5 comes along it should be easier to mix stuff in (not that I regard that as necessarily a Good Thing: being compliant with some standard is naturally to be desired). So MediaWiki would need adapting to be compliant, and I have no idea how easy that would be. One would need to adapt the itex stuff into a suitable plugin. The tricky part of that is actually already done, since MediaWiki uses PHP as does this forum, so a MediaWiki plugin would be an adaptation of this forum’s stuff.
However, the issue is not the feasibility of the migration, but rather the rest of the stuff. Wikipedia doesn’t scale because it’s based on MediaWiki, it scales because they have a vast array of servers with load balancing and a load of other stuff. I’m sure it’s possible to have a MediaWiki installation that is a load of rubbish, just as it is possible to have an Instiki installation that is not.
The decisive feature of Instiki is that the version that we use was developed precisely for the situation that we are using it for: putting serious mathematical content on the web. All the bits of Instiki that we actually like would have to be replicated in MediaWiki, because MediaWiki is not designed for this purpose. The same goes for just about every other wiki software out there. The fact that Jacques chose Instiki means that it is right for us. One thing we would have to contend with in MediaWiki may be that we would have a load of additional stuff that we don’t want and never use, but which is built in to the core and isn’t easy to get rid of.
I don’t know about all the issues with scaling, but from what I’ve read then there’s a lot of misinformation out there. Ruby on Rails is relatively young so a lot of the issues have not been definitively resolved, but also a lot of issues are actually non-issues. An example of a RoR application that seems to be scaling fairly well is Twitter.
(I agree with Eric that there is a lot of confusion in this thread between the different pieces of the puzzle.)
Is that the decisive feature of Instiki, that it is properly XHTML compliant? Is there any other wiki software that is? The discussion sounds as if MediaWiki is the only alternative. There must be more, no?
But MediaWiki has also some other ways of doing math
That is a good point, but easy to deal with. If there ever is a MediaWiki version of the nLab (and there may never be), the way to deal with multiple math syntaxes is to just turn off everything except itex. Since the nLab is for nPeople, we do not need to accommodate existing Wikipedia material. Everything would be ported directly with itex, and any new material would be written with itex as usual.
I could be wrong, but if the day ever comes when you need to outsource the nLab server administration, e.g. Andrew wins the lottery and resigns, I imagine it would be much easier to find someone who could administer MediaWiki than Instiki.
I’m tempted to write an article on the “economics of mathematics” because you really do not have the luxury of resources to accommodate the perfect environment and will need to make compromises. You (a general “you” to everyone reading this) are probably not as aware as you should be of how lucky you are to have someone like Andrew around. Without him, this place would crash to the ground in a heartbeat.
PS: From what I understand from previous discussions with Jacques, it is a fairly massive project to clean up MediaWiki enough so that it can serve clean XHTML. The chances of it happening are very small unless there were a giant push from a lot of people, but I don’t get the feeling that will ever happen. So this conversation is probably moot anyway.
If someone were to develop an itex2mml plugin for Mediawiki, then the mathematics rendering we all love so much would also be available on Mediawiki in exactly the same form we all love
Eric, thank you for explaining some things of which I was kind of aware but had not really analysed properly. But you see, you are talking about importing our code into MediaWiki. But MediaWiki also has some other ways of doing math and so on. So to get back to LaTeX we would need to simultaneously allow the other ways as well. And other users, whom you seem to target by wanting to take the publicly popular way, would use them right away. This would make the code more uneven, less typed, harder to translate back. Finally, there is the concern of whether other wikis would be equally handy for a light local offline install. The problem with restarts will certainly be solved relatively short term. Long term, I still have not heard any advantage FOR US. I was arguing that being the same as Wikipedia etc. is not really a service to the world. Slight diversity will just push different systems to evolve…
Instiki is simply a wiki software (similar to Mediawiki) that is intended to be lightweight and easy to install and use.
I was wondering about this before: if one looks around the web for “Instiki”, one sees that it is advertised everywhere as the wiki software that is
easy to install
lightweight.
This is not quite the relevant selection criterion for the wiki software we really need, and is plausibly the source of all our troubles.
Googling for “Instiki scaling” produces very little (for instance this), and nothing indicating that it is supposed to scale well.
Zoran said:
Instiki is much closer to LaTeX, and Wikipedia does not aim to be easily transferable between article-LaTeX varieties and wiki mode. Given the permanence of LaTeX, instiki-like software, rather than Wikipedia’s, is the longer-term solution for us.
I suspect that we probably agree in principle, but there are some semantic issues.
First, Instiki on its own has absolutely nothing to do with LaTeX. Instiki is simply a wiki software (similar to Mediawiki) that is intended to be lightweight and easy to install and use.
To get the nice mathematics to appear on an Instiki wiki, you need the ultra cool itex2mml plugin that Jacques developed.
All the praise you are bestowing on Instiki is actually praise that should be bestowed on the itex2mml plugin.
It is itex2mml that provides the near-perfect (if not perfect) compatibility among the nCafe, the nLab, and the nForum. It is itex2mml, now available for Wordpress, that is seamlessly used as a backup when the nCafe is down for some reason or another.
If someone were to develop an itex2mml plugin for Mediawiki, then the mathematics rendering we all love so much would also be available on Mediawiki in exactly the same form we all love. In other words, the plugin for Mediawiki would provide near perfect (if not perfect) compatibility between the Instiki version of nLab, a Mediawiki version of nLab, the nCafe, a Wordpress version of the nCafe, and the nForum. They would all be 100% compatible in terms of the syntax and mathematics rendering. The only challenge would be to handle some of the non-mathematics functionality.
itex2mml is an excellent short-, medium-, and long-term solution. I don’t see any reason to ever consider migrating away from itex. It is perfectly suited for anything we want to do. The only thing that is obvious is that the Instiki wiki software is not a viable long-term solution (although, as I said, it is a fine short-to-medium-term solution). A basic requirement before you should consider a migration is that whatever you migrate to should support itex2mml. Currently, MediaWiki does not, so currently there is no reason to even consider MediaWiki as an alternative to Instiki. Only after an itex2mml plugin for MediaWiki becomes available will there be any viable alternative, so we should all be happy sticking with Instiki until that day arrives.
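For readers who have not watched it in action: itex2mml takes the LaTeX-like itex source between dollar delimiters and emits MathML, which a compliant browser renders natively. The output below is representative rather than byte-for-byte (the plugin's exact element and attribute choices may differ):

```
$\sin^2 x + \cos^2 x = 1$
```

becomes MathML along the lines of

```
<math xmlns="http://www.w3.org/1998/Math/MathML" display="inline">
  <msup><mi>sin</mi><mn>2</mn></msup><mi>x</mi>
  <mo>+</mo>
  <msup><mi>cos</mi><mn>2</mn></msup><mi>x</mi>
  <mo>=</mo>
  <mn>1</mn>
</math>
```

This is why the same source is portable across the nCafe, the nLab, and the nForum: the conversion step is the same everywhere, and a MediaWiki plugin would only have to invoke the same converter at the corresponding point in its pipeline.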
Wikipedia’s software has a syntax which is quite awkward in my opinion (I wrote some articles in both the English and the Croatian Wikipedia). Eric, TeX was written by Donald Knuth in the late 1970s and 1980s (with LaTeX built on top of it by Leslie Lamport). Consistent, good software lasts longer than the various “revolutionary” HTMLs which change all the time. Instiki is much closer to LaTeX, and Wikipedia does not aim to be easily transferable between article-LaTeX varieties and wiki mode. Given the permanence of LaTeX, instiki-like software, rather than Wikipedia’s, is the longer-term solution for us. For me it is much MORE important to be compatible with article LaTeX than with Wikipedia, which does not accept original material, is pretty sterile for research-level stuff, and keeps only published “documented” code.
The fact that MediaWiki is widespread does not mean it is any easier to maintain. I have a friend who has used several wiki softwares, and for her choosing which one to take is a matter of decision. In a few years, I am sure, Urs’s project will be advanced enough to get some money for support, and eventually it won’t be that dependent on Andrew’s time. With public source code etc. there is no need for a software to be popular in order to be usable. C++ is an extremely popular bug-processing language. ML (wikipedia: ML programming language) is a strongly typed language for which the debugging time for an experienced user is about 10% of the debugging time for C++ code. My university teacher in programming languages, C. Fischer, said: “If you write C++ code, you debug it. If you write it in ML, you use it.” So you see, ease and practicality of use have nothing to do with the proletariat’s popularity. If the drunkards on the street drink unhealthy fortified wine and only a few people drink herbal tea, that does not make fortified wine better. If a migration of that type happens, I will probably install a local copy in Instiki for myself, and split off from the project. Certainly that appeals to me more than having, for example, the Wikipedia-style display with ugly math symbols of unequal size, and lacking our specific and superb adaptations like floating tables of contents.
Finally, I do not think that having a single software option on the internet is good. The different strengths make the people developing parallel systems envy (in a positive sense) each other’s capabilities and improve them. It is also good to have our community recognizable and an independent “authority” (from Wikipedia, say). It is like asking that Bourbaki change its system of paragraph numeration and sectioning to be somehow compatible with the Encyclopaedia of Mathematics, or its layout with Springer’s LNM series. I like books to have distinguishable personalities. For uniformity there is army service; I had it for 12 months and that was enough. Whoever wants it, most countries accept volunteers. (I am exaggerating the metaphor just to make the point visible.)
P.S. On the other hand, having an iTeX plugin for MediaWiki is welcome, but I do not think it is a wise thing for the scarce community of Instiki users to spend time on. We have enough needs and problems of our own (e.g. backlink support between the nLab and the nForum, that is, searching for links in the nLab to a specific page in the nLab, including its aliases).
@ Toby #31:
I would say that is the only migration that would make sense. As long as Wikipedia exists (and is open - which is a risk if they IPO) there will be developer support for MediaWiki. Furthermore, I would say the migration is not only conceivable, but inevitable. Instiki is more than sufficient for the near to mid term so there is no rush. The key step that needs to occur before any migration can even begin being taken seriously is that this iTeX plugin for MediaWiki needs to be developed. If you were going to solicit technical help, I would probably focus on getting that plugin built. It would not only be a powerful contribution to the nLab, but also to Wikipedia and all of online mathematics.
At some point during the next several years, Andrew just might get busy enough with other things that he no longer wishes to dedicate significant portions of time to the administration of the nLab. If the nLab were a commercial enterprise, I would describe the situation as a significant operational risk. On the other hand, MediaWiki is so widespread that it would be relatively straightforward to outsource the wiki admin if needed.
However, no plugin = no migration.
@ Urs #25:
Someone could write an iTeX plugin for MediaWiki; that is conceivable.
We’d also have to change all of our markup from Instiki to MediaWiki; someone could conceivably write a script to do that.
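Such a script would mostly be a pile of small rewrite rules, since Instiki pages are Markdown (plus itex) while MediaWiki has its own wikitext. A sketch of one such rule, for ATX headings; this handles only this one construct and is certainly not the whole job:

```python
import re

def markdown_heading_to_mediawiki(line):
    """Rewrite a Markdown ATX heading ('## Idea') as a MediaWiki
    heading ('== Idea =='); leave every other line unchanged."""
    m = re.match(r'^(#{1,6})\s+(.*?)\s*#*\s*$', line)
    if not m:
        return line
    bars = "=" * len(m.group(1))
    return f"{bars} {m.group(2)} {bars}"
```

A real converter would need many more rules (emphasis, tables, redirects, the itex passthrough), but each would have roughly this shape: match one construct, emit its wikitext equivalent, pass everything else through untouched.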
Mike, I guessed as much. I only brought it up to point out that if anyone else is thinking of setting up a mathematical blog and likes using iTeX then We Have the Technology.
Well, I didn’t get to choose where to host the HoTT web site. (-: I only brought it up to point out that iTeX is better than other options out there.
For one thing, if you want to be sure that you can point someone to a page on the nLab without worrying about whether or not the nLab is “live”, […]
Yes. I think this is important.
[…] we can have a static copy that is updated once a day from the live version.
Okay, that sounds good!
Mike: Yes (which partly answers Urs’ question: to get Wordpress to work with iTeX, I had to do some modification of the core; not a lot, but I’m not convinced that I’ve done it all). On the other hand, you can get a blog with full iTeX and MathML support right here.
Urs: Technically, it’s possible. I’ve never said that it wasn’t. But (at the moment), MathML requires proper XHTML compliance and most software is a bit slack in that. I had to hack Wordpress to make it XHTML compliant. Fortunately for this place, Vanilla was already compliant so I didn’t have much to do to adapt this place - but even then, you may remember that I had a test site up for a while and a few very helpful people helped me to debug it.
But first, as I said, there’s a lot of things to try. For one thing, if you want to be sure that you can point someone to a page on the nLab without worrying about whether or not the nLab is “live”, we can have a static copy that is updated once a day from the live version.
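As a rough sketch of what such a daily static copy could look like (the paths, schedule, and entry URL below are illustrative; the real setup would be whatever Andrew prefers), a small script driving wget's mirroring mode, invoked nightly from cron:

```
#!/bin/sh
# nightly-mirror.sh: refresh a static mirror of the nLab (illustrative paths).
# Run from cron, e.g.:  0 3 * * * /usr/local/bin/nightly-mirror.sh
wget --mirror --convert-links --adjust-extension \
     --directory-prefix=/var/www/nlab-static \
     https://ncatlab.org/nlab/show/HomePage
```

Readers would then always have a browsable (if up to a day stale) copy to point people at, independent of whether the live Instiki process happens to be up.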
@Andrew: But that would only be with a custom local install of Wordpress, not with a hosted site, right?
Mike, you can always use iTeX in Wordpress!
So even if we don’t migrate, can we maybe still try to answer my question that started this thread:
Is it in principle conceivable that we can migrate the source-code content of the nLab to a different wiki software? Is that technically possible?
It seems not inconceivable – in principle – to migrate the nLab content, with the math typesetting that we are used to, to another wiki environment. If it works in Wordpress, can’t one make it also work in, whatever it’s called, MediaWiki or such?
Mike, you can always use iTeX in Wordpress!