    • CommentRowNumber1.
    • CommentAuthorUrs
    • CommentTimeApr 17th 2011

    What are our perspectives and options?

    Is it in principle conceivable that we can migrate the source-code content of the nLab to a different wiki software? Is that technically possible?

    • CommentRowNumber2.
    • CommentAuthorEric
    • CommentTimeApr 17th 2011

    The most obvious candidate for any kind of migration would be MediaWiki. I wonder if anyone has made progress with itex and MediaWiki since the last time I tried?

    • CommentRowNumber3.
    • CommentAuthorAndrew Stacey
    • CommentTimeApr 17th 2011

    Why?

    • CommentRowNumber4.
    • CommentAuthorFinnLawler
    • CommentTimeApr 17th 2011

    Well, one problem is reliability – every so often the nLab stops responding and has to be manually restarted, which is clearly a Bad Thing. Another is the question of input formats, which will become more important if and when the nJournal gets going. I know Instiki and Jacques have been very good to us, but as the Lab grows and expands we seem to be running into more of Instiki’s limitations, so it’s certainly not a bad idea to consider other options.

    The problem I’d have with MediaWiki is that its maths support seems to be quite poor, which for a maths wiki is a pretty big stumbling block. One alternative might be Gitit, which is a wiki engine written in Haskell that uses either Git or Darcs as its version control system. It uses Pandoc for markup processing, which means that it supports Markdown and LaTeX (among others) out of the box. And it has what looks like a very nice plugin system that could help with our XYPic/TikZ diagram woes.

    • CommentRowNumber5.
    • CommentAuthorAndrew Stacey
    • CommentTimeApr 17th 2011

    The problem with reliability is nothing to do with the software and everything to do with the fact that our technical support consists of one person who doesn’t really know what he’s doing and has other things to do as well.

    From what I’ve read about pandoc, it is basically another markdown+some-subset-of-latex-for-mathematics. So I’m not sure what problems it would solve. The key line on the pandoc homepage is (emphasis mine):

    Pandoc can read markdown and (subsets of) reStructuredText, textile, HTML, and LaTeX,

    When I read pages about projects that use pandoc, such as gitit’s, they seem to miss those two key words:

    Pandoc is used for markup processing, so pages may be written in (extended) markdown, reStructuredText, LaTeX, HTML, or literate Haskell,

    The real problem with xypic/tikz is figuring out what the output should be. There isn’t yet a good marriage between SVG and MathML, so combining a picture with text is, with current technology, not reliable. Of course, one could always convert to an image, but that is not a good solution. For pure diagrams, I’ve been using the tikz->SVG conversion from the tex4ht suite and am very happy with the results.

    I’d prefer to know what these “limitations” are and consider how to overcome them before talking of something as major as migrating the software.

    • CommentRowNumber6.
    • CommentAuthorFinnLawler
    • CommentTimeApr 17th 2011

    I’m not saying (and I’m sure Urs and Eric don’t mean this either) that there’s an immediate and pressing need for such a migration, just that it’s worth thinking about alongside any other future plans.

    … our technical support consists of one person who doesn’t really know what he’s doing and has other things to do as well.

    That’s part of the point, I think – as deeply appreciated as your efforts are, the fact is that none of us is a technical expert, and it may happen that at some point in the future the benefits of migration will outweigh the costs. So it couldn’t hurt for interested parties to experiment with different setups.

    • CommentRowNumber7.
    • CommentAuthorAndrew Stacey
    • CommentTimeApr 17th 2011

    I presume that Urs had a reason for digging up this issue, and I’m waiting to hear what it is. I shan’t comment on what Eric said.

    As to the rest, well, that’s my point. If the current system can only produce one lame-brained techie, what hope do we have that a proposal for migration is going to suddenly produce more experts out of the woodwork? And if it is, why aren’t they helping out now? There is no other system that is set up to properly present mathematics on the web, so any target of a migration would need customisation, and then we’re back to the basic problem: tech support by someone who makes it up as he goes along.

    (I just took a closer look at pandoc. Sadly, it suffers from the “all things to all men” syndrome. In particular, at first sight it looks vulnerable to the \newcommand{\a}{\a}\a attack.)
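    (To spell the attack out: that definition makes \a expand to itself, so a naive expander never bottoms out. A toy sketch in Python of the failure mode - my guess at the shape of the problem, certainly not pandoc’s actual code:)

        # Toy token-based macro expander; a self-referential macro recurses forever.
        def expand(tokens, macros, depth=0, max_depth=100):
            if depth > max_depth:
                raise RecursionError("runaway macro expansion")
            out = []
            for tok in tokens:
                if tok in macros:
                    # Expanding a macro re-scans its body, so \a -> \a never terminates.
                    out.extend(expand(macros[tok], macros, depth + 1, max_depth))
                else:
                    out.append(tok)
            return out

        # \newcommand{\a}{\a}\a : raises RecursionError instead of terminating.
        expand([r"\a"], {r"\a": [r"\a"]})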

    I’m sorry if I sound a little … irritated. I am, but I don’t mean to take it out on anyone in particular.

    • CommentRowNumber8.
    • CommentAuthorUrs
    • CommentTimeApr 18th 2011

    I fear the process of migration and would rather we didn’t. But every now and then I feel unable to ignore my frustration with the software that holds so many hours of my work and yours. Yesterday was such a moment. I got up Saturday morning with two quiet hours to do some work on the Lab, but I ended up spending one of them fiddling with an entry that refused to save.

    I don’t mean the host of small bugs, but the fact that the software fails to reliably perform the most basic task of a wiki: saving and displaying content. I am speaking of two glaring problems:

    1. The server goes down about 1.5 times per day, and lately it is apparently mostly me who restarts it. That means I cannot with a straight face point people to nLab pages. I should rather say: wait until I have restarted the server, then maybe you can see the entry.

    2. The more I care about a page, the less reliably it saves and displays: as pages grow, the time it takes to save and display them quickly rises to the order of minutes. And as they grow further, the pages eventually fail to save entirely; one has to break them up to bring them back to save-within-minutes status.

    That is frustrating. It is using up loads of precious time, and has been doing so for years. I can’t quite imagine that other math wikis, such as the Manifold Atlas, have a similar problem. And is this really about the number of technicians available in the background? Is it not just a question of the stability of the software?

    On the other hand, I am glad we have this software at all and fear the trouble of switching. Today is a new, brighter day, and I am feeling better about all this. Maybe today I would not have started this thread.

    • CommentRowNumber9.
    • CommentAuthorDavid_Corfield
    • CommentTimeApr 18th 2011

    Would it be worth looking for funding for technical support? Is it that a technical wizard could do some intensive work so that the wiki would be easier to manage for some time, or would less concentrated, continuing support be ideal?

    • CommentRowNumber10.
    • CommentAuthorUrs
    • CommentTimeApr 18th 2011

    My impression is that the problem is not one of support or maintenance. Andrew has been doing an awesome job at this. Rather, the problem seems to be that somebody would need to dive into the code and do some genuine re-programming. Apparently the software was not designed to scale to anything near the size at which we are using it (both the size of single pages and the overall size and memory usage, which has been a major headache from the very beginning).

    • CommentRowNumber11.
    • CommentAuthorUrs
    • CommentTimeApr 18th 2011
    • (edited Apr 18th 2011)

    To give an example, which I just went through, keeping an eye on the clock for comparison:

    I wanted to upload the latest version of my writeup to my personal web.

    1. I go to my page. After a minute of waiting for the page to build up, I suspect that something may be going on and try to load the nLab’s HomePage. It doesn’t load. Aha, so the server is down.

    2. I fire up PuTTY, enter server names, logins, and passphrases, wait for the server to respond, then call the restart command. Another minute.

    3. Back to my page. I edit it to set the new filename, then save it (because that’s the procedure for making files upload). I wait roughly 45 seconds for the page to display again.

    4. Now I have access to the upload menu. I enter the data and hit submit. The file is uploaded (I can see this in my network’s activity), and then there is again the long pause while the page is redisplayed, now with the new link to the file. This time it takes well over a minute.

    5. But I am not done yet. There is a bug whereby, whenever a file is uploaded and the page redisplayed, one needs to make some minor edit to the page and save yet once more, since otherwise the link to the file will appear blank and broken when it is next called. So I go to the edit page again, enter a blank line, and hit save once more. After another minute, the page comes back.

    Et voilà, I have uploaded one file. All in all it kept me busy for about 7 minutes (well, I have learned to do other things in parallel while waiting for pages to appear).

    I don’t want to complain. I can live with this, and have done for a long time. But yesterday, when for the second time another page would not display even after many minutes, I came to think: is there an alternative?

    • CommentRowNumber12.
    • CommentAuthorAndrew Stacey
    • CommentTimeApr 18th 2011

    I was not aware that the restarting issue was so prevalent. Perhaps we need a thread where we can record how often it has to be restarted. However, whilst I was unaware of the scale of the problem, I’m quite aware of the fact of it - I’ve just been too busy to spend any significant time fixing it. But our semester has just finished, so I can take some time to fix it (though it is Easter week and I’m having a week off).

    The comparison with the Manifold Atlas is completely ridiculous. They have fewer than 100 pages (I just counted). Wikipedia might be a better comparison (though there the scaling goes in the opposite direction!), but if you look at the average Wikipedia page you will see far fewer links per page than you, Urs, are wont to include (one of the major slow-downs when editing a page), and far fewer edits per page (and it is the regeneration of pages that is the slowest part; if a page is stable then it gets cached and is lightning-fast). Lastly, Wikipedia pages get split into sections that are included into a main page. There is absolutely no reason why we couldn’t do that.

    (On a side issue, I’m not convinced that the process you describe for uploading a new version of a file is the best method to use there! But that’s by-the-by.)

    • CommentRowNumber13.
    • CommentAuthorUrs
    • CommentTimeApr 18th 2011
    • (edited Apr 18th 2011)

    just been too busy to spend any significant time fixing it.

    Would you know how to fix it? Last time you said you would like to fix it but didn’t know what was causing the problem.

    The comparison with the Manifold Atlas is completely ridiculous.

    Andrew, I find this use of language quite curious. This is not the first time that I have been ridiculed for voicing my impression of the nLab software (last time not by you), and that it happens does not ease my concerns. You have done an awesome job with the software, but I think openly discussing and admitting problems that are still there does more good than trying to downplay bugs.

    Apart from that, notice that we have had the problem with restarts from the get-go. The very evening after I had initially set up the nLab, I went home after a train ride and found the nLab down. That was the first time I had to restart it - with fewer than 5 pages there. I do doubt that many other wikis out there are so shaky, because I doubt that people would use them if they were.

    On a side issue, I’m not convinced that the process you describe for uploading a new version of a file is the best method to use there!

    What I describe is the procedure given on the HowTo page. Please let us know of whatever improvement on it you are aware of.

    • CommentRowNumber14.
    • CommentAuthorUrs
    • CommentTimeApr 18th 2011
    • (edited Apr 18th 2011)

    but if you look at the average Wikipedia page you will see far fewer links per page than you, Urs, are wont to include (one of the major slow-downs when editing a page)

    Somewhere in here the problem is hidden: what is it that stops a modern computer from dealing with a million links per page? What is it that the nLab software tries to do with the links on a page - for five minutes and more, on the last page I saved? The software must be trying to do something that it should not be doing.

    And, to steer away from the insinuation that I alone am doing something that just so happens to be unheard of for wiki software: let’s think of the nJournal application that we were envisioning. I think all the problems that I am pointing out are serious problems for running any nJournal, should we ever do it.

    • CommentRowNumber15.
    • CommentAuthorAndrew Stacey
    • CommentTimeApr 18th 2011

    The restart stuff is a problem, but because I don’t know what is causing it, I don’t know a quick fix. It will take me a little time to track down exactly what the problem is, and that is time that I haven’t had up to now due to my day job (!) - when it’s semester time here I have to deal with stuff in “bite sized” chunks and this is more than “bite sized” (and the gap between the two semesters is non-existent over Christmas, so I didn’t get anything done then). There are a few things that I need to do, starting with making sure that all the software is up to date and closely monitoring the system. I have a few suspicions as to where the problem might lie and need to set things up to monitor each possibility. Again, that takes time. But I’d rather spend the time doing that than rewriting MediaWiki!

    I apologise for the word “ridiculous”. The intent behind it was not to ridicule you, but to point out that the Manifold Atlas is nothing compared to us! If it’s any consolation, I just loaded up a MathOverflow page and was confronted yet again with a MathJax error message - basically, it was taking too long to process the page and my browser wanted to know if I really wanted to continue. Putting mathematics on the internet is still in its infancy, and every system for doing so has its issues. That you don’t see any with the Manifold Atlas is, I think, down to three things: 1) they are a fraction of our size; 2) you go there as a reader, not an author, so you encounter different aspects of the site; 3) you spend far more time here than there, so you notice things a lot more here.

    The issue with the links per page could probably be vastly improved. As far as I understand it, whenever you write [[a page name]], Instiki has to check through the list of pages and redirects to see whether a page name names an existing page or not. If the page has a lot of links, it does that for each link. As we have about 5000 pages, that’s a lot of checks to be done. But perhaps it does not do this in the most efficient manner - that’s something to talk to Jacques about. So the problem is not with having lots of links; it is that the manner in which the links are rendered depends on run-time information.
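    (Schematically, the pattern I have in mind looks like this - a guess in Python at the shape of the problem, not Instiki’s actual code; the table names are made up:)

        import sqlite3

        conn = sqlite3.connect("wiki.db")  # hypothetical schema: pages(name), redirects(alias)

        def link_exists(name):
            # One database round trip per wikilink on the page.
            row = conn.execute(
                "SELECT 1 FROM pages WHERE name = ? "
                "UNION SELECT 1 FROM redirects WHERE alias = ?",
                (name, name)).fetchone()
            return row is not None

        def classify_links(links):
            # A page with 500 wikilinks costs 500 separate queries.
            return {name: link_exists(name) for name in links}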

    Lastly (in this comment) let me wholeheartedly agree with your statement:

    I think openly discussing and admitting problems that are still there does more good than trying to downplay bugs.

    What irritated me most about your initial post in this thread was that there was none of that. It is only in these last comments that we have gotten to the actual problems involved.

    • CommentRowNumber16.
    • CommentAuthorUrs
    • CommentTimeApr 18th 2011

    What irritated me most about your initial post in this thread was that there was none of that.

    Okay, I should have added more context. The problems had been announced repeatedly, here and there, in other threads. Then at some point I just wanted to know: is migration an option at all? In particular since I know well that you are busy and have other things to do than carry out a major software fix. Given this, it seems like an obvious question: could we migrate the material, if we wanted to, to a more stable system - one that works for people who have other day jobs to care about!

    • CommentRowNumber17.
    • CommentAuthorzskoda
    • CommentTimeApr 19th 2011

    I often get errors with MathJax even on MathReviews, which often displays some dots instead of an inline formula (roughly once per 20-30 pages viewed), for example. I prefer Instiki, and I think it has a good future. We should stay with Instiki, I think. I believe Andrew will eventually find what the problem with the restarts is. It is good that the scale of the problem has been raised!

    • CommentRowNumber18.
    • CommentAuthorAndrew Stacey
    • CommentTimeApr 19th 2011

    Urs, having been busy, I haven’t been aware of exactly how many times the Lab has needed restarting, and scattering the announcements around the forum hasn’t made me aware of it. My failing, perhaps. But just because something is clear in your mind doesn’t mean that it is clear in mine! That is something you know very well when it comes to mathematics (!), but it applies elsewhere as well.

    I’d rather fix the problems we have than transfer to a new system with a whole new set of problems. Of course, if we can’t fix them then that’s something else, but I’d rather try first. So if someone has the time and energy to put in to considering migration, I’d like to ask them to first try to help us fix the current set of problems!

    • CommentRowNumber19.
    • CommentAuthorUrs
    • CommentTimeApr 19th 2011

    Okay, sounds reasonable.

    Then to come back to David C.’s suggestion further above: can anyone see any chance that, in one way or another, we find third-party persons willing to look into fixing the problems? Could one hire a programmer to look into this for due payment?

    • CommentRowNumber20.
    • CommentAuthorAndrew Stacey
    • CommentTimeApr 19th 2011

    As I said, now that my semester is over I shall take some time to look into this. Some of the software we’re running needs updating, but to do that properly I need to take the Lab down for a bit, and I need to know that I have a few hours clear to install the new versions of the software.

    Once I’ve done that, I’ll start investigating further. I have a suspicion that the problem stems from the logging. We have a particular method of logging, but if I decrease the level of logging then maybe we can get away with the old system - at least to discover if that really is the source of the problem.

    Plus I have a host of other things that I want to sort out that will make running this place much easier: basically, automatic notifications of when things go wrong. Now that we’ve been at the new server for a year and a half, I’m much more aware of what needs doing.

    • CommentRowNumber21.
    • CommentAuthorTobyBartels
    • CommentTimeApr 20th 2011

    The math rendering in Instiki (using iTeX to MathML) is far and away better than anything else that I encounter on the Web. Overall, MediaWiki (or something else) may be better software (I can see arguments both ways), but I would want to keep something that uses iTeX to MathML. At the moment, among wiki software, that seems to mean only Instiki.

    • CommentRowNumber22.
    • CommentAuthorMike Shulman
    • CommentTimeApr 21st 2011

    @Toby #21: I agree. Not just in terms of how it looks and renders, but the input syntax of iTeX is better than much of what’s out there. I don’t know what MediaWiki has, but I’ve also been writing on WordPress recently (the HoTT blog) and it irks me having to write $latex ...$ there instead of just $...$.

    • CommentRowNumber23.
    • CommentAuthorTobyBartels
    • CommentTimeApr 21st 2011

    In MediaWiki, you write <math></math>.

    • CommentRowNumber24.
    • CommentAuthorAndrew Stacey
    • CommentTimeApr 21st 2011
    • (edited Apr 21st 2011)

    Mike, you can always use iTeX in Wordpress!

    http://www.math.ntnu.no/~stacey/Vanilla/WPMathML/

    • CommentRowNumber25.
    • CommentAuthorUrs
    • CommentTimeApr 21st 2011

    Mike, you can always use iTeX in Wordpress!

    So even if we don’t migrate, can we maybe still try to answer my question that started this thread:

    Is it in principle conceivable that we can migrate the source-code content of the nLab to a different wiki software? Is that technically possible?

    It seems not inconceivable – in principle – to migrate the nLab content, with the math typesetting that we are used to, to another wiki environment. If it works in Wordpress, can’t one make it also work in, whatever it’s called, MediaWiki or such?

    • CommentRowNumber26.
    • CommentAuthorMike Shulman
    • CommentTimeApr 21st 2011

    @Andrew: But that would only be with a custom local install of Wordpress, not with a hosted site, right?

    • CommentRowNumber27.
    • CommentAuthorAndrew Stacey
    • CommentTimeApr 21st 2011

    Mike: yes (which partly answers Urs’s question: to get Wordpress to work with iTeX, I had to do some modification of the core - not a lot, but I’m not convinced that I’ve done it all). On the other hand, you can get a blog with full iTeX and MathML support right here.

    Urs: Technically, it’s possible. I’ve never said that it wasn’t. But (at the moment) MathML requires proper XHTML compliance, and most software is a bit slack about that. I had to hack Wordpress to make it XHTML compliant. Fortunately, Vanilla was already compliant, so I didn’t have much to do to adapt this forum - but even then, you may remember that I had a test site up for a while and a few very helpful people helped me to debug it.

    But first, as I said, there are a lot of things to try. For one thing, if you want to be sure that you can point someone to a page on the nLab without worrying about whether or not the nLab is “live”, we can have a static copy that is updated once a day from the live version.
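    (A minimal sketch of what I mean by a static copy - a daily job that just saves the rendered HTML; the base URL and page list here are placeholders, not the real setup:)

        import pathlib, urllib.parse, urllib.request

        BASE = "http://ncatlab.org/nlab/show/"   # placeholder
        PAGES = ["HomePage", "category theory"]  # in practice, read from the page list

        def snapshot(dest="static-nlab"):
            # Save each rendered page as a plain file; serve the directory statically.
            out = pathlib.Path(dest)
            out.mkdir(exist_ok=True)
            for page in PAGES:
                url = BASE + urllib.parse.quote(page)
                html = urllib.request.urlopen(url, timeout=30).read()
                (out / (page.replace(" ", "+") + ".html")).write_bytes(html)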

    • CommentRowNumber28.
    • CommentAuthorUrs
    • CommentTimeApr 21st 2011

    For one thing, if you want to be sure that you can point someone to a page on the nLab without worrying about whether or not the nLab is “live”, […]

    Yes. I think this is important.

    […] we can have a static copy that is updated once a day from the live version.

    Okay, that sounds good!

    • CommentRowNumber29.
    • CommentAuthorMike Shulman
    • CommentTimeApr 21st 2011

    Well, I didn’t get to choose where to host the HoTT web site. (-: I only brought it up to point out that iTeX is better than other options out there.

    • CommentRowNumber30.
    • CommentAuthorAndrew Stacey
    • CommentTimeApr 21st 2011

    Mike, I guessed as much. I only brought it up to point out that if anyone else is thinking of setting up a mathematical blog and likes using iTeX then We Have the Technology.

    • CommentRowNumber31.
    • CommentAuthorTobyBartels
    • CommentTimeApr 22nd 2011

    @ Urs #25:

    Someone could write an iTeX plugin for MediaWiki; that is conceivable.

    We’d also have to change all of our markup from Instiki to MediaWiki; someone could conceivably write a script to do that.

    • CommentRowNumber32.
    • CommentAuthorEric
    • CommentTimeApr 23rd 2011

    @ Toby #31:

    I would say that is the only migration that would make sense. As long as Wikipedia exists (and stays open - which is at risk if they IPO) there will be developer support for MediaWiki. Furthermore, I would say the migration is not only conceivable but inevitable. Instiki is more than sufficient for the near to mid term, so there is no rush. The key step before any migration can be taken seriously is for this iTeX plugin for MediaWiki to be developed. If you were going to solicit technical help, I would probably focus on getting that plugin built. It would be a powerful contribution not only to the nLab, but also to Wikipedia and all of online mathematics.

    At some point during the next several years, Andrew just might get busy enough with other things that he no longer wishes to dedicate significant portions of time to the administration of the nLab. If the nLab were a commercial enterprise, I would describe the situation as a significant operational risk. On the other hand, MediaWiki is so widespread that it would be relatively straightforward to outsource the wiki admin if needed.

    However, no plugin = no migration.

    • CommentRowNumber33.
    • CommentAuthorzskoda
    • CommentTimeApr 23rd 2011
    • (edited Apr 23rd 2011)

    Wikipedia software has a syntax which is quite awkward in my opinion (I have written articles in both the English and the Croatian Wikipedia). Eric, TeX was written by Donald Knuth in the 1970s, and LaTeX by Leslie Lamport in the 1980s. Consistently good software is longer-term than the various “revolutionary” HTML dialects, which change all the time. Instiki is much closer to LaTeX, and Wikipedia does not aim to be easily transferable between article LaTeX and wiki mode. Given the permanence of LaTeX, it is Instiki-like software, not Wikipedia’s, that is the longer-term solution for us. For me it is much MORE important to be compatible with article LaTeX than with Wikipedia, which does not accept original material, is pretty sterile for research-level material, and keeps only published, “documented” content.

    The fact that MediaWiki is widespread does not mean it is any easier to maintain. I have a friend who has used several wiki packages, and for her, choosing which one to take is simply a matter of decision. In a few years, I am sure, Urs’s project will be advanced enough to get some money for support, and eventually it won’t be so dependent on Andrew’s time. With public source code etc. there is no need for software to be popular in order to be usable. C++ is an extremely popular bug-processing language. ML (Wikipedia: ML programming language) is a strongly typed language for which the debugging time for an experienced user is about 10% of the debugging time for C++ code. My university teacher in programming languages, C. Fischer, said: “If you write C++ code, you debug it. If you write it in ML, you use it.” So you see, ease and practicality of use have nothing to do with proletarian popularity. If the drinkers on the street drink unhealthy fortified wine and only a few people drink herbal tea, that does not make fortified wine better. If a migration of that type happens, I will probably install a local Instiki copy for myself and split off from the project. Certainly that appeals to me more than having, for example, Wikipedia’s display with ugly math symbols of unequal size, and without our specific and superb adaptations like floating tables of contents.

    Finally, I do not think that having a single software option on the internet is good. Different strengths make the people developing parallel systems envy (in a positive sense) each other’s capabilities and improve their own. It is also good for our community to be recognizable and an independent “authority” (independent from Wikipedia, say). It is like asking Bourbaki to change its system of paragraph numeration and sectioning to be somehow compatible with the Mathematics Encyclopaedia, or its layout with Springer’s LNM series. I like books to have distinguishable personalities. For uniformity there is army service; I had it for 12 months and that was enough. For whoever wants it, most countries accept volunteers. (I am stretching the metaphor, just to make it visible.)

    P.S. On the other hand, an iTeX plugin for MediaWiki is welcome, but I do not think it is a wise thing for the scarce community of Instiki users to spend time on. We have enough needs and problems of our own (e.g. backlink support between the nLab and the nForum, that is, searching for links in the nLab to a specific page in the nLab, including its aliases).

    • CommentRowNumber34.
    • CommentAuthorEric
    • CommentTimeApr 23rd 2011

    zoran said:

    Instiki is much closer to LaTeX, and Wikipedia does not aim to be easily transferable between article LaTeX and wiki mode. Given the permanence of LaTeX, it is Instiki-like software, not Wikipedia’s, that is the longer-term solution for us.

    I suspect that we probably agree in principle, but there are some semantic issues.

    First, Instiki on its own has absolutely nothing to do with LaTeX. Instiki is simply wiki software (similar to MediaWiki) that is intended to be lightweight and easy to install and use.

    To get the nice mathematics to appear on an Instiki wiki, you need the ultra cool itex2mml plugin that Jacques developed.

    All the praise you are bestowing on Instiki is actually praise that should be bestowed on the itex2mml plugin.

    It is itex2mml that provides the near-perfect (if not perfect) compatibility among the nCafe, the nLab, and the nForum. It is itex2mml, now available for Wordpress, that is seamlessly used as a backup when the nCafe is down for one reason or another.

    If someone were to develop an itex2mml plugin for Mediawiki, then the mathematics rendering we all love so much would also be available on Mediawiki in exactly the same form we all love. In other words, the plugin for Mediawiki would provide near perfect (if not perfect) compatibility between the Instiki version of nLab, a Mediawiki version of nLab, the nCafe, a Wordpress version of the nCafe, and the nForum. They would all be 100% compatible in terms of the syntax and mathematics rendering. The only challenge would be to handle some of the non-mathematics functionality.

    itex2mml is an excellent short-, medium-, and long-term solution. I don’t see any reason ever to consider migrating away from itex. It is perfectly suited to anything we want to do. The only thing that is obvious is that the Instiki wiki software is not a viable long-term solution (although, as I said, it is a fine short- to medium-term solution). A basic requirement before you should consider a migration is that whatever you migrate to should support itex2mml. Currently, MediaWiki does not, so currently there is no reason even to consider MediaWiki as an alternative to Instiki. Only after an itex2mml plugin for MediaWiki becomes available will there be any viable alternative, so we should all be happy sticking with Instiki until that day arrives.

    • CommentRowNumber35.
    • CommentAuthorUrs
    • CommentTimeApr 23rd 2011
    • (edited Apr 23rd 2011)

    Instiki is simply a wiki software (similar to Mediawiki) that is intended to be lightweight and easy to install and use.

    I was wondering about this before: if one looks around the web for “Instiki”, one sees that it is everywhere advertised as being the wiki software that is

    1. easy to install

    2. lightweight.

    These are not quite the relevant selection criteria for the wiki software we really need, and are plausibly the source of all our troubles.

    Googling for “Instiki scaling” produces very little (for instance this), and nothing indicating that it is supposed to scale well.

    • CommentRowNumber36.
    • CommentAuthorzskoda
    • CommentTimeApr 23rd 2011

    If someone were to develop an itex2mml plugin for Mediawiki, then the mathematics rendering we all love so much would also be available on Mediawiki in exactly the same form we all love

    Eric, thank you for explaining some things of which I was vaguely aware but had not really analysed properly. But you see, you are talking about importing our code into MediaWiki. MediaWiki also has some other ways of doing math and so on, so to keep close to LaTeX we would need simultaneously to allow those other ways as well. And the other users whom you seem to target by taking the publicly popular route would use them right away. This would make the code more uneven, less typed, and harder to translate back. Finally, there is the concern of whether other wikis would be equally handy for a light local offline install. The problem with restarts will certainly be solved relatively soon. Long term, I have still not heard of any advantage FOR US. I was arguing that being the same as Wikipedia etc. is not really a service to the world. Slight diversity will just push different systems to evolve…

    • CommentRowNumber37.
    • CommentAuthorEric
    • CommentTimeApr 23rd 2011

    MediaWiki also has some other ways of doing math

    That is a good point, but easy to deal with. If there ever is a Mediawiki version of the nLab (and there may never be), the way to deal with multiple maths is to just turn off everything except itex. Since the nLab is for nPeople, we do not need to accommodate existing Wikipedia material. Everything would be directly ported with itex and any new material would be done with itex as usual.

    I could be wrong, but if the day ever comes when you need to outsource the nLab server administration, e.g. Andrew wins the lottery and resigns, I imagine it would be much easier to find someone who could administer MediaWiki than Instiki.

    I’m tempted to write an article on the “economics of mathematics” because you really do not have the luxury of resources to accommodate the perfect environment and will need to make compromises. You (a general “you” to everyone reading this) are probably not as aware as you should be of how lucky you are to have someone like Andrew around. Without him, this place would crash to the ground in a heartbeat.

    PS: From what I understand from previous discussions with Jacques, it is a fairly massive project to clean up Mediawiki enough so that it can serve clean XHTML. The chances of it happening are very small unless there was a giant push from a lot of people, but I don’t get the feeling that will ever happen. So this conversation is probably moot anyway.

    • CommentRowNumber38.
    • CommentAuthorUrs
    • CommentTimeApr 23rd 2011

    Is that the decisive feature of Instiki, that it is properly XHTML compliant? Is there any other wiki software that is? The discussion sounds as if MediaWiki is the only alternative. There must be more, no?

    • CommentRowNumber39.
    • CommentAuthorAndrew Stacey
    • CommentTimeApr 23rd 2011

    To have MathML, you need full XHTML compliance - at least at present; when HTML5 comes along it should be easier to mix stuff in (not that I regard that as necessarily a Good Thing: being compliant with some standard is naturally to be desired). So MediaWiki would need adapting to be compliant, and I have no idea how easy that would be. One would need to adapt the itex stuff into a suitable plugin. The tricky part of that is actually already done, since MediaWiki uses PHP as does this forum, so a MediaWiki plugin would be an adaptation of this forum’s stuff.

    However, the issue is not the feasibility of the migration, but rather everything else. Wikipedia doesn’t scale because it’s based on MediaWiki; it scales because they have a vast array of servers with load balancing and a load of other stuff. I’m sure it’s possible to have a MediaWiki installation that is a load of rubbish, just as it is possible to have an Instiki installation that is not.

    The decisive feature of Instiki is that the version we use was developed precisely for the situation we are using it in: putting serious mathematical content on the web. All the bits of Instiki that we actually like would have to be replicated in MediaWiki, because MediaWiki is not designed for this purpose. The same goes for just about every other wiki software out there. The fact that Jacques chose Instiki means that it is right for us. One thing we would have to contend with in MediaWiki may be that we would have a load of additional stuff that we don’t want and never use, but which is built into the core and isn’t easy to get rid of.

    I don’t know about all the issues with scaling, but from what I’ve read there’s a lot of misinformation out there. Ruby on Rails is relatively young, so a lot of the issues have not been definitively resolved, but a lot of the supposed issues are actually non-issues. An example of a RoR application that seems to be scaling fairly well is Twitter.

    (I agree with Eric that there is a lot of confusion in this thread between the different pieces of the puzzle.)

    • CommentRowNumber40.
    • CommentAuthorUrs
    • CommentTimeApr 23rd 2011
    • (edited Apr 23rd 2011)

    Wikipedia doesn’t scale because it’s based on MediaWiki; it scales because they have a vast array of servers with load balancing and a load of other stuff.

    Is that really relevant to the scaling problems that we have?

    A page that is getting a little longer takes minutes to save on the nLab, or at some point fails to save entirely. If I save a document of the same size on my web-book, or even send it around by email, it takes an unnoticeable amount of time. So if this fails to work on our present server, I think it is a problem with the software, and no size of server array will help with it.

    I mean, the nLab has grown a little larger than a personal notepad, but it is still tiny. The number of users is tiny. The number of page requests is probably tiny. So if we have scaling problems, it is not the kind of scaling that Wikipedia has to deal with, I’d think.

    • CommentRowNumber41.
    • CommentAuthorAndrew Stacey
    • CommentTimeApr 23rd 2011

    But the pages that take a long time to save are those with an enormous number of wikilinks in them. When you save such a page on your web-book, it doesn’t have to cross-check each of those links against 5000 entries in a database. Similarly when you send it by email. I would be amazed if doing the same on MediaWiki was any faster.

    That said, as I’ve said before, there may be ways to speed that up. But that’s something you should take up with Jacques, not me.

    When you talk about “scaling”, the usual meaning of that is number of users and pages, not the lengths of individual pages!

    • CommentRowNumber42.
    • CommentAuthorzskoda
    • CommentTimeApr 23rd 2011
    • (edited Apr 23rd 2011)

    Re #41 (Andrew) and #40 (Urs):

    The checking of links etc. after saving is needed, in my understanding, just in order to regenerate the page for display. And 5000 entries is not really 5000 but many more, given the huge number of page aliases.

    1. I could imagine that usually the difference between the old page and the new page is just one paragraph, so why not use this fact when checking? I could imagine that if I work on a large page incrementally, the system re-checks just the changed links (see the sketch after this list). Of course, the rest of the system could have changed in the meantime, so it may appear incorrect to do that, but I think this sometimes happens with the current system as well: I change a page by adding a redirect or something, and occasionally another page with a link to the now-changed location does not appear correctly on reloading (but such incorrect behaviour gets corrected very soon when it happens).

    2. I could imagine an easier hack, where one introduces a fast-work option, so that on saving, all nLab links are presented as existing on the immediate reload, without checking the database. (I could imagine an even more radical option: saving without loading the page back at all, just getting a confirmation that the page was received - useful when running for a bus, but less useful in general.) For incremental work, one could save with the usual checked-links redisplay just once per 10 or 50 updates, depending on working style.

    In any case, checking everything every time when doing incremental work may be suboptimal.
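    (A toy illustration in Python of the idea in 1 - just the diff of link sets, not a patch against Instiki:)

        import re

        WIKILINK = re.compile(r"\[\[([^\]|!][^\]|]*)")  # crude [[page name]] matcher

        def links_to_recheck(old_source, new_source):
            # Only links absent from the previous revision need a fresh database
            # lookup; the rest keep their cached existing/broken status.
            old_links = set(WIKILINK.findall(old_source))
            new_links = set(WIKILINK.findall(new_source))
            return new_links - old_links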

    • CommentRowNumber43.
    • CommentAuthorAndrew Stacey
    • CommentTimeApr 23rd 2011

    Zoran, there are lots of possibilities. One might be that when Instiki checks a link against a page, it does a read on the database. So each link involves a new database query, rather than loading a list of all the pages in one go and keeping it in memory. I can imagine that if there are only a few links, then doing a db query for each is quicker than loading in the whole lot, but at some point this tips over and becomes a drag.
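    (The in-memory alternative, sketched under the same made-up schema as the sketch further up the thread - again a guess, not Instiki’s actual code:)

        def load_known_names(conn):
            # One pass over the database: every page name and redirect alias.
            names = {row[0] for row in conn.execute("SELECT name FROM pages")}
            names |= {row[0] for row in conn.execute("SELECT alias FROM redirects")}
            return names

        def classify_links(links, known):
            # Membership tests are now in-memory set lookups: two queries total,
            # however many links the page has.
            return {name: (name in known) for name in links}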

    The point is I don’t know! I haven’t spent too much time under the bonnet (aka hood) of Instiki. So I’m the wrong person to ask about this. With Instiki, we know exactly who the right person is: Jacques. We also know that he takes us seriously. With MediaWiki, we wouldn’t know who to ask, we’d have to do all this sort of thing “in house”. Now, while I feel happy hacking the forum software, that’s not on the front line. I feel much more nervous hacking the wiki software - I’d rather leave that to the expert(s).

    So I suggest that we send a “feature request” to Jacques, outlining the problem that saving pages with a large number of wikilinks takes an inordinate length of time, and asking if he has any ideas on how to speed up the process a little. I’d far rather do that than launch into the murky waters of migration.

    • CommentRowNumber44.
    • CommentAuthorzskoda
    • CommentTimeApr 23rd 2011

    Before asking him, are we completely sure that the number of wikilinks is the principal reason for the slowness of reloading large pages like cohesive topos? Maybe we need to do a few tests first.

    • CommentRowNumber45.
    • CommentAuthorMike Shulman
    • CommentTimeApr 24th 2011

    There are definitely other suboptimal things about MediaWiki besides its math syntax. I recall that when we were discussing adding redirects, some of us wanted to do it the MediaWiki way, where the redirectING page says where to redirect itself to, but Jacques said no, that’s braindead: you want the target page to specify what names redirect to it. After using for a while the redirects that Jacques implemented in Instiki, I have long since been converted 100% to his point of view, and I would regard it as a great step backwards to have to do redirects “the MediaWiki way”.
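    (For anyone who hasn’t seen both conventions: in MediaWiki the redirecting page itself contains something like #REDIRECT [[target page]], whereas in Instiki the target page declares [[!redirects alternative name]], and the alias need not exist as a page at all.)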

    On the other hand, mediawiki does have some other nice features that are lacking (so far) in Instiki, like a “template language” which has (IMHO) a horrendous syntax but nevertheless enables you to do some quite nice things, and also a bundle of “semantic wiki” extensions that also let you do some very nice things.

    • CommentRowNumber46.
    • CommentAuthorUrs
    • CommentTimeApr 25th 2011
    • (edited Apr 25th 2011)

    Okay, so I suppose we have now answered my question:

    What are our perspectives and options? Is it in principle conceivable that we can migrate the source-code content of the nLab to a different wiki software? Is that technically possible?

    by:

    it might be technically possible to some extent, but the expected trouble and disadvantages outweigh the expected advantages.

    Instead, the perspective is that we somehow need to find somebody who fixes the existing software (namely two main bugs: 1. the Lab crashes frequently, and 2. hyperlinks are handled inefficiently, in a way that makes larger pages – such as are currently used by one of the present authors, and such as are expected to appear frequently in any nJournal application – infeasible).

    The person doing such a fix might be

    1. Andrew Stacey – though the task seems nontrivial enough that it requires considerably more time and energy than Andrew can be expected to spare easily;

    2. Jacques Distler – if we ask him;

    3. or maybe some professional programmer whom we could hire and pay for this job.

    Is that right? What about the third option? Does this sound reasonable? I’d be willing to pay for this, if that’s what it would take to have the job done soon.

    • CommentRowNumber47.
    • CommentAuthorTobyBartels
    • CommentTimeApr 25th 2011

    A brief aside: if one wants to see the means presently available in MediaWiki for math rendering, here is where you look. This seems to be moot, but maybe somebody was wondering.