Thank you, I have now fixed this and a couple of other corner cases.
In the article Beck module, when I click on the hyperlinked Kähler differential, I am taken to the broken link
https://ncatlab.org/nlab/show/K%C3%83%C2%A4hler%20differential
Somehow it passes through the correct encoding (Kähler), but then this encoding gets distorted by a further redirect.
Thank you, this was the same issue that Jesus raised in #94. I have now, I believe, fixed this, but it may take 24 hours or so before the fix is noticeable, due to caching.
In Yoneda embedding there are broken links to images uploaded to an nLab personal space. Images uploaded via the standard method do work, as in axiom of choice at the end.
Just to break the silence, though I don’t know anything more than anyone else here, as it’s all in Richard’s hands. From a quick email exchange with Richard, he is optimistic.
I didn’t know migration would be this way, but then I don’t know anything about it in the first place. Evidently the nLab is de facto going through a hibernation now. Maybe it’s a chance to restart afresh once we emerge into a new spring, with an nLab 2.0 (or maybe we are only now crawling out of beta phase).
On that note, I’ll share that I had agreed, weeks ago, to present the nLab at the virtual meeting Big Data in Pure Mathematics 2022 (here), taking place May 21-22. Richard says by email that the nLab will be operational long before that.
With Spring in the air, I’m looking forward to the end of hibernation. I miss the activity around here.
Apologies for the lack of updates, this is just because I have been focusing on the development work. I am working as much as I can on it (usually late at night; I don’t think I’ve slept more than four hours on any weekday this year).
Editing is now working. The actual renderer has been ready for some weeks (it lacks a couple of features, but it can render many pages); I have since been working on the actual editing process (GUI, history, nForum interaction, security aspects (HTML sanitisation, CORS, …), etc). I have finally finished this for the moment. I will begin very soon, probably this evening European time, trying to upload a tranche of pages rendered with the new editor and editable.
The two main missing features in the renderer are Tikz/xymatrix diagrams (this requires a slightly different infrastructure component) and includes. My next priority will be to put these in place, but we can upload pages for editing before that, as many do not need these features. Context menus no longer rely on the include mechanism.
[Maybe I should emphasise in case anybody is not aware that I am completely rewriting the software and infrastructure, in such a way that we hopefully have something maintainable, performant, and economical for years to come. When I say completely, I really mean completely, there is not a shred of the original Instiki left except for the broad styling, and the code from my major ’nLab 1.5’ changes around 2018 (if I recall the year correctly) is also essentially gone.]
Actually, I would like to revisit one of my earlier suggestions, and ask people to post here whenever they would like to edit a page. I will then manually render it. I realise that people are keen to get back to normal, but this approach will allow me to catch generic issues. Once we have reached a critical mass in terms of diversity of pages, we can render everything.
For now Tikz diagrams cannot be rendered, so no point suggesting those, but anything else should be fine (includes can quickly be supported, so feel free to suggest those).
I will typically be able to act upon suggestions within 24 hours. Feel free to suggest a list of changes.
Hi Richard, sounds good. Last time you asked I suggested that I’d be happy to edit
There are a few pages around this which I’d like to edit, too, such as
Then I had some updates to make on
also
Last but not least, there is a certain urgency that I make some updates on the page titled
If you could make contents editable, that would be great. There are some links that need to be updated in there.
The links from the personal sub-webs to the main nLab (i.e. http://128.2.67.219/nlab/show/differential+cohomology linked from Urs Schreiber’s personal sub-web at http://128.2.67.219/schreiber/show/differential+cohomology+in+a+cohesive+topos) redirect to the Amazon AWS url https://nlab-pages.s3.us-east-2.amazonaws.com/nlab/show/differential+cohomology rather than the ncatlab url https://ncatlab.org/nlab/show/differential+cohomology
Apologies for the delay, I was not able to work on this for a few days, and wished to change something, which I have now done (to use KaTeX server-side for LaTeX rendering). Thank you very much for the suggestions, I should be able to begin rendering those of these which do not contain Tikz/xymatrix diagrams now within a couple of days at most (the software is ready, it’s just a question of when I will have the opportunity).
tl;dr: could you perhaps add a note to the HowTo indicating that regular editing is currently not possible, and what one has to do instead (or perhaps just a link to this forum post, or something like that).
Just to sketch the experience I just had:
I’ve never edited a page on ncatlab before, but just came across a page where I noticed a mistake, namely perfect group (off-topic, but if you want to know what: it defines a group as being perfect when it is isomorphic to its derived subgroup, which is not the usual definition and not equivalent to “has trivial abelianization”. Counterexample: the derived subgroup of a free group of countable rank).
Since I noticed a link “Edit this sidebar” (which as I later discovered leads to an error), I thought that perhaps this is a Wiki I can edit. After some searching I found this HowTo and then spent 10 minutes trying various browsers and things to get the edit link that this HowTo promises, to no avail.
After some more digging I found this forum and then this forum post, and now I suspect the issues I run into are related.
I am pretty sure most people would have given up long before (and perhaps that’s also OK with you, dunno). I understand that you are working on rewriting this website, which seems like a major endeavour to me, and it seems this is done by a volunteer (volunteers?), and I have the greatest respect for that. So to be clear, I do not wish to complain, just to suggest that if the website could be updated to inform visitors about this, that would be super helpful, and hopefully not too much work? At least the HowTo page could perhaps be edited to state that it is outdated. Even better, of course, would be if every page could state that the website is undergoing a major reworking (though I understand this may not be possible for technical reasons).
Thank you for your hard work!
Best wishes, Max Horn
Just to apologise for the lack of updates/progress; for the last few days I have been and still currently am ill, and for a couple of weeks prior to falling ill was completely tied up.
In response to #115, thank you, but unfortunately what I lack is time, and it is alas not possible to do anything to free up more. The History link will come with the making live of the editing functionality, which is actually ready and working, it just lacks some GUI links.
Regarding #116, I’m very sorry for your experience, but unfortunately I am putting what little time I have completely into getting the editing, etc., functional again; though there are many small things such as those you suggest which could help smooth things over, they all take time, and unfortunately I feel that it is not worth spending time on them compared to the urgency of getting editing functional again. I realise that this will sound like an inadequate response, but I hope you will bear with us and return to edit once things are back to normal.
Richard,
sorry to hear that you keep being sick.
Here is a thought: We could remove pressure both from you as well as from the nLab community by falling back to the installation we used to have, for the time being?
This would allow you to take all the time you need to work on your plans regarding software development, and would allow us to escape the somewhat awkward situation of being associated with a website that is in limbo for months.
Our HoTT sub-web has been alive and active all along. Therefore the same is surely possible also for the main web(?)
Yes, it’s not been a great year health-wise so far, not usual for me!
Unfortunately, to go back to the very first comment in the thread, the reason that I began the migration with immediate effect around the New Year was that the CMU server could no longer cope with the editing of the main nLab. The HoTT wiki has trivial ’density’ (amount of inter-linking and amount of content) compared to the main nLab.
I found some time today and have made some progress. In the long run this lack of editing for a few months should be fairly trivial, frustrating though it is at present.
As I have written a number of times, the goal with the migration is that we should never end up in this situation again; the new software and infrastructure will be scalable, and will be much simpler and more modular, and people other than myself should easily be able to maintain it and develop it. It is only during this initial effort where I don’t feel there is anybody else who can do the job, unfortunately, otherwise I wouldn’t be doing it.
Richard,
sure, the installation needs a boost, but why does it need to be switched off in the meantime? The HoTT web still being active gives the impression that functionality need not be switched off while being overhauled.
Shutting down the nLab made intuitive sense to me in December, for the process of actually swapping the installation. But the long-term process of software (re-)development, all the more under strained time and health conditions, could, one would expect, take place in parallel with a running installation, suboptimal as the latter may be.
This is, in any case, the perception available to me, here. You’ll know better. But I thought if you struggle with time and health issues, then it could be a relief to you if you’d just let the old system run, thus completely freeing yourself from all time pressure on the software redevelopment. We would, I can say, rather cope with a sluggish uptime than with a prolonged downtime.
FYI, it now seems that while the HoTT wiki is still being edited, it no longer automatically sends edit announcements to the nForum.
Yes, I have disabled it for the moment, as no announcements with useful content were actually being made, and there are few who are following the development of the HoTT wiki, so that they were essentially spam for most people. I caution that I may not be willing to support the HoTT wiki after the migration, and thus if people wish to keep it going, somebody else may need to be found to do that.
Re #121: the situation is that the actual Saunders machine becomes overwhelmed if editing is permitted on the full nLab, and goes down.
I am by the way continuing my work; I have decided to implement Tikz and xypic rendering functionality, searching, and various other things before allowing people to edit, so that most things should be in place. I am nearly finished with this.
somebody else may need to be found
Thanks for saying. I’ll go start advertising. Unless a potential volunteer is reading here and speaks up?
It might be better to fold the HoTT wiki content into the main nLab.
Re #125, I’ve thought that for a long time.
Re #127: the Saunders machine is not being used for the main nLab anymore; I was just explaining why we cannot move back to the Saunders machine even temporarily, in response to Urs’ enquiry as to whether we could do that. It is still being used for all other ’wikis’ running on the old Instiki installation; this is just for convenience, to allow people to edit them whilst I am completing the writing of the software for the cloud-hosted nLab (editing is currently disabled). A couple of us have daily backups of the database, from which it will be possible to move the content to the cloud should the machine die. But, yes, please do not order any decommissioning for the moment; I have direct contact with Joe, though, and I believe that we have an understanding not to switch it off without notice. As I have mentioned to others on numerous occasions, the machine seems fine operationally (it is just that it could no longer cope with the old Instiki software, mostly due to the nature of that software); of course some hardware component could just blow out, but assuming that doesn’t happen, it should tick along happily for the foreseeable future I expect. I will contact Joe when everything is finally migrated.
Regarding #124: just to emphasise that this was specifically for the HoTT wiki, not the main nLab. I’d politely ask to be kept informed if you do make any plans, rather than to jump to any decisions; I think first people should decide on whether the HoTT wiki should exist at all any more (cf. #125 and #126, which I think I probably agree with too, though I don’t feel strongly about it), and if so, after that a good arrangement software/infrastructure-wise should be discussed in conjunction with myself.
What I think may be best for ’live’ personal webs that are continued is that they use the new nLab infrastructure/software which I am writing, but are a separate deployment; the new software has no notion of multiple webs within one installation, but is completely generically written, so can be used in separate deployments for personal webs. I would be willing to help with setting up that deployment, but can envisage, as I mentioned near the start of this thread, stepping back at that point and leaving it to someone else to ’maintain’ the deployment thereafter, which probably would mainly mean keeping the software in sync with the main nLab software as long as this is desired, and carrying out any manual stuff that cannot be done by users of the wiki, e.g. removal of spam. I would suggest something like this for the HoTT wiki as well if people decide to keep it.
There is a pretty strong argument for merging the HoTT wiki into the nLab.
Most of the HoTT wiki contains the same articles as the nLab wiki does (e.g. category (homotopytypetheory) and category/type-theoretic definition of category, monoid (homotopytypetheory) and monoid, partial order (homotopytypetheory) and partial order, higher inductive type (homotopytypetheory) and higher inductive type), but with much less information in the HoTT articles than in the nLab articles.
Even basic concepts in HoTT are either more detailed on the main nLab than on the HoTT wiki (e.g. universe (homotopytypetheory) and universe/type of types, identity type (homotopytypetheory) and identity type, univalence axiom (homotopytypetheory) and univalence axiom), or just outright missing on the HoTT wiki (e.g. dependent product type or dependent sum type: there are no articles dependent product type (homotopytypetheory)/dependent function type (homotopytypetheory) or dependent sum type (homotopytypetheory)/dependent pair type (homotopytypetheory)).
At minimum, every unique article on the HoTT wiki that is directly related to category theory should be merged into the nLab, such as precategory (homotopytypetheory) and Rezk completion (homotopytypetheory). The HoTT wiki seems to have a lot of unique articles on dagger category theory, such as dagger monomorphism in a dagger precategory (homotopytypetheory), compact closed dagger category (homotopytypetheory), semiadditive dagger category (homotopytypetheory), which should be merged into the nLab as well.
For many of the rest of the unique articles, one could probably define most of the structures outside of HoTT. For example, HoTT book real numbers (homotopytypetheory) is defined as the initial Cauchy structure (homotopytypetheory) on the HoTT wiki, or as the initial Cauchy-complete metric space in the HoTT book, neither of which is inherent to HoTT or even to type theory.
precompact space redirects to totally bounded space, which redirects back to precompact space, in an infinite loop.
Re 129: some of the same-named entries on the HoTT wiki have a definition/viewpoint/formalism specific to the new-foundations mathematics, while the nLab has a modern, but still usual, mathematical definition. So the two entries have a different context in mind. In a possible merger one should somehow take care that this important subtlety is not conflated away, resulting in a mess. So I guess some level of manual merging is needed then. Some entries could be doubled, with a different version for a different context: blabla versus blabla in HoTT.
I assume the error noted in 116 was a ’mental typo’, where the author wrote “isomorphic” instead of “equal” by accident, as it were.
@132 A thinko, as it were.
Can we have a collective discussion about the path forward for the nLab? What are the available options at the moment?
Is a re-activation of the old server a possibility? Would that affect the work already done on the migration?
To chime in, there have been many times over the past few months where I would have liked to edit an article, but was not able to because of the ongoing migration. It is much more effort to add a note about an edit to the nForum, for several reasons (not least that there is no easy way to jump to the nForum discussion associated to an article), which has deterred me from doing so. Not to mention such edits are not currently helpful to readers in the slightest. Keeping the nLab uneditable for so long seems harmful to the vision behind the nLab. Richard is obviously making a great effort to support the nLab, but it is entirely understandable that it is too much for one person in their spare time, especially compounded by illness. If it is not possible for the old server to be reactivated during the migration process, Urs’s suggestion seems eminently pragmatic. I think the dependence many people have on the nLab needs to be strongly considered in this situation.
Are you (or anyone reading here) familiar enough with the matter that you could pick, contact and sensibly instruct some IT/web-company for the task of hosting and administering the nLab?
Alternatively, I was thinking of simply renting a MediaWiki hosting service, of which there seem to be plenty. Going that route would be a huge setback regarding typesetting capabilities (and we’d need someone to write a script to port the old content), but it would free us from self-administration worries once and for all.
I understand the frustrations, but I would like to ask please for just a little more patience. I have been continuing to work hard on this, and things are really very nearly ready; LaTeX diagram rendering functionality is now complete, and all the difficult bricks are in place. There are just a few smaller things that I would like to complete before making editing live, because there will no doubt be a lot of small tweaks to make once I open things up, and I wish to be able to complete the functionality aspects first. I have been aiming for roughly the end of this week (say next Monday) for beginning the opening up of editing, as I have a little more time this week due to the Easter holidays.
If people really wish to not wait that extra week or so and throw away the very large amount of work that I have already put into it, that is fine, but then please let me know immediately.
Alternatively, if people are willing to accept one or two missing pieces of functionality, I can begin to open it up already, as most things already work; but this must be on the understanding that those pieces will indeed be missing for now.
I would also just ask people please not to make any assumptions about my being too ill, or indeed any other assumptions about what I am doing. Regarding the illness, I am fine again now, it was nothing serious. If I am at any point unable to continue working on the nLab, I will let people know clearly and directly.
Unless I write in this thread to the contrary, it can be assumed that the migration is proceeding along expected lines, that I am working very actively on it, and that it will be ’done’ in its core phase (editing, etc, is live) more or less at the expected time, which was around now. I am not doing this for my own sake, I am doing it out of some sense of duty, because I think it is the best thing for the nLab and that the alternatives will not provide something as good, either in the short or long term; and because, as I have brought up before, whilst there are many people with some programming expertise, there is nobody else who has both professional software development experience, who has been involved with the nLab purely as a user for many years, and who for some unknown reason is willing to give up late night hours to work on it for some kind of greater good.
I am currently finishing off the frontend part of the search functionality (the backend part is done; the frontend part is almost done too, I’ll probably finish it this evening or tomorrow [Edit: now done, finished it this evening]). Apart from that I just need to make a couple of minor tweaks to the renderer to take into account that LaTeX diagrams can now be rendered. If I were able to focus on it continuously, it would be no more than a day’s work.
[As an aside, it should be remembered that things like this diagram functionality are completely specific to the nLab, that I wrote myself for the nLab a few years ago. A number of other aspects of the nLab’s syntax which predate my involvement in developing its software are completely specific to it. Most of the work of making a renderer is precisely handling these specificities in a good, robust, scalable way; any kind of migration to other software would involve just as much work in trying to handle those specificities, so it can probably be imagined that suggestions like ’migrating to MediaWiki’ as an alternative to the current migration-in-progress are hopelessly naive, as are suggestions that one could obtain a professional third party to ’maintain’/carry out future development of the nLab. The nLab’s funds would be used up in a week or less. The only way the nLab can be developed is by volunteers (and there have been until now very few willing to do actual software development, though hopefully this may now change with the new software, which should be much more accessible) or by a major funding undertaking that would pay part of the full-time salary of a software engineer.]
Will the new software be open source?
Yes. It will be pushed to github once editing is live and things have settled down. And I hope that the nature of it (it is much more modular, everything can be done via API (i.e. programmatically, without a browser GUI), the code is much simpler and smaller, there are essentially no infrastructural dependencies, etc) will allow people, hopefully many people, to frequently chip in with little code improvements, which has never happened before.
I will say more about this at a later point, but I consider the ability to do things via API to be essential to modern software; it allows, amongst other things, the nLab to easily be built upon for other purposes, e.g. people who wish to build semantically richer functionality on top of the nLab, which there has been a fair bit of interest in, and which I think at some point in the evolution of the web is likely to become much more common than it is today. It also allows people to easily develop their own little tools for interacting with the nLab, e.g. if people don’t like the browser editor, it would be easy for them to write their own.
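Just to illustrate what ’doing things via API’ could look like in practice, here is a minimal sketch in Python; the endpoint paths and payload are purely hypothetical placeholders (the actual API is not documented in this thread), so treat it as an illustration of the idea rather than working instructions.

import requests

BASE = "https://ncatlab.org/nlab"  # hypothetical API base path, for illustration only

# Fetch the source of a page (hypothetical endpoint).
response = requests.get(BASE + "/source/comprehensive+factorization+system")
response.raise_for_status()
source = response.text

# Submit an edit programmatically (hypothetical endpoint and payload).
payload = {"content": source + "\n\nA small addition.", "author": "Example Editor"}
requests.post(BASE + "/edit/comprehensive+factorization+system", json=payload)

Something along these lines is what would make it easy to build external tools, or an alternative editor, on top of the nLab.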
Sounds perfect. Thank you for all your hard work!
Alternatively, if people are willing to accept one or two missing pieces of functionality, I can begin to open it up already, as most things already work; but this must be on the understanding that those pieces will indeed be missing for now.
I’m fine with having one or two missing pieces of functionality, such as LaTeX diagrams not rendering correctly yet, but I would like to be able to edit the nLab.
I understand. I’m now actually done with the tweaks, mentioned in #142, that are necessary to allow the LaTeX diagram functionality to be used by the new parser. Thus in fact I’m done with everything in #142. Nevertheless, there is a small but somewhat important thing that I would still like to add before beginning to allow editing (I’ll not go into details as it is new, and I’d rather explain once people see the purpose of it); things are still on course to beginning to allow editing (for some pages) on Monday at the latest, so I’ll have to ask you to wait just a little longer.
I have now completed the functionality I mentioned in #147 (a preview functionality for editing, which was asked for numerous times for the old Instiki software, but is of more fundamental necessity for the new software, as edits will now be completely immutable; there will be no ’30 minute editing window’ as in the old software).
With that done, I am essentially ready to begin opening up editing. I am travelling tomorrow, but may begin opening up some pages tomorrow evening European time; if not, then on Monday evening. Note that there will not be a sudden switch-on of editing for all pages; allowing editing involves re-rendering the current page and all of its historical versions, and I will be proceeding through the pages in stages to try to catch bugs early and avoid having to re-render everything many times.
What do you mean by
edits will now be completely immutable
After one submits an edit, it will not be possible to change it: it will remain that way forever, stored in the system as a revision of the page. Previously, there was a 30-minute window after submitting an edit during which one could change it.
So I tried to edit the test page here, but when I submitted my edit I got this message:
Edit successfully made, but an error occurred when updating the page history. Please raise this at the nForum.
Thank you for trying it out, but the test page is intended only for my own use, and will not work as one might expect in all aspects:-). I realise that I accidentally left a link to it in the nForum for a few hours, but I’ve removed it now.
I tried adding a redirect to the test page and got this message:
Redirects no longer handled in source editing
Yes, this is intended. I will explain this and all other changes from before later.
I get an access denied error when I try to visit the page totality space. The page seems to exist in some capacity, because totality spaces redirects to totality space.
A similar problem occurs with precompact space and totally bounded space, because the two pages redirect to each other. I remember that Richard Williamson and Urs Schreiber went through a list of redirects of the same term to two different pages and they might have accidentally made some errors when trying to get rid of the problem.
Just a quick update that I did not have as much time today as expected, but have carried out a few small tweaks both yesterday evening and this evening. I expect to have more time tomorrow evening, enough to allow me to begin to allow editing on some pages, as promised.
I have carried out a bit more work today, and am now ready to begin opening up editing. I will begin this in a moment. There are many changes to the software from before, and it will take a long time to go through everything; I will get to it eventually, but for now, the most important things from a user point of view are the following. I am omitting justifications for these changes for now, and just stating them.
A table of contents is now produced by writing \tableofcontents, on its own line.

Context menus are now included by writing \context_menu[...], replacing … by the name of the context menus to be included, e.g. \context_menu[category theory, knot theory]. Moreover, context menus are no longer normal pages. I will not be opening up editing of these just yet, and will postpone more details until that time comes.

When making a link [[some page]], there is no check made that some page exists; it is assumed that it does, and a link will be created in all cases. There will be no creation of things like some page?, with a ? that can be clicked on to create a page. This is in line with the internet in general. I am not yet opening up creation of new pages (this is not for technological reasons, it is just that I’d like to ensure that editing is working reliably first); again, I will explain how to do so when the time comes for that. At a later point, I may add some client-side functionality to the editing page to try to help identify broken links before submission, but the main responsibility for not creating broken links is now on the editor.
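To make the above concrete, a page source under the new conventions might begin roughly like this (a made-up fragment; the page and context menu names are just examples):

\tableofcontents

\context_menu[category theory]

A [[natural transformation]] is a morphism between [[functor|functors]]; see also [[category theory]].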
Maybe there are other things I’m forgetting now, in which case I’ll edit this post to add them. A couple of other, less fundamental, points:

.*. If something like ’Latest revisions’ is still desired, I see this more as a meta-tool, and I will probably make it available only for those ’in the know’.

Technologically the changes are vast, and I do not have time to go into the details. More or less everything has been completely rewritten. But here are a few aspects which might interest some:
Everything served at the ncatlab.org domain name comes from a caching layer (called CloudFront in the AWS case). One can also access pages at https://nlab-pages.s3.us-east-2.amazonaws.com/nlab to bypass the caching layer, but this is discouraged (it should probably only be useful for debugging/development); I have put various measures in place to ensure that Google and other search engines should only index ncatlab.org. Amongst other things, this caching layer should ensure consistent performance across the world. DNS is handled by AWS Route 53; CloudFlare, which we were using before, is no longer involved.

This being a complete rewrite, there will definitely be bugs, maybe heinous ones. Please be understanding of that; the key thing is that the core code is in place. It is much easier for me to find time to make a bug fix than to write new functionality. One can always work more on eliminating bugs before opening up, but there has been a lot of pressure for me to open up, so we’ll have to do the remaining bug fixing ’live’.
The economics of the new software look extremely good. We are not using more than about $3 per month at the moment. Again, I have made the design of the software with the economics strictly in mind as well as performance, maintainability, flexibility, etc. I expect the costs to go up a little once editing is back in full flow, but not all that much, as nLab edits are quite infrequent (usually just a few dozen per day), and most of the costs will, I expect, come from viewing pages.
Viewing of pages should be very fast. Edits will typically take some time as there is a lot going on (see above: LaTeX rendering, sanitisation, etc), but the situation should be a lot better than before. LaTeX diagrams were one source of slow edit submissions, and these will now only slow things down when they are first created, as described above.
If you see any ’500: Internal Server Error’ type messages upon editing, let me know, as I have put in place fairly tight timeouts for most of the lambdas carrying out the backend work, and some of these may need to be increased to be able to handle heavier pages.
Previously the edit page would be locked if somebody else was editing the page at the same time. Is that still the case?
Good question, no, there is no longer any need for locking. The new software instead detects if this situation arises, and will reject the later of the edits with an appropriate error message/instructions for what to do. From a technological point of view, handling this is the main place where the short-lived additional state that I mentioned is needed.
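As a minimal sketch of the idea in Python (not the actual implementation): each edit submission carries the revision it was based on, and the submission is rejected if a newer revision has appeared in the meantime.

# Toy in-memory store of page revisions, purely for illustration.
revisions = {"some page": ["first version", "second version"]}

def try_submit_edit(page, base_revision_index, new_content):
    latest_index = len(revisions[page]) - 1
    if base_revision_index != latest_index:
        # Somebody else edited the page after this editor loaded it.
        raise RuntimeError("Edit rejected: the page has changed since you began "
                           "editing; please incorporate the latest revision and resubmit.")
    revisions[page].append(new_content)

try_submit_edit("some page", 1, "third version")   # accepted
# A second submission still based on revision 1 would now be rejected.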
As an aside, I am working on the opening up of the first pages, but am doing it in a way which can trivially be scaled up to handle many pages at a time; it looks like I may not quite make it with the first pages this evening, but should do so tomorrow.
I have now opened up the first page for editing, comprehensive factorization system (rendering it and its history with the new software in the process), and have made an edit.
The reason for choosing this page was simply because it was the last one created before editing was halted; I plan to proceed roughly in reverse order. Obviously I cannot ultimately proceed one page at a time, but once the first few pages have been handled, most of the others should be able to be as well. This page turned out to be actually quite a useful one as, though not extremely large, it has quite a lot of LaTeX and also several diagrams, as well as several other pieces of important syntax, which allowed me to refine the migration script, and also tweak the CPU allocations for the backend lambda functions to allow the edit to be carried out in a reasonable time.
If there is anything on this page that seems untoward, just let me know.
It is of course encouraging that this was the page which ultimately ’broke the old nLab software’, as can be glimpsed in the nForum thread I have linked to above, whereas it now renders without any problem, even when all of the diagrams are rendered afresh at the same time. KaTeX also seems to have done a pretty good job with the LaTeX, which is also encouraging. Since all the LaTeX is now rendered server side, the page renders at lightning speed, at least for me, on all browsers (by the way, MathML is also output by KaTeX and is present in the page source, so we have not lost the semantic benefits of that).
I will not have time for opening up any more pages tonight, but things should begin to go quickly towards opening everything up from here on in; the plan for now is just to continue to open pages for editing and fix any bugs along the way.
In nForum:comprehensive-factorization-system I wrote
where do the sources for the tikzcd diagrams on this page live?
The page source only has stuff like
<img src="/nlab/diagrams/show/20220421002920849955">
Good question, they live at the same URL as in the src link, but with /show/ replaced by /source/. In this case, this becomes the following.
https://ncatlab.org/nlab/diagrams/source/20220421002920849955
In particular, this is the same pattern as is used for actual pages and for context menus.
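For example, a few lines of Python along these lines (just a sketch using the URL pattern described above) would fetch the diagram source behind an img tag:

import requests

img_src = "/nlab/diagrams/show/20220421002920849955"
source_url = "https://ncatlab.org" + img_src.replace("/show/", "/source/")
# i.e. https://ncatlab.org/nlab/diagrams/source/20220421002920849955
print(requests.get(source_url).text)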
In case it was not clear from my long post above, one can still write source as before in the edit pane; the renderer will convert it into a link like the one you gave. One can also create diagrams directly, outside of the edit pane; this is currently only possible via API (no GUI), but I will probably add a GUI for it eventually.
When I have time (after editing is fully open), I will probably make it possible to click on the img tag in the edit pane to obtain the source.
I see that a few people are trying editing. The caching layer, or I think rather the way the browser interacts with it, sometimes takes a little time to react, it seems (the cache for the page in the caching layer itself is invalidated when the edit is made); I might be able to improve that, but if you experience this issue and wish to check for sure whether your edit has gone through, looking at the history page should always instantly show all revisions of the page. One can also use the direct AWS link to bypass the caching layer.
This has cropped up a few times in the course of the thread, and my views have not really changed; perhaps the most important thing to say, though, is that they will not just be left to die. I’d rather not get into a detailed debate about what to do with them yet; when the time comes, I will raise the matter. But those which are active (not many) will most likely be migrated to the same infrastructure as the nLab, in separate deployments of the new software.
I have now tweaked something which should I hope avoid the situation mentioned in #165 (it is indeed a question of browser caching, I think). If you have already viewed the page and see this issue, you’ll have to either wait for the cache to expire (how long depends on your browser settings, but I’d guess max 24 hours), or clear your cache.
I have also now opened up the next page in line, homotopy Kan fibration. This one went almost seamlessly, which bodes well for opening up larger batches.
I have also now opened up the next page in line, homotopy Kan fibration.
In the TOC on that page, “Relation to homotopy colimit (geometric realization)” appears twice; then the whole TOC is repeated without that line.
Thank you for letting me know, I don’t see that behaviour myself; which browser are you using?
[Edit: actually I was able to trigger it now in a different way. I’ll investigate, thanks!]
It is difficult to say I’m afraid, it depends on how quickly bugs can be fixed. As quickly as possible, obviously. I hope within a week, but cannot guarantee it.
Today I have opened a group of other pages for editing; I’ll not mention which as they’re not particularly exciting, this is just to say that things are moving in the right direction. I am currently carrying out migrations in groups rather than individually; gradually those groups will become larger, and once they reach a certain size and go smoothly, things will proceed very quickly from there. Right now I am fixing a few small things that cropped up when trying to render a certain page.
I have not yet looked into #170 by the way. Typically this issue does not occur for me (though as I say I was able to reproduce it once), so hopefully it does not for most others either, but I’ll fix it soon; I just wish to come a little further with the number of pages opened up first (the issue is a client-side one, so I will not need to re-render pages after the fix is made).
This is a very small issue, but the context menu text appears a bit squished when I view it on mobile: https://i.imgur.com/Hg9NHPO.jpg.
Thank you! Yes, this one is on the TODO list :-).
Just a quick note that I am continuing rendering pages/opening them up for editing. I was intending to provide a list of those opened up so far, but have to break off.
Could somebody be so kind as to please provide me with a tikzcd version of the following code? I do not have time myself just now to think about getting it right.
\label{EquivalenceOfEffectiveEpimorphisms}
Grpd(\mathbf{H})
\underoverset
{
\underset{
(-)_0
\to
\underset{\longrightarrow}{\lim}(-)
}
{\longrightarrow}
}
{
\overset{
(-)^{\times_\bullet}
}
{\longleftarrow}
}
{\sim}
\big(
\mathbf{H}^{\Delta[1]}
\big)_{eff}
How’s the following?
I don’t understand why the label on the lower arrow won’t center itself, though.
\begin{tikzcd}
Grpd(\mathbf{H})
\arrow[r, bend right, "(-)_0 \to \underset{\longrightarrow}{\lim}(-)"']
\arrow[r, phantom, "\sim"]
&
\big(
\mathbf{H}^{\Delta[1]}
\big)_{eff}
\arrow[l, bend right, "\overset{ (-)^{\times_\bullet} }{\longleftarrow}"']
\end{tikzcd}
Thank you very much, I’ll give that a try soon!
I am currently running a script which will open up roughly the last 1000 pages that were created in the nLab for editing, thus about 1/17th - 1/18th or something of the entire nLab. There is thus a significantly increasing chance now that if somebody wishes to edit a page, they will be able to do so. There will certainly be rendering issues; I have discovered three or four already, none very tricky to fix I think. If you see a page which is open for editing, please take a closer-than-normal look at the page to see if everything looks correctly rendered; if not, please do not edit it to fix it (because the issue will probably affect historical revisions as well), but instead please mention the offending page here, and I will try to fix the issue and then re-render the page. If everything goes fine, then feel free to go ahead and edit!
I have not yet rendered many context menus, so some of these will be missing, but don’t worry about this; they are now handled client-side, and as soon as the context menu is rendered it will appear on every relevant page, without any re-rendering of those pages being necessary.
By the way, if there are people following who are comfortable with the programming language Python, are able to set up MySQL to run locally on their machine, and are willing to put some work in to accelerate the migration, please let me know.
Whilst there will still be things to tweak, the actual rendering of edits is in pretty decent shape now I think. However, the syntax is stricter than the old one, and in addition there are a few things that KaTeX cannot handle: not many on the evidence so far, but some, most notably the command \underoverset, which has been used a fair bit to create ’equivalence/adjoint’ type diagrams (see above for an example). I have a migration script which attempts to migrate the old syntax to the new; many of the things which cause pages not to render can be solved by just tweaking this migration script, without needing to touch the new software. What I am proposing is that others can have this migration script and an SQL dump, and try to use it on a group of pages as I am doing.
I’ll not post it publicly as it is necessary that one has a certain degree of programming competence, but if you do and are willing to help, let me know, and I’ll send you the script and SQL dump. It will be necessary to coordinate here to avoid duplication of work.
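To give a rough feel for the kind of work involved, something along the following lines in Python (this is only an illustrative sketch, not the actual migration script; the connection details, table and column names, and the page id are placeholders that would need adjusting to the actual SQL dump):

import re
import pymysql  # assumes a local MySQL instance loaded from the SQL dump

connection = pymysql.connect(host="localhost", user="nlab", password="...", database="nlab")

def migrate_source(source):
    # Example tweak: give every $$ ... $$ display a blank line before and after,
    # which the new parser prefers; real tweaks would handle things like \underoverset.
    return re.sub(r"\$\$.*?\$\$", lambda m: "\n\n" + m.group(0) + "\n\n", source, flags=re.DOTALL)

with connection.cursor() as cursor:
    cursor.execute("SELECT id, content FROM revisions WHERE page_id = %s", (12345,))
    for revision_id, content in cursor.fetchall():
        migrated = migrate_source(content)
        # Inspect `migrated`, or feed it to the new renderer, and report any page
        # whose old syntax the substitutions cannot yet cope with.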
This version is more faithful to the original, using straight arrows rather than bent ones. I don’t know if specifying a shift distance will cause problems with rendering; I’ve seen other tex-in-webpage systems fail to render tex spacing consistently. (I didn’t remember that you could shift arrows sideways like this when I wrote the previous version)
\begin{tikzcd}
Grpd(\mathbf{H})
\arrow[r, shift right=1.5ex, "(-)_0 \to \underset{\longrightarrow}{\lim}(-)"']
\arrow[r, phantom, "\sim"]
&
\big(
\mathbf{H}^{\Delta[1]}
\big)_{eff}
\arrow[l, shift right=1.5ex, "\overset{ (-)^{\times_\bullet} }{\longleftarrow}"']
\end{tikzcd}
I am just getting the request to send title and abstract for the invited presentation of the nLab at Big Data in Pure Mathematics 2022 (here) which I had mentioned two months ago (Feb 24, here).
I am not sure what to do.
Due to the ongoing issues, it seems that, optimistically, I would be able to present a freshly-baked partially working beta-system that I am not familiar with, which is being debugged live and about whose eventual stabilization I am in the dark. It is unclear to me whether it is a good idea to advertise this to a large audience.
Maybe one could argue that instead of being the intended advertisement to a larger audience of potential contributors, it might at least be used to fish for potential hard-core users who are willing to get involved in developing a new wiki. But then it seems that previous requests by users here to get involved in that development have all been turned down.
Or maybe I could think of the talk as a retrospective of what once was, with a hypothetical outlook on what could be. But it feels awkward.
My inclination would still be to re-instantiate the nLab as of Dec. 29 2021 (whose only issue was that saving was becoming slow) and to do software re-development and de-bugging in parallel on a non-live system, as seems reasonable and standard. Then I could present the working nLab that I am familiar with to the audience of “Big Data in Pure Mathematics 2022” with the bonus of announcing an upcoming software-overhaul, once finished and debugged.
Regarding the final paragraph, just to clarify (as I have written before) that it was not merely an issue of slowness; the whole server was repeatedly being taken down, and would have continued to have been so. One can glimpse this in #6 - #8 here. I guess that you would like the audience of the talk to be able to look up a page, try out an edit, and have all go through swimmingly. That would not have been possible on the old system any more.
To me, it seems perfectly possible to give that talk with the nLab in its current state; there is no need to focus on the software migration at all, it can just be mentioned in passing. But we can also use that deadline as a goal. Most of the pages in the 1000 I rendered yesterday look completely fine, those that do not can easily be fixed, and editing is essentially the same process as before, so it should be perfectly usable (and stable) on those pages. The main work is in migrating to the tighter newer syntax; the benefits of this are great, directly in fact with regard to the themes of that workshop.
What I would suggest is a) for everybody to come together and put their energy into helping completing that migration of pages, b) when coming across a page which has been opened up for editing, just go ahead and do it!
But then it seems that previous requests by users here to get involved in that development have all been turned down.
The type of help that has been offered, whilst very kind, has not been what is needed. I also felt that it was necessary to have a vision of where to go, which necessitated getting the software into a certain state of readiness so that it can be seen what is intended. In #180, I have now asked concretely for the precise type of help that would be extremely useful with regard to point a) above. If that help is not forthcoming, that is OK, we will make it anyway, but it would accelerate matters. I can also possibly make the source code of the new software available (I will do this eventually anyhow), but if someone is competent enough to make changes to that code, they would also be able to offer the help asked for in #180, so we can begin with that.
Anyhow, the conference is in about 3-4 weeks time. I think the nLab can be got into a good state for editing by then if we all come together and put our positive energy into getting it there. Everybody can do b): with another few thousand pages rendered, which I can do very soon once I have fixed the three to four things that I am aware of after the first 1000, that is swiftly 1/4 or 1/3 of the nLab or so, and it should then be fairly easy for people to jump in and edit. To repeat, I believe that editing itself should work stably and fine (especially in conjunction with the preview functionality, which can be used to catch issues before they are ’saved’), it is more the initial migration of the page where the remaining work is. People can also help out in the way Hurkyl did above, there will be other cases like this :-)
In regards to oidification - table, what’s happening is that when there is a line such as
| algebra|algebraic mathematical structure|structure | oidification |
on the new nLab renderer, the vertical line is being interpreted as a column separator before it is interpreted as a hyperlink alias.
See also the example table I put on test
Creating new pages isn’t allowed yet.
I have now fixed a couple of the issues that I spotted following yesterday’s migration. One was that imagefromfile blocks had not been implemented; I have now done so. The other was that inline LaTeX did not like line breaks. I’ve only re-rendered those pages on which I noticed the issue; there are probably others, just let me know if you spot one, and I’ll re-render it.
Next will be to fix the table issue. This is to do with the handling of precedence in mistletoe, the Markdown parser that is being used; though I am very happy with mistletoe in most respects, I have not found a way to force it to do what is needed in this case, namely parsing the page links before the rest of the table; mistletoe does have a notion of precedence, but it does not seem to cover this kind of scenario. I’ve run into this issue a couple of times at other points in developing the new renderer. I’ll find a workaround tomorrow.
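One possible workaround, sketched in Python (this is just one way the idea could go, not necessarily what will be implemented; the example row is the oidification - table one from above, with the link brackets made explicit), is to protect the pipes inside page links before the table is split into columns, and restore them when the links themselves are processed:

import re

PIPE_SENTINEL = "\x00PIPE\x00"  # placeholder unlikely to occur in page source

def protect_link_pipes(source):
    # Replace '|' inside [[ ... ]] page links with a sentinel so that a table
    # parser does not mistake it for a column separator.
    return re.sub(r"\[\[.*?\]\]", lambda m: m.group(0).replace("|", PIPE_SENTINEL), source)

def restore_link_pipes(text):
    return text.replace(PIPE_SENTINEL, "|")

row = "| [[algebra|algebraic mathematical structure|structure]] | [[oidification]] |"
protected = protect_link_pipes(row)
# `protected` can now be handed to the table parser; the pipes inside the links
# are restored with restore_link_pipes once the link tokens are being rendered.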
Regarding creation of new pages, this is deliberate for the moment, as I’ve mentioned elsewhere in the thread; there are no technical issues with doing so (indeed, completely new pages are easier, because the syntax will have always gone entirely through the new renderer), I just wish to focus energy on getting editing in shape. Once we’re happy with the editing, it is easy to allow new pages.
There is a rendering bug on the page microflexible sheaf described here: https://nforum.ncatlab.org/discussion/13669/microflexible-sheaf/?Focus=97961#Comment_97961.
Also, I am not sure if it is the same table bug as in #183, but both tables in duality between algebra and geometry no longer display. They used to work correctly.
How are redirects handled in the new editing system?
Thank you very much Dmitri and Guest, this feedback is much appreciated and is exactly what is needed. I will be addressing these things before launching the next batch of rendering. Please do not hesitate to list other pages where you spot the same issue or other ones.
Regarding redirects, I assume you mean creation/deletion/editing of them. I will come back to this; as before, I wish to focus energies on editing for now. But I would argue (as has been my point of view for a long time, with the old software as well; I have myself followed this practice in my own edits) that there is in most cases no need for a redirect, and would discourage their use: it is, in my view at least, preferable both semantically (i.e. a future editor can more readily understand what is happening) and technologically to use the syntax [[the actual page name|whatever term you would like to use]] rather than making a redirect from whatever term you would like to use to the actual page name. It is also more robust, e.g. someone might well change the page redirected to, and they cannot know all the pages which were using the redirect; the change might not be appropriate in all cases.
One needs to handle things like changes of page name, but this will be done separately in the new software; as has been noted many times, the old software did not keep track of page name changes, which was unfortunate. There are some things like plurals which one might wish automatic support for.
Anyhow, I’ll eventually put forward a few options on how to handle creation/deletion/editing of redirects and open it up for discussion, but for now I’d suggest to follow the advice of my second paragraph, which covers almost all cases.
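For instance (using page names mentioned elsewhere in this thread), rather than relying on a redirect from ’precompact space’, a page can link explicitly in its source as

[[totally bounded space|precompact space]]

which displays as ’precompact space’ but points unambiguously at the page totally bounded space.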
By the way, Rod, if you are following, do you see the behaviour you mentioned in #170 in many or all pages? I did manage to reproduce on one occasion what you saw on that page, but have seen no other occurrences of it myself. If anybody else experiences this issue, please let me know. There is definitely some issue there, probably some kind of race condition, but I’ll wait a bit with fixing it if it’s only occurring in isolated cases.
I have now fixed the issue mentioned in #190, and am in the process of re-rendering all affected pages. I have made a second little improvement to the migration script in the process which will improve the display of equation labels, which sometimes appeared too high up; basically LaTeX displays (things within $$...$$ or \[ ... \]) should have a blank line before and after. So far the parser is not enforcing the blank line before, but I’ll probably do so in the end; anyhow, if you see an equation label too high when previewing, adding a blank line should fix it.
do you see the behaviour you mentioned in #170
I use Firefox. The doubling happened several times when I opened that page. At one point I forced a reload of that page’s cache (Ctrl-Shift-R) and I haven’t seen the problem since.
(I had an open tab that still contained the problem but then Windows rebooted my system overnight for an update).
Interesting, thank you for the very helpful reply. This sounds to me like both your browser and mine might at that time have been using a cached version of the javascript that is used to create the table of contents; this issue did I think occur in an older version of the script when I was developing it. Let us work on this hypothesis for the moment, as I have not been able to trigger this issue myself either since. If you or anyone else comes across this again, do please let me know. (Caches are great for the end user, and also save the nLab some money in its current implementation, but something of a nightmare for a developer!)
I think actual redirects are important and serve purposes beyond that which can be achieved by renaming links at the point of use. Here are three that occur off the top of my head:
The author of a page may not know whether the page they want to link to is called (say) “colax transformation” or “oplax transformation”, and it’s silly to require them to look it up and find out; they should just be able to write a link to either of the equivalent names and have it work.
The page name linked to (which could be a redirect) provides semantic information about what concept the author of a page intends to link to, which is separate from the text that they want to display to the user. The latter need not be the name of the page or concept being linked to at all, and forcing the page name linked to to be the actual name of the page rather than potentially a redirect loses this semantic information.
As an example of how this semantic information is used, sometimes a term that used to redirect to one page is promoted to become its own page. For instance, suppose for the sake of argument that “commutative monoid” used to redirect to “monoid”, but eventually we decided we had enough to say about commutative monoids to put on a separate page. With a redirect, when this happens, all the pages that point to “commutative monoid” will immediately correctly link to the new page rather than the old one.
I’m sure there are other reasons too that I’m not thinking of immediately.
I have now completed fixing cases of the issue raised in #190 amongst the 1000+ pages rendered so far. I’ll have a go at fixing the tables as soon as I can. I also may make some tweaks to the caching mechanism before continuing the large-scale re-rendering/opening for editing.
Re #198: I wouldn’t say what you describe in #1 is ’silly’; it is more work, certainly, and one can then discuss whether or not it is worth it. Anyhow, I’d prefer to not debate this just yet as I need to focus my energies, but let’s definitely bring your points up again when I’m ready. The only thing I’m likely to be immovable upon is that redirects should not be handled in page source and that their historical changes should be kept track of, otherwise I’ll see what the debate brings. My main point is that one can in almost all cases use an explicit link, and it is perfectly fine to do so, so creation of redirects is not a serious blocker as such.
If anybody with programming expertise would in the meantime like to implement the handling of redirects following the pattern of what I have been doing elsewhere, they are welcome to do so; I’ll open up the source in that case, and we can bring forward the discussion of how to proceed. AWS Lambdas can be written in various languages, so one need not use Python if that is a blocker; one of the AWS Lambdas that I have written is in NodeJS, for instance. Let me know if so.
On a slightly different note, something I have thought about for a long time (long before the migration) and may have mentioned once or twice before, is that I feel we should have a ’crowd go-through’ of all the nLab pages. From the early years there is quite a lot of random stuff that can probably be ’deleted’ (it can remain in the database from the old nLab software, but need not appear in the new). But also, when one goes through pages like this, like I am doing to some extent when checking how the rendering has gone, one often finds little syntax errors and things like that to fix. Thus, in addition to providing a thorough check that everything looks fine in the new software (fixing anything which does not), it would serve as a general tidy up.
If people are interested, let me know. If so, I can start a different thread for this where I post links, perhaps in batches of 100, to opened-up pages, and people could then volunteer to check such batches, or parts of them.