
• CommentRowNumber1.
• CommentAuthorUrs
• CommentTimeJul 21st 2018

For size reasons I need to split this section off and re-!include it into geometry of physics – categories and toposes. (Hence nothing to be seen here, this is not a stand-alone entry. I am announcing it only since the system is forcing me to do so.)

1. Sorry for not having got around to addressing this yet. It is my highest priority.

• CommentRowNumber3.
• CommentAuthorUrs
• CommentTimeJul 21st 2018
• (edited Jul 21st 2018)

I’d really love it if the page loading could be made faster. But in this case I needed to split the file for an even more mundane reason: believe it or not, the edit pane’s reaction to keystrokes becomes sluggish beyond some text size (even on a good machine, as I have now…)

2. That is bizarre! I can only think there is some javascript running on each key press. I’ll look into it when I get the chance.

• CommentRowNumber5.
• CommentAuthorUrs
• CommentTimeJul 22nd 2018

I think Richard is in the process of fixing the huge rendering time issue! (e.g. try this or this)

That’s magnificent!

• CommentRowNumber6.
• CommentAuthorRichard Williamson
• CommentTimeJul 22nd 2018
• (edited Jul 23rd 2018)

Actually around this time I was driving my family home through Østerdalen, so unfortunately cannot take any credit for this! The only thing I may have contributed to was that I tested the loading time for these pages a couple of days ago, which will have generated the cache for the pages. Once the cache is there, they load quite quickly.

However, your message spurred me to try to do something! I think I may have come up with something which works in the short term for speeding up page rendering, without requiring massive changes to the codebase. Take a look at Sandbox now. It should load quite quickly (minus MathJax), regardless of which browser you are on, and regardless whether a cache already exists. Whereas if you load geometry of physics – perturbative quantum field theory, it will take a couple of minutes if the cache does not exist.

What I have done is to tweak what happens when a page is edited or created, so that the rendered content is saved in a directory on the nLab server. The rendered content was in fact already generated as part of saving a page edit or creation, it just was not stored. When the page is loaded, there is now a check whether the rendered content exists, and if it does, this will just be loaded in verbatim.
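The mechanism described here can be sketched roughly as follows. This is a hypothetical illustration only: `CACHE_DIR`, `render_page`, and the file layout are made-up names, not the actual nLab code.

```python
import tempfile
from pathlib import Path

# Illustrative cache directory; the real server uses its own location.
CACHE_DIR = Path(tempfile.gettempdir()) / "nlab_rendered"

def render_page(name: str) -> str:
    # Stand-in for the expensive renderer.
    return f"<html><body>{name}</body></html>"

def save_rendered(name: str) -> None:
    # On page edit/creation: store the freshly rendered HTML on disk.
    CACHE_DIR.mkdir(parents=True, exist_ok=True)
    (CACHE_DIR / f"{name}.html").write_text(render_page(name))

def load_page(name: str) -> str:
    # On page load: serve the stored HTML verbatim if it exists,
    # otherwise fall back to rendering on the fly.
    cached = CACHE_DIR / f"{name}.html"
    if cached.exists():
        return cached.read_text()
    return render_page(name)
```

The point of the design is that the expensive rendering already happens at save time, so serving the stored file on load costs almost nothing.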

There are a couple of slight gaps for the moment: if you make a link on a page A to a page B that does not yet exist, and then create page B, the link on page A will not be updated until you make an edit (any edit) to page A; and similarly for imports (if you import page B into page A and make a change to page B, it will not be reflected in page A until you make an edit, any edit, to page A). Also, for now there is no way to manually ask for the rendered content to be generated; one has to wait until a page is edited. Soon, probably tomorrow, I can fill these gaps and generate the rendered content for all pages. After that, I’ll push to github.

In summary, for now you will not see the benefit of the new rendering unless a page has been edited from now on (or within the last few minutes).

The saving of page edits still takes a long time, because I am still using the old page renderer. I will get to re-writing it eventually, and actually filling in the gaps above will help contribute to that, but for now I was looking just to make some improvement without major changes to fundamental parts of the codebase.

• CommentRowNumber7.
• CommentAuthorUrs
• CommentTimeJul 23rd 2018
• (edited Jul 23rd 2018)

Thanks for explaining!

Once the cache is there, they load quite quickly.

But that’s just what didn’t use to work: for long pages even the caching process was timing out, never producing a cached version. Anyway, it’s great that it works now.

There are a couple of slight gaps for the moment:

That’s okay. Better a little inconvenience for the author, than a fatal inconvenience for the readers.

Soon, probably tomorrow, I can fill these gaps and generate the rendered content for all pages.

Thank you so much!!

• CommentRowNumber8.
• CommentAuthorRichard Williamson
• CommentTimeJul 24th 2018
• (edited Jul 24th 2018)

I have continued working on this, and now have the ability to generate the content for all pages. There are still a few things to iron out, though, and there were some side-effects when generating content for all pages, which meant that I have for the moment switched off the new rendering (and reversed the side-effects). I will switch it back on once everything is in place.

• CommentRowNumber9.
• CommentAuthorUrs
• CommentTimeJul 25th 2018

Thanks, Richard. Just to alert you that currently all redirects seem to be broken.

(For instance see topos and scroll down a little.)

• CommentRowNumber10.
• CommentAuthorUrs
• CommentTimeJul 25th 2018
• (edited Jul 25th 2018)

Actually, something more complicated is going on: Some redirects do still work:

For instance in this Lemma:

1. the link to L-∞ algebra is broken (as is L-infinity algebra etc., all of which should be redirecting to L-infinity-algebra)

2. but the link to super vector spaces does work, redirecting to super vector space.

3. Thanks for the alert! My apologies. I think it’s a problem for pages which I began rendering yesterday evening. Will try to fix as soon as I can.

4. The problem seems to occur only when I render content manually; things seem fine when the rendered content is saved as part of saving a page. So I’ve switched the latter functionality back on. Trying to figure out why it works in one case and not the other. Basically, if you see broken redirects, you should be able to fix them by doing a trivial edit on the page being redirected to (so an edit to functor for ’functors’, etc.). There are not too many affected pages, but those that are affected were created early on in the nLab, so are likely to be widely used.

5. I believe I understand why the difference occurs now; there is a slightly complex asynchronous callback that happens when one saves a page (and in some other cases), where some relevant things happen. I will be able to fix everything so that all works with the new rendering (and it may speed up page saving as well, depending on the extent to which the slowness is due to the itex-to-MML rendering), but it will take a while; hopefully this evening I’ll be able to do it.

6. Making good progress, but not finished yet unfortunately. I’m ending up doing a rewrite of a large part of the renderer, which should make the code clearer, and should be more efficient.

• CommentRowNumber15.
• CommentAuthorUrs
• CommentTimeJul 26th 2018

Thanks, Richard! Sounds great.

Just not to forget about the redirect issue. Currently the nLab is in bad shape, in that loads of entries have broken links, and it takes re-saving both the entry that is being pointed to and the entry pointing to it in order to make the links re-appear. I have done a bunch now, but this will eventually need another solution.

This is not to rush you, but just not to forget.

• CommentRowNumber16.
• CommentAuthorRichard Williamson
• CommentTimeJul 26th 2018
• (edited Jul 26th 2018)

Yes, absolutely, my main motivation is exactly to fix this. I hope to be done this evening (which is typically the only time I have available). Currently I too have no way to fix this except editing affected pages (there are about 100 or 150 out of 13000 or so, but unfortunately they are quite fundamental pages). The new code will make things more modular, so that we have more control, and can more easily make fixes. My apologies for the continued inconvenience.

7. Almost ready to go live with the new renderer, which should fix the issues with redirects, etc. In particular, it seems to be handling correctly (and instantaneously, on the examples I have tried) the rendering of links, redirects, and includes. It remains to join up the new renderer (an API written in Python) with the ruby/rails code which controls the rendering, and to strip off some things from the old renderer; will address this tomorrow as soon as I get the chance, it should not take too long.

8. I will be trying to make the new functionality live over the next minutes. This may cause some disruption; apologies if so. Will update here.

9. I have been testing using a new endpoint for triggering the rendering of content manually. The new rendering logic seems to be working correctly in all cases, except that there is some slightly bizarre behaviour from the old renderer (which is still used for the moment as part of the process, it just handles less now) when it comes to category: syntax. It should be straightforward to fix this, but I’ll take a break for the moment. Once this is fixed and I’ve tested further, I’ll switch on the new renderer for page saves, and render content for all pages. Will update when I begin testing again.

• CommentRowNumber20.
• CommentAuthorUrs
• CommentTimeJul 27th 2018

Thanks a million, Richard!!

10. Thanks very much for the encouragement, it is greatly appreciated!

I have now fixed the category: issue, and my testing now seems to indicate no problems. So I am going to switch on the new renderer for page saves, and also begin triggering rendering of content for all pages.

This is a big change to a fundamental part of the codebase, so it is highly likely there will be some gremlins. Please let me know if you see something strange. I will explain a bit more about the technical details of the change once everything is up and running.

• CommentRowNumber22.
• CommentAuthorUrs
• CommentTimeJul 27th 2018

I am enthusiastic about your work on this!

Right now saving produces 500 errors. So I’ll take a little break from editing, until you are done with the changes you are making…

11. Yes, please do. In the process of switching to the new renderer for page saves. Trying to minimise 500s, but there have indeed been a couple of short periods where they appeared. I’ll update as soon as things are working.

12. I need to make a small change to something; this will take a few minutes, during which that functionality will not be available. So for now the old renderer is still being used for page saves, and the nLab is safe to edit as normal. I will update when I begin trying again to switch to the new renderer.

• CommentRowNumber25.
• CommentAuthorUrs
• CommentTimeJul 27th 2018
• (edited Jul 27th 2018)

Hi Richard,

will go offline now. One issue I just noticed: When saving an entry after just a trivial edit (in order to force redirects) currently I keep getting this error message:

no implicit conversion of Fixnum into String

• CommentRowNumber26.
• CommentAuthorRichard Williamson
• CommentTimeJul 27th 2018
• (edited Jul 27th 2018)

Hi Urs, no problem! Sorry for that message, I had to make an error message appear so that I could debug (there was nothing in the logs)!

I have now done what I needed to do and have switched on the new page renderer for page edits/creations. It seems to be working as far as I can see. Let me know if problems are seen. Later, I will begin triggering the rendering of all content manually. One should be able to edit the nLab as usual from now on.

• CommentRowNumber27.
• CommentAuthorUrs
• CommentTimeJul 27th 2018
• (edited Jul 27th 2018)

Oh dear, I just see that, at the moment, the entry geometry of physics – categories and toposes is pretty broken: only the middle 3 (!?) of 7 include-file chapters are actually included.

I understand you are in the middle of making changes; I mention this just since you asked for issues that arise to be reported.

13. Hi Urs, thanks for reporting this. It should be fine now, I think? In general, there will be some missing links until I render all the pages, but there was also a bug in this case, which I’ve now hopefully fixed, so very good you mentioned it!

• CommentRowNumber29.
• CommentAuthorUrs
• CommentTimeJul 28th 2018
• (edited Jul 28th 2018)

Hi Richard, thanks for looking into it! The content seems to be back, thanks. But let’s see: most of the \ref-s are broken (see the first two big tables right at the beginning; all their \ref-s come out unrendered). And the table of contents comes out empty.

14. Thanks for the update! I believe the broken refs may be to do with the 500 error from the other page, let’s see once that is fixed. The table of contents I’ll look into, it may be the same problem.

15. Now that the 500 error on the other page is fixed, I think things are OK now? At least the references seem fixed. I’m not sure exactly which table of contents you were referring to, but the rendering looks correct from what I could see at a glance; just let me know if there are still issues.

• CommentRowNumber32.
• CommentAuthorUrs
• CommentTimeJul 28th 2018
• (edited Jul 28th 2018)

Thanks!! Everything is in place now.

Could you remind me about the strategy on the remaining broken redirect links? Since you said it affects about 100 pages, and since we have already dealt with a few of these, maybe I should just keep doing that resaving by hand, until all of them are dealt with.

16. Hi Urs, am looking into this. Basically redirects should be working, but my guess is that what you see is a consequence of the increased strictness of validation that I mentioned. The originally affected pages should now be fixed, I think, but there will still be issues wherever there is a ’multiple redirect’. The only way to fix it is to edit to avoid the duplicated redirects.

17. But let me look into it further.

• CommentRowNumber35.
• CommentAuthorRichard Williamson
• CommentTimeJul 28th 2018
• (edited Jul 28th 2018)

Yes, the diagnosis in #33 was correct. The confusing thing was that in addressing the issue that was causing the 500 error that you mentioned in the other thread, I introduced a new issue which was preventing correct rendering, and in particular prevented the correct error message appearing when you edited small category. This should now be fixed, and if one tries to save small category, you should get an error message of the kind I mentioned about duplicate redirects.

• CommentRowNumber36.
• CommentAuthorUrs
• CommentTimeJul 28th 2018
• (edited Jul 28th 2018)

Okay, I am looking at it, but I don’t yet follow the last thing you said.

There are still plenty of broken redirects. I am pointing to the “geometry of physics” pages only because lots of them are easily seen there by just scrolling through (or searching for “?”, e.g. in perturbative quantum field theory), but I see them also elsewhere.

But I gather you are still changing things in the background. For at categories and toposes the former problem is also back: the \ref-s do not render and the table of contents does not appear (the word “Contents” does appear, but after that the text starts immediately, with its first headline).

Just reporting. Not rushing you. Thanks for your immense work! The speedup is great!

18. Will continue working on this. There are some issues with getting the geometry of physics pages into a stable state, due to their size (once we get into a stable state, things should be fine thereafter).

• CommentRowNumber38.
• CommentAuthorUrs
• CommentTimeJul 28th 2018

Okay, thanks!!

19. Have been continuing working. I almost have what should be a fix for the table of contents and references completed, but could not quite finish this evening. Will complete tomorrow.

20. I think things are basically working now. Table of contents and references within pages now seem to work in conjunction with includes. I have triggered a rendering of all content, which should fix all redirects, etc.

One issue I encountered was that I could not increase the read timeout beyond 100s, because Cloudflare is hard-coded to this value. This was problematic for triggering the rendering of content/saving of edits, which in the first instance follows a tree through the wiki according to includes, redirects, etc. I have got around it by some trickery (basically I return 200 immediately and store the requests in a queue) for the manual triggering of content rendering; but such trickery for the saving of edits, while perhaps not bad from a design point of view, would require a bit of work which I do not have time for this evening. I hope that once all content is rendered, saving of edits will be quick enough to be under the timeout in all cases.
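The “return 200 immediately and store the requests in a queue” trick can be sketched like this. This is an illustrative sketch only; none of these names come from the actual codebase.

```python
import queue
import threading

# Pending render requests wait here instead of blocking the HTTP response.
render_queue: queue.Queue = queue.Queue()

def handle_render_request(page_name: str) -> int:
    # Acknowledge at once, so the response goes out well inside the
    # proxy's hard-coded 100 s read timeout...
    render_queue.put(page_name)
    return 200

def start_worker() -> threading.Thread:
    # ...and drain the queue in the background at whatever pace the
    # expensive rendering actually takes.
    def work() -> None:
        while True:
            name = render_queue.get()
            if name is None:  # sentinel to stop the worker
                break
            # The expensive rendering of `name` would happen here.
            render_queue.task_done()

    t = threading.Thread(target=work, daemon=True)
    t.start()
    return t
```

The design choice here is standard: decoupling acknowledgement from completion means the client can no longer observe success or failure of the render itself, which is why this is easier to justify for manual re-rendering than for page saves.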

The only issue I know about for the moment is the table at Lie algebra. Will look into that when I get the chance.

I will let the dust settle and see whether there are reports of further issues. If not, I’ll push some time over the next few days to github and make a summary announcement in a different thread of the changes that have been made.

• CommentRowNumber41.
• CommentAuthorRichard Williamson
• CommentTimeJul 29th 2018
• (edited Jul 29th 2018)

I should say that the rendering of all content will take several hours; only after that will hopefully redirects and links be fully working.

• CommentRowNumber42.
• CommentAuthorUrs
• CommentTimeJul 30th 2018
• (edited Jul 30th 2018)

Hi Richard,

thanks for your work! I gather it’s tricky to deal with the existing code. Please bear with me, as I describe various issues that I see this morning. Please let this not discourage you! It’s all I can do at the moment to help.

At the moment, on my end, the “geometry of physics” entries still show broken links and unrendered \ref-s. I see that the \ref-s in the first few tables do render now, but a little further down many broken ones appear (one example: screenshot).

I tried to see if re-saving the entry may help, but when trying to save it I get the error message: “An unexpected error occurred when rendering the page” (screenshot)

The chapter “categories and toposes” also gets confused when a second “floating table of contents” appears at the beginning of the include page for “homotopy types”: The “Context”-header of that table has become part of the table of contents and mixes up the actual chapter headline (screenshot)

I tried removing the extra floating TOC, but saving the entry gives me the error message “You must use an HTTP post” (screenshot)

Then I tried some smaller entries. At topos there are broken links. One of them is to local homeomorphism. Trying to re-save that entry I get an XML parsing error (screenshot).

• CommentRowNumber43.
• CommentAuthorRichard Williamson
• CommentTimeJul 30th 2018
• (edited Jul 30th 2018)

There were still significant issues. I think that these now may be solved, but it needs another round of rendering everything, which I have begun. It may take most of the day before it’s completed, because I need to trigger batches of pages manually. The one about the second TOC I was aware of, thanks, I am treating that as minor for now :-).

Apologies for continued disruption. For reasons I will explain once things are stable, the logic is quite delicate.

Apologies also for infrequent updates here. I am for the moment just focusing on trying to get the nLab in good fundamental shape. Once that is done, I will reply in more detail to the bug reports.

• CommentRowNumber44.
• CommentAuthorUrs
• CommentTimeJul 30th 2018

Thanks a million Richard! You deserve a special medal for this.

• CommentRowNumber45.
• CommentAuthorRichard Williamson
• CommentTimeJul 30th 2018
• (edited Jul 30th 2018)

Thank you for your patience! Still a lot of pages left to render, but this time, finally, I am optimistic that things are looking as they should (up to some minor issues like the Context in TOC).

• CommentRowNumber46.
• CommentAuthorUrs
• CommentTimeJul 30th 2018

Thanks, Richard. Let me know when I should check.

21. Will do. It is going to be a while, probably some time this evening. But much of the nLab should begin to look and function more or less as normal in the meantime.

• CommentRowNumber48.
• CommentAuthorRichard Williamson
• CommentTimeJul 30th 2018
• (edited Jul 30th 2018)

Things did indeed go well. I did not go through with the complete rendering, though. I fixed a couple of further smaller things first, and added a little more functionality (it can now render file uploads correctly, and links to external webs). I also tweaked things so that I can trigger rendering of all content without needing to trigger it manually in batches. So I’ll now let it run through the night, with a completely clean slate (I have wiped all previously generated content). Will update tomorrow.

[Edit: actually, looks like something is causing small issues still. Will fix tomorrow. There may be trouble editing pages until then, my apologies.]

• CommentRowNumber49.
• CommentAuthorUrs
• CommentTimeJul 31st 2018

Hi Richard, what might cause the \ref-s not to render? I tried again to resave the entries, in the hope it might fix them, but saving still produces error messages.

22. Hi Urs, the nLab was still not in a stable state. (When I said things were going well, I meant that the rendering was working mostly as expected, not that we were completely done :-). I will make a clear announcement when I think the nLab is stable.

I have now fixed the issues that I mentioned in #48, and am testing again. If all looks good, we can try once more with a re-rendering of all content.

23. Editing should hopefully be stable now, by the way. I am currently testing on the geometry of physics pages, so perhaps do not edit those until I update, if that’s OK?

• CommentRowNumber52.
• CommentAuthorUrs
• CommentTimeJul 31st 2018

Sure! Thanks!

24. Hi Urs, I see you’re editing at the moment :-). If you see this, just make sure you save your work before submitting, because until more of the content is rendered, it is possible that you might get a 504 (timeout). If enough content has been rendered, there should be no problem, so you can certainly try submitting.

• CommentRowNumber54.
• CommentAuthorUrs
• CommentTimeJul 31st 2018

Hi, okay, yes, I was trying to make some edits elsewhere. But all attempts to save produce error messages. Not timeouts, though, but long messages about ill-formed code.

• CommentRowNumber55.
• CommentAuthorRichard Williamson
• CommentTimeJul 31st 2018
• (edited Jul 31st 2018)

I see the error message that occurred when trying to save group extension. Actually, the renderer is behaving as intended here. There is an error in one of the links to an nLab page on this page, and the renderer is correctly detecting this. I could fix it myself, but maybe it will be of use if you see whether you can understand from the error message where the problem is; because there are going to be more pages with this problem, and it will be good if more people than me understand what to do! It is maybe a bit unfortunate that the error message is so long, but it is basically showing you everything that is inside the link, i.e. it is so big because the link has not been closed correctly.

• CommentRowNumber56.
• CommentAuthorUrs
• CommentTimeJul 31st 2018

Ah, got it! And fixed it now.

Thanks for your hint. Indeed, first I didn’t understand at all what the error message was trying to say.

25. Another example, of a slightly different kind, where the renderer is correctly picking up an issue is at theory, if someone would like to take a look and see if they understand how to fix it. In this case, the renderer expects sub-headings to proceed incrementally, i.e. ### should come after ##, it will protest if #### comes after.
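The rule being enforced here can be sketched as a small check. This is an illustrative version assuming Markdown-style `#` headings, not the actual validation code in the renderer.

```python
import re

def check_heading_levels(source: str) -> list:
    # A heading may go at most one level deeper than the previous one:
    # e.g. ### may follow ##, but #### may not.
    errors = []
    previous = None
    for line in source.splitlines():
        match = re.match(r"(#+)\s", line)
        if not match:
            continue
        level = len(match.group(1))
        if previous is not None and level > previous + 1:
            errors.append(
                f"level-{level} heading after level {previous}: {line.strip()}"
            )
        previous = level
    return errors
```

For example, a page containing `## A` followed directly by `#### B` would be reported, while `## A`, `### B` passes.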

26. Re #56: great! Once things settle down, we can try to find a way to make the error messages clearer.

In general, I think things are looking excellent now. The renderer is chugging away at rendering all content, and everything is proceeding as expected, i.e. most things are rendering correctly, but there is the occasional issue with the fact that the new renderer is slightly stricter in its validation, which is a bit painful for now, but will be beneficial in the long run, I feel.

The only problem I see is with the references at geometry of physics, and with some tables. I will look into that this evening, probably.

• CommentRowNumber59.
• CommentAuthorUrs
• CommentTimeJul 31st 2018

Thanks, I’ll try to fix theory now.

Meanwhile, I just had the following issue: removed the redirect “central extension” from group extension. Saving didn’t produce any error message, but the redirect is still in effect now!

27. If anybody sees a page with a missing link where there should be one, by the way, this is probably because the renderer has not got to the affected page yet. In such a case, you can try running the following in your terminal on your computer.

curl -X PUT https://ncatlab.org/rendered_content?name=url_escaped_name_of_page

Where url_escaped_name_of_page is, usually, the name of the page with spaces replaced by +. E.g. classifying+topos for classifying topos, if there were a missing link to this page.

You will not get any output after running the command, but it will trigger the renderer for the page. After a while (or maybe very quickly, depending on which other pages in the tree of pages which link to it or to which it links have been rendered), if you refresh the page on which the link was missing, you should see that the link now appears.

If you don’t see anything even after a little while (say 10 minutes), let me know here; there is probably a rendering error. Indeed, you can check this by trying to make a trivial edit to the affected page (i.e. classifying topos in this case).
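The URL escaping described above can be reproduced with Python’s standard library; the helper name here is hypothetical, and only the endpoint URL comes from the comment above.

```python
from urllib.parse import quote_plus

def rendered_content_url(page_name: str) -> str:
    # quote_plus replaces spaces with '+' and escapes other URL-unsafe
    # characters, matching the convention described above.
    return "https://ncatlab.org/rendered_content?name=" + quote_plus(page_name)
```

For instance, `rendered_content_url("classifying topos")` produces the URL ending in `name=classifying+topos`, which can then be passed to `curl -X PUT`.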

28. Re #59: thanks for mentioning this, let me check. Could easily be a bug there, because it’s not something that has been tested much.

29. I have removed the redirect manually in the meantime.

• CommentRowNumber63.
• CommentAuthorUrs
• CommentTimeJul 31st 2018

Thanks!

• CommentRowNumber64.
• CommentAuthorUrs
• CommentTimeJul 31st 2018

Now in trying to create an actual entry central extension, I am getting the timeout error that you mentioned in #53. Re-trying, I get the 500 error. Should I just wait?

• CommentRowNumber65.
• CommentAuthorRichard Williamson
• CommentTimeJul 31st 2018
• (edited Jul 31st 2018)

Could you try again now? Let me know if the 500 error still occurs.

If it were not a page creation but instead just an edit, you could follow the instructions in #60 for the page you are editing and then retry after some time has elapsed; but for a page creation we do not have that option.

• CommentRowNumber66.
• CommentAuthorUrs
• CommentTimeJul 31st 2018

Just tried again creating the entry. After hitting “submit” it took a minute or two, then it produced the error message “You must use an HTTP POST”.

• CommentRowNumber67.
• CommentAuthorDavid_Corfield
• CommentTimeJul 31st 2018

I had that error message this morning, but the edit had taken place.

• CommentRowNumber68.
• CommentAuthorUrs
• CommentTimeJul 31st 2018

Not in this case: central extension does not exist yet

• CommentRowNumber69.
• CommentAuthorRichard Williamson
• CommentTimeJul 31st 2018
• (edited Jul 31st 2018)

I have now fixed the issue in #59, I believe.

Re #66: I see, yes, that is the timeout issue. Unfortunately, we will just have to wait for the moment (but see below). For edits, as I say, there is a workaround, but not for creates. I plan to rework the page submission process to avoid this timeout issue, but I cannot do that right now.

Here is a hack that might work. Try creating a redirect for ’central extension’ in the Sandbox, and then use the curl method in #60 on the Sandbox. This should render all pages on which central extension is linked. It will take a while, but once that process is completed, you should be able to remove the redirect in the Sandbox and create the page, I believe.

30. Actually that will end in a timeout as well, I think, but I can get around it manually. Will try shortly.

• CommentRowNumber71.
• CommentAuthorUrs
• CommentTimeJul 31st 2018
• (edited Jul 31st 2018)

Hi Richard,

yes, I just tried, and it took a long time and then produced a 524 error, which flashes briefly, and then turns into “You must use an HTTP POST”. I suppose that’s the timeout error?

But it’s not so important, if this will sort itself out as the entries get automatically re-rendered (amazing how many hours that takes, isn’t it? Must be more than 10 hours?)

If you ask me about priority issues, for me it’s my lecture notes (sorry if I am being egoistic here, but it makes me mighty nervous). I give a lecture on the material in “categories and toposes” in two weeks, with the announcement starting now, and give oral exams on the material in “perturbative quantum field theory” in four weeks, with students needing it for preparation. Currently the entries are not usable. Did you see that as of today not only the \ref-s and the tables are broken, but now also the math rendering is broken?

I sure know that you have other things to do than look into that. But you are my last hope.

31. Hi Urs, I understand. I am sure we can fix the pages for your lecture notes by this evening some time. I think there is just some issue with the reference parsing that is skewing some of the other content; once the references are fixed, everything, except possibly the tables (which I’ll look at separately in that eventuality), should be fine.

Yes, what you describe is the timeout error.

• CommentRowNumber73.
• CommentAuthorUrs
• CommentTimeJul 31st 2018

All right, thanks!! I’ll owe you a beer or two once we are through with this.

• CommentRowNumber74.
• CommentAuthorRichard Williamson
• CommentTimeJul 31st 2018
• (edited Jul 31st 2018)

Just to give you a little more comfort: all the individual pages which you include (e.g. geometry of physics – basic notions of category theory) look fine to me. So it is indeed just some problem with the references, because that is the only thing that changes with the overall page: the actual page content is just inserted verbatim.

I had the references working correctly at one point, so it should be fixable :-).

32. I believe I have found the issue with table rendering. One would think Maruku’s rendering of tables would be idempotent, but no: if you run it on a table using the Markdown Extra syntax that we usually use, and then run it again on the result, you get a complete mess. It should, however, be easy to fix: working on it. I am not completely sure, but I think this may also fix the references, because the main issue is just that the javascript script which handles the referencing fails due to problems with the HTML. Will update.

• CommentRowNumber76.
• CommentAuthorUrs
• CommentTimeJul 31st 2018
• (edited Jul 31st 2018)

Thanks, Richard! I really appreciate it.

Yes, the \ref-s did work before. That was at #32 above, three days ago. Maybe we can remember what it is you changed after that?

There may be two issues interacting here: while at geometry of physics – basic notions of category theory there are no \ref-s in the rendered output at all, some of the reference numbers just come out as whitespace (screenshot).

It roughly looks like this is an incremental effect, affecting more and more \ref-s as the page progresses. But I am not sure.

• CommentRowNumber77.
• CommentAuthorUrs
• CommentTimeJul 31st 2018

Richard, you probably saw it in another thread, but just to highlight: The problem with math not rendering appears also in a “small” entry, here. Maybe that helps with isolating the cause?

33. I think I have fixed the table rendering now, and I think the HTML looks fine now.

But references still problematic. Will keep digging. Well spotted regarding the issues in the screenshot, that is helpful to know.

The reference rendering has not changed since three days ago, but something else is interfering with it; the question is what :-). Will get there soon, I think.

• CommentRowNumber79.
• CommentAuthorRichard Williamson
• CommentTimeJul 31st 2018
• (edited Jul 31st 2018)

Re: #77, that is the issue with table rendering. Will be gone once content is regenerated. Edit: hmm, actually not! Will take a look. Edit 2: same kind of issue I think, should be fixable. Will be stopping soon, continuing this evening. Edit 3: actually, I think it is a table issue after all!

34. Quick update.

1) It should be possible to create ’central extension’ now.

2) The page geometry of physics – basic notions of category theory looks correct to me now.

3) Still some issues at geometry of physics – categories and toposes, which I am continuing to look into.

35. Finally, I think geometry of physics – categories and toposes is displaying as it should. Initially, the references at geometry of physics – basic higher topos theory were not displaying, but I have tweaked Adeel’s javascript script fractionally to sort this out.

Also signs in supersymmetry is fixed.

Overall, I think we might be done. I just need to render all pages, which is proving remarkably difficult: Instiki keeps giving up after a while. Will try again overnight.

Let me know about any issues.

• CommentRowNumber82.
• CommentAuthorUrs
• CommentTimeJul 31st 2018

Thanks, Richard!

I see one remaining issue at categories and toposes: From Example 1.16 on, all hyperlinked words have disappeared (not just the hyperlinks, but the text being hyperlinked).

Checking perturbative quantum field theory I see that it still has the following issues: empty \ref-numbers, math not rendering (for instance </semantics> appears where math should be), and toc not rendering (below “Contents” it has just 1. \left\langle). But I guess you now know immediately how to fix it (maybe I should re-save)?

• CommentRowNumber83.
• CommentAuthorUrs
• CommentTimeJul 31st 2018
• (edited Jul 31st 2018)

Hm, I see that the issue of hyperlinked text not displaying is confined to the first include chapter, but opening that separately, the text does appear. (?!)

• CommentRowNumber84.
• CommentAuthorUrs
• CommentTimeAug 1st 2018

Hi Richard,

please bear with me, but one more: Just tried creating “central extension”, but it still times out.

Maybe one last round of your efforts will be necessary. Sorry for all the trouble, I am extremely grateful that you are looking into this.

• CommentRowNumber85.
• CommentAuthorUrs
• CommentTimeAug 1st 2018
• (edited Aug 1st 2018)

The following two issues from #82 seem to have been resolved: hyperlinked words disappearing and \ref not appearing! Great!

Issues that remain: ToC at perturbative quantum field theory does not appear, central extension times out when trying to create it.

Also, there remain gray links. Many fewer than a while back, but still a fair number of them. I keep re-saving the corresponding entries by hand.

In some cases (but not in all, it seems to me?!) this shows that the problematic entries are rejected by the new stricter parser. For instance I just fixed enriched functor, which would not save anymore because there was a typo in the name of the !include-file for the floating TOC.

• CommentRowNumber86.
• CommentAuthorUrs
• CommentTimeAug 1st 2018
• (edited Aug 1st 2018)

One “small” entry with a fair number of broken links is advanced and retarded causal propagators.

I just tried re-saving some of the linked entries. In each case I got the following:

1. first time saving I get a time-out

2. saving once more: then it works, in that it does not produce an error message – but the redirects seem still not in place

edit: no, the re-directs are in place! after double-saving the linking entry, too.

Okay, so in principle I guess I can fix all these links by hand. But since one has to wait for a time-out, it takes a lot of time…

• CommentRowNumber87.
• CommentAuthorUrs
• CommentTimeAug 1st 2018

I keep re-saving entries….

From time to time I get this message at the top of a newly loaded entry:

An error occurred when re-rendering content of some pages. Page ID: 4276. Error: ContentRenderer::ReRenderError

Clearly it’s telling me that there is some syntax issue, but I don’t know where and what!

• CommentRowNumber88.
• CommentAuthorUrs
• CommentTimeAug 1st 2018

also this:

An error occurred when re-rendering content of some pages. Page ID: 18819. Error: undefined method ‘captures’ for nil:NilClass
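That error is the classic Ruby symptom of calling `.captures` on the result of a regex match that failed and returned nil. A hedged sketch of the pattern follows; the method name and regex are hypothetical, since the actual nLab code is not shown in this thread.

```ruby
# "undefined method `captures' for nil:NilClass" arises when a regex
# match is assumed to succeed: String#match returns nil on failure, and
# nil has no #captures method. Guarding against nil avoids the crash.
def redirect_names(source)
  m = source.match(/\[\[!redirects\s+([^\]]+)\]\]/)
  return [] if m.nil?  # without this guard, m.captures raises the error above
  m.captures
end

redirect_names("plain text")                   # => []
redirect_names("[[!redirects Fourier mode]]")  # => ["Fourier mode"]
```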

• CommentRowNumber89.
• CommentAuthorUrs
• CommentTimeAug 1st 2018
• (edited Aug 1st 2018)

Some links in geometry of physics – perturbative quantum field theory are broken not because of redirects, but because of a spurious whitespace in the source code.

For instance I just fixed a broken occurrence of [[causal additivity]] so that the link works again.

This particular case had a line break in the source code, right after causal, so I guess that was somehow the cause.
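One possible repair for such breakage, sketched in Ruby: collapse internal whitespace (including line breaks) inside [[...]] links before the page lookup. This is an assumption about how a fix could look, not the actual Instiki code.

```ruby
# Normalize whitespace inside wiki links, so that a link split across a
# line break, e.g. "[[causal\nadditivity]]", resolves the same way as
# "[[causal additivity]]".
def normalize_wiki_links(source)
  source.gsub(/\[\[([^\]]+)\]\]/) { "[[#{Regexp.last_match(1).split.join(' ')}]]" }
end

normalize_wiki_links("see [[causal\nadditivity]] here")
# => "see [[causal additivity]] here"
```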

36. Will reply properly later. Just wished to let you know that I have been working on fixing things. As you noticed, some things have now been fixed. I believe I understand how to fix the rest and will try to finish this off later. The pages are at least reasonably usable now, I believe.

The grey links are because the renderer has still not run on all pages, though it is running now, and does seem like it will go through all pages this time.

• CommentRowNumber91.
• CommentAuthorUrs
• CommentTimeAug 1st 2018

Great, Richard. Thank you. Really great.

• CommentRowNumber92.
• CommentAuthorRichard Williamson
• CommentTimeAug 1st 2018
• (edited Aug 1st 2018)

Quick update: I have tweaked things so that when one edits a page, we wait only for the immediate page; the ones which for instance have a link to it are rendered asynchronously. This allowed for example central extension to be created. In extreme cases, one may still encounter a timeout, but most entries should be fine now. I will rework some things eventually to try to avoid timeouts completely.
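The deferred-rendering scheme described here can be sketched as a small producer/consumer setup: render the edited page synchronously, and queue the pages that link to it for a background thread. All names below are hypothetical; this illustrates the idea, not the actual nLab code.

```ruby
RENDERED = Queue.new  # records rendered pages, for illustration only

# Hypothetical stand-in for the real page renderer.
def render(page)
  RENDERED << page
end

WORK_QUEUE = Queue.new
Thread.new do
  # Background worker: re-renders dependent pages as they are queued.
  while (page = WORK_QUEUE.pop)
    render(page)
  end
end

def save_page(page, linking_pages)
  render(page)                               # wait only for the edited page
  linking_pages.each { |p| WORK_QUEUE << p } # dependents render asynchronously
end
```

With this arrangement the editor's request returns as soon as the edited page itself is rendered, which is presumably why creating "central extension" no longer times out.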

Reference rendering on pages with includes now seems fixed.

For some reason (I can only assume it is Rails’ garbage collection mechanism), the renderer refuses to run all the way through the nLab in one go, so I still have to do it manually in batches. The majority of pages have now been rendered, but not all, so there will still be grey links here and there for the moment. Fewer and fewer, though.

The main thing left is to fix the table of contents on included pages. Nearly done with this, but am done for today now.

• CommentRowNumber93.
• CommentAuthorUrs
• CommentTimeAug 2nd 2018

Thanks, Richard!

At the pQFT page there are only 189 broken links left now (just counted them :-), so we are getting closer.

I just re-saved another batch of 20 entries by hand, but some don’t want to come out. I’ll try re-saving the individual !include-files once more…

• CommentRowNumber94.
• CommentAuthorUrs
• CommentTimeAug 2nd 2018
• (edited Aug 2nd 2018)

Oh, I see: those hyperlinks that are still broken seem to be mostly those where a spurious whitespace has appeared in the source code (as in #89). Guess we’ll have to hunt down and fix all of these by hand…

• CommentRowNumber95.
• CommentAuthorUrs
• CommentTimeAug 2nd 2018
• (edited Aug 2nd 2018)

Hah, I am down to 34 broken links at pQFT. Re-saving !include entries and removing spurious whitespace did the trick.

Except for these two remaining problems:

1. None of the redirects to Fourier transform works, and re-saving doesn’t change this (e.g. Fourier analysis and Fourier mode should be redirecting to Fourier transform, but don’t).

2. the !include-entry A first idea of quantum field theory – Fields does not allow me to save it (all remaining broken links are from here!) On attempted save, I get this error message:

    undefined method `captures' for nil:NilClass
37. I believe that the table of contents are now working for pages with includes.

Still not all pages have been rendered. I am gradually filling them in. There are 2-3 big batches left (about 2000 pages); then we will have to fill in the rest by hand, because they will be the ones with errors.

There is a problem with rendering the page Fourier transform, which is causing problem 1. in #95. It is actually not the new renderer itself but Maruku that is complaining, which is strange. I will take a look at this and at 2. in #95.

38. Fourier transform can be rendered now. The pages which link to it are now being re-rendered.

39. Hmm, but the Maruku issue seems to still be preventing the redirect from being created. Will look further into it.

• CommentRowNumber99.
• CommentAuthorUrs
• CommentTimeAug 2nd 2018
• (edited Aug 2nd 2018)

Thanks!!

Do you see why … – Fields does not save?

40. Fixed Fourier transform now. The issue was a \ref{} that was not closed properly. I suspect the same at the other page; I can try to check now. Maruku’s error message is terrible in this case, but the new renderer’s behaviour is also not good: it should do some kind of check to validate the reference. I was thinking about that yesterday, but did not know of an example where it was problematic until now.
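The validation pass suggested here could be as simple as scanning each line for a \ref or \label whose closing brace never appears, before the source is handed to Maruku. A sketch under the assumption that references never span lines; this is not the actual renderer code.

```ruby
# Return [line_number, fragment] pairs for every \ref{...} or \label{...}
# that is missing its closing brace on its line.
def unclosed_refs(source)
  source.each_line.with_index(1).flat_map do |line, number|
    line.scan(/\\(?:ref|label)\{[^}\n]*$/).map { |fragment| [number, fragment.strip] }
  end
end

unclosed_refs("see \\ref{eq:one} and \\ref{eq:two\nmore text")
# => [[1, "\\ref{eq:two"]]
```

Running such a check at save time would turn Maruku's opaque failure into a precise error message pointing at the offending line.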