Is there a reason for ’abelian topological group’ when ’topological abelian group’ exists already? I have only been skimming the stuff on condensed mathematics, so I am not sure if this is, like ’presheaf of sets’, a concept ’with attitude’ or not.
Turning analytic geometry into algebraic geometry sounds like making things a lot harder!
Re #2, oh yes, I got confused by the abbreviation AbTop.
Re #3, well Lawvere promises something similar at bornological topos.
added these pointers – this is a neat story unfolding:
Formalization/verification in proof assistants (Lean):
Peter Scholze, Liquid tensor experiment, December 2020
Peter Scholze, Half a year of Liquid Tensor Experiment: Amazing developments, June 2021
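For readers who have not used Lean, here is a toy example (purely illustrative, and nothing to do with the actual mathematical content of the Liquid Tensor Experiment) of the kind of statement-plus-proof that such a proof assistant checks:

```lean
-- Toy Lean 4 example: a named theorem whose proof the kernel verifies.
-- The Liquid Tensor Experiment checks far deeper statements, but the
-- workflow (state a theorem, supply a proof, let Lean check it) is the same.
theorem toy_add_comm (a b : Nat) : a + b = b + a :=
  Nat.add_comm a b
```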
Yes, it’s very impressive. I think the really tricky stuff is doing topological homological algebra, where exactness for instance is only up to some bounded error, and everything is graded by $\mathbb{R}_{\geq 0}$ or something like that. This was the bit Scholze was worried about, because there were so many fiddly estimates, and he apparently had no good conceptual grasp on what the proof was supposed to be doing and why it worked. Now that issue has been addressed.
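To give a rough sense of what ‘exactness up to bounded error’ means here (a schematic paraphrase of the notion used in the Liquid Tensor Experiment, not the precise statement): for a complex of filtered abelian groups $M^\bullet$ with filtration pieces $M^i_{\leq c}$, one does not ask that every cycle be a boundary, only that it become a boundary after worsening the filtration parameter by a fixed constant $k \geq 1$:

$$ \text{for all } c \text{ and all } x \in M^i_{\leq c} \text{ with } d x = 0, \text{ there exists } y \in M^{i-1}_{\leq k c} \text{ with } d y = x. $$

Keeping track of these constants through all the homological algebra is where the fiddly estimates come from.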
I mean that it’s neat how the community joined forces to formalize this, and that it worked out, and that it seems to be the beginning of a new tradition of practical mathematics.
On the other hand, it seems a little weird that to just lay foundations for real analysis in condensed mathematics one should need a hyper-technical lemma almost beyond human grasp. But apparently that’s what the real world looks like when seen through perfectoid eyes.
On the other hand, it seems a little weird that to just lay foundations for real analysis in condensed mathematics one should need a hyper-technical lemma almost beyond human grasp.
I think this is a case of having tools which one knows how to wield (those of modern ’algebra’, i.e. topos theory, homological algebra, etc), and finding a way to be able to apply them/’forcing’ them to apply even in situations where it is not immediately obvious that one could. Probably easier and conceptually clearer ways to do it will be found in the future; it is really quite a natural idea to try to treat real analysis algebraically using sheaves on profinite sets, given that decimals are very naturally profinite.
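To spell out that last remark (a standard observation, with one convenient choice of presentation): the map

$$ \prod_{n \geq 1} \{0,1,\dots,9\} \longrightarrow [0,1], \qquad (d_n)_{n \geq 1} \mapsto \sum_{n \geq 1} d_n \, 10^{-n} $$

is a continuous surjection from a profinite set (a countable product of finite discrete sets) onto the interval, failing to be injective only at the reals with two decimal expansions. So $[0,1]$ is a quotient of a profinite set, which is one concrete way to see why sheaves on profinite sets can detect real-analytic structure.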
As I have mentioned somewhere before, I see this as in the spirit of mathematics over $\mathbb{F}_1$: one is finding a way to achieve one of the major goals of mathematics over $\mathbb{F}_1$, namely to truly be able to treat number theory over the archimedean place on the same footing as non-archimedean arithmetic geometry. I still also think Mochizuki’s work has interesting insights in a different way in this direction too. Currently I unfortunately have no time at all for the exposition of Mochizuki’s work that I intermittently come back to, or for that matter for some material that I started on the pro-étale topos, or for anything else nLab-related except the occasional piece of tidying up, but I’ll no doubt come back to it when I have a chance! But I think major conjectures will fall in the next few decades building on these directions, probably including the Riemann hypothesis, and probably including things like being able to understand Hodge theory and étale cohomology on the same footing (as well as mixed Hodge modules and perverse sheaves, and so on), including viewing Hodge theory Galois-theoretically.
Whilst I am waffling on and gazing into a crystal ball, let me also add that I predict that these developments may not only be positive, but may contribute (alongside other things of course) to mathematics becoming even more elitist than it is today, and more schismatic; and that I think Voevodsky was right to once suggest there is a danger that pure mathematics will cease to exist as a profession in some decades’ time outside of elite universities.
foundations for real analysis in condensed mathematics
I would think of it more as foundations for analytic geometry over $\mathbb{R}$ (or $\mathbb{C}$). One claimed application that has already been sorted out is Serre duality, but instead of requiring all the analytic input, it’s a purely formal six-functor-style result, with all the analysis hiding in the result that liquid $\mathbb{R}$ is an “analytic ring”.
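To indicate the shape of such a ‘purely formal six-functor-style’ statement (schematic, and not a quotation of the actual theorem in the condensed setting): once a six-functor formalism is in place, coherent duality is packaged in the adjunction between exceptional pushforward and pullback,

$$ f_* \, \mathcal{H}om(F, f^! G) \;\simeq\; \mathcal{H}om(f_! F, G), $$

together with $f_! \simeq f_*$ for $f$ proper and $f^! G \simeq f^* G \otimes \omega_{X/Y}[d]$ for $f$ smooth of relative dimension $d$; specializing to a smooth proper space over a point recovers Serre duality. The analytic work is then concentrated in establishing the formalism itself.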
@Richard I’m curious to know what you think of Tim Campion’s discussion here, regarding Frobenioids.
I do think that the level of sophistication at the top end of mathematics is getting a bit beyond the reach of people who don’t have personal access to the experts. Consider algebraic topology, for instance, where for a long time there were things you basically needed a copy of lecture notes by Hopkins, or access to him (substitute others if desired, e.g. May), to learn, because publishing was a bit lax (cf. Barwick’s complaints about homotopy theory culture). Resources like the nLab, and even more so the Stacks project and Kerodon, are really helping to make a lot of material better known. But this is off-topic, I guess.
In discussions on condensed mathematics, I’d been hoping category theorists would have something to say on the set-up. I pointed out here that there’s an apparent similarity with Lawvere’s bornology. Both parties claim to be turning functional analysis into algebraic geometry.
Scholze provided a comparison:
If you wish, condensed sets are bornological sets equipped with some extra structure related to “limit points”, where limits are understood in terms of ultrafilters
I was hoping we’d be able to see whether and how ultrafilters appear naturally here, but nothing came of some prompting.
What is the extra structure related to limit points that Scholze is alluding to?
It was taken from here.
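For orientation in this comparison, here is the standard definition of condensed sets as sheaves on profinite sets (essentially as in Scholze’s lecture notes, glossing over the set-theoretic size issues): a condensed set is a functor $T \colon \mathrm{ProFin}^{\mathrm{op}} \to \mathrm{Set}$ such that $T(\emptyset) = \ast$, the natural map $T(S_1 \sqcup S_2) \to T(S_1) \times T(S_2)$ is a bijection, and for every surjection $S' \twoheadrightarrow S$ the diagram

$$ T(S) \to T(S') \rightrightarrows T(S' \times_S S') $$

is an equalizer. A topological space $X$ gives the condensed set $S \mapsto C(S, X)$ of continuous maps out of profinite sets.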
Re #9: thanks for the question! I’ll try to avoid de-railing the discussion too much, and just give a brief reply. I don’t wish to stir up any argument, but I will say that I am not all that sympathetic with the main aspects of Tim Campion’s comments. It is great to come up with alternative formulations of things (in terms of Grothendieck fibrations in this case), but I think it is stretching a point to insist that Mochizuki should have used such and such a formulation. There are many reasons to choose a particular formulation of something, not all of which can be easily explained even in ordinary circumstances. In this very specific case, I confess to not being all that fond of wrapping things up into Grothendieck fibrations, so I actually have some sympathy anyhow for avoiding them, but this is really beside the point; I would have made the same argument even if I myself would have chosen to use Grothendieck fibrations.
I do certainly agree that much of the exposition in IUTT may in fact do more harm than good for the moment; but in the long run, if people eventually come to understand Mochizuki’s work, this exposition may prove interesting and, in hindsight, insightful. What is needed for the moment is certainly a more direct and stripped-down exposition, but if Mochizuki is not able to provide this, or to understand that his expository efforts are for the moment not really helping, that is not a reason to lynch him.
A final point, which I have made before and so is not directly a remark on Tim Campion’s comments (although there is something of what I am protesting against there too), is that I think people are not thinking for themselves enough. It is very convenient for people to use Scholze and Stix’s arguments as some kind of ’definitive’ put-down of Mochizuki’s work. But one thing is clear to me: Mochizuki’s work without doubt uses anabelian geometry in a fundamental way, in what he refers to as ’algorithms’. This aspect is completely missing in Scholze and Stix’s arguments. If definitive objections are to be made against Mochizuki’s work, the role of anabelian geometry first needs to be understood and explained. In this respect, I think that Absolute Anabelian Geometry III deserves a close reading; there is some very interesting stuff in there, even some homotopies (and a viewing of natural transformations as homotopies, which is of course not just a metaphor but is perfectly justifiable)!
Thanks for your candid thoughts, Richard.
Added
In that text linked by David Corfield, are the objects which Peter Scholze calls “$\infty$-categories” $(\infty,1)$-categories, and the objects which he calls “anima” $\infty$-groupoids?
Yes.
For a tad more see at infinity-groupoid: here.
Yes, in particular, condensed anima are simplicial condensed sets.
I only now see that the entry has the following line (dating from 30 May 2022, rev 17, due to the prolific Anonymous):
…current expositions of condensed mathematics rely…
Probably what is meant is not “current expositions” but something like: “existing constructions”. Maybe somebody could change this.
Got to say that “6-Functor” in the title of Lucas Mann’s paper was somewhat misleading to me; I was wondering why anybody would need to use 6-category-structure preserving functions between two 6-categories in complex analytic geometry.
That’s a good observation :) Maybe he wanted to make a joke.
For anyone wondering about “6-functor”: https://en.wikipedia.org/wiki/Six_operations. And Scholze uses the same terminology, though in the guise of “six-functor formalism”: https://people.mpim-bonn.mpg.de/scholze/SixFunctors.pdf
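For reference, the package behind that name, stated schematically: for a map $f \colon X \to Y$, a six-functor formalism provides the operations $f^*, f_*, f_!, f^!, \otimes, \mathcal{H}om$ on (derived categories of) sheaves, organized into three adjunctions

$$ f^* \dashv f_* , \qquad f_! \dashv f^! , \qquad (-) \otimes A \;\dashv\; \mathcal{H}om(A, -), $$

together with compatibilities such as the projection formula and base change. So ‘6-functor’ refers to these six operations, not to any notion of 6-category.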
Maybe he wanted to make a joke.
As a rule of thumb in mathematical terminology, don’t attribute to elaborate intention what can easily be explained by carelessness. ;-)
Seriously, I’d urge sticking to calling it the (yoga of) six operations. Why change a standard and time-honored terminology without need or improvement?
On the other hand, I doubt that the stray terminology goes back to Gallauer 2021, as the Wikipedia entry currently suggests. For instance our entry six operations lists
Pavel Etingof, Formalism of six functors on all (coherent) D-modules (pdf)
While this (short) file is not dated, it existed already at least in Feb 2014, when I added the link (here).