    • CommentRowNumber1.
    • CommentAuthorTobyBartels
    • CommentTimeJul 1st 2010

    I’ve added some items to mathematics contents.

    I never did much with the contents pages, so I may not have organised this in the best way.

    • CommentRowNumber2.
    • CommentAuthorzskoda
    • CommentTimeJul 6th 2010

    It looks fine to me, except I am somewhat unhappy that “evil” is in the main math contents. It is a rather specialist concept within category theory (indeed, within one school), not a generally accepted, broad subject in mathematics. But this is just a personal opinion.

    • CommentRowNumber3.
    • CommentAuthorUrs
    • CommentTimeJul 6th 2010

    I also thought that “evil” wasn’t quite a headline topic. But I don’t have strong opinions about it.

    Also, physics contents currently contains too many minor topics (mostly added by me, I should say). Eventually I want to break it up.

    • CommentRowNumber4.
    • CommentAuthorzskoda
    • CommentTimeJul 6th 2010

    Given that so many major topics are still missing from the physics part, it is good even with the minor topics for now. A more canonical revision can be made once the physics part reaches greater maturity. For now we can be happy with the contents.

    • CommentRowNumber5.
    • CommentAuthorTobyBartels
    • CommentTimeJul 8th 2010

    I think that evil is an important concept in the foundations of mathematics, although it is one that only becomes clear with the help of category theory (actually, higher category theory, or at least higher groupoid theory). So I put it under the foundations heading.

    • CommentRowNumber6.
    • CommentAuthorzskoda
    • CommentTimeJul 8th 2010
    • (edited Jul 8th 2010)

    There are many important missing concepts in the foundations of mathematics, e.g. recursion and recursive function, which are used much more widely (by both mainstream and applied mathematicians, as well as by logicians) than the higher organizational principle of (still discutably defined) maximal categorical weakness; then e.g. modal logics, ultrafilters, categoricity (in the sense of classification of structures), etc. If it were the table of contents of logic only, then it could be proportional, but as part of the mathematics TOC it is a bit extreme to state it as one of just 6 sub-subjects (or 7, if the nonstandard analysis listed under analysis counts). Also, at some point one should have subjects like proof theory, linear logic and so on, each of which is much bigger than the notion of what you call “evil”.

    • CommentRowNumber7.
    • CommentAuthorTobyBartels
    • CommentTimeJul 8th 2010
    • (edited Jul 8th 2010)

    So your complaint is that the table of contents is incomplete? OK, I agree with that. Although as far as balance is concerned, we do tend to look at foundations from the nPOV, which affects that somewhat.

    • CommentRowNumber8.
    • CommentAuthorzskoda
    • CommentTimeJul 9th 2010
    • (edited Jul 9th 2010)

    No, this is not my complaint. My complaint is that if we go down to such small items, rather than the big division into subjects/areas, we will have to expand too much. Such small items should be in separate TOCs for the individual areas. So for a TOC of logic/foundations and set theory I agree that evil may enter, as well as many other things like ultrafilters, categoricity, recursion and forcing. The TOCs are meant, in my understanding, mainly for those who do not know the nLab very well and so need help in finding material. Thus for me the TOCs are not that important, but when I advertise the nLab to someone, he comes and he needs to find what is there. So it is good that the TOCs be mainly conventional, with roughly the classical division of subjects and a few innovations. I have spent tremendous effort writing to colleagues about the nLab and advertising it, and I get responses like “I am more practical and like examples more than abstract nonsense”. So making the front end of the nLab maximally nPOV scares people away. The heart should be nPOV, and the surface should be as friendly as possible to the mainstream mathematician or theoretical physicist, if we want people to join in usage and contribution.

    • CommentRowNumber9.
    • CommentAuthorTodd_Trimble
    • CommentTimeJul 9th 2010

    get responses like “I am more practical and like examples more than abstract nonsense”.

    I believe that’s a very valid and valuable response. We should be putting much more effort into examples. I don’t just mean examples as laundry lists, but really thoughtfully chosen examples which help point the reader to the heart of thinking about a subject.

    I guess I generally agree with Zoran on the organization of tables of contents. Regarding

    There are many important missing concepts in the foundations of mathematics, e.g. recursion and recursive function, which are used much more widely (by both mainstream and applied mathematicians, as well as by logicians) than the higher organizational principle of (still discutably defined) maximal categorical weakness

    I first of all admit that I had to look up “discutable”, but, second, I think I might disagree with Zoran about the relative stature of the concept of “evil”. It’s much more than a concept: it’s a fundamental guiding principle for anyone who uses category theory (and higher category theory) and wishes to formulate a new categorical concept. It’s an overarching guiding principle, in contrast to things like recursion and forcing. But being a guiding principle, it may be tricky to place it correctly within a TOC. For example, is it “foundational”? Yes, if you take Lawvere’s point of view toward foundations; no, if you accept Kreisel’s.

    • CommentRowNumber10.
    • CommentAuthorUrs
    • CommentTimeJul 9th 2010

    …the concept of “evil”. It’s much more than a concept: it’s a fundamental guiding principle

    At some point we should think about phrasing it positively: It’s good to remember isomorphisms. The concept “good” is a fundamental guiding principle. :-)

    • CommentRowNumber11.
    • CommentAuthorzskoda
    • CommentTimeJul 9th 2010

    I call it a principle of maximal weakness, and I knew it (as most category theorists did) years before I heard the word “evil” (which I still do not accept); on the other hand, I do not believe that it is yet well understood at a rigorous level. I disagree that recursion is not a guiding principle; when creating proofs, ensuring finiteness of procedures and finding levels measuring the complexity is a much more widely used principle in building mathematical proofs and constructions.

    How about definability? Can we consider it also a guiding principle? It is much elucidated by category theory (fibered categories). If we want to know whether some construction will be useful, computable and so on, we need to look at various aspects like the complexity, definability, and maximal categorical weakness (non-evilness) of the structures involved.

    Gluing from local objects is a general architectural method in mathematics. The various notions of nerve (and maybe the relating of spectra and reconstruction in the sense of geometry) are derivatives of that approach.

    The word categorification is much less well defined as a “principle”. As an English word it is often attributed to Yetter (right?), but the concept is actually due to the much earlier work of Ehresmann, who systematically thought about it in his time. Homotopification looks to be even younger, and was independently introduced as a term by many people; in practice it had already been used extensively in the 1970s, around the time of the Boardman-Vogt book. Finally, to end the historical remarks, I hope everybody knows that Zorn used to emphasize that Zorn’s lemma is not due to him; it had been used many times before him, and if I understand the story right he himself did not formulate it explicitly, but just used it, like many others, in some proof.

    Today I looked at some Oxford companion monograph on the philosophy of mathematics: there are tens of items in the index about Dedekind (let’s philosophize about numbers and names; so many chapters and schools about the two), but no mention of Grothendieck in the monograph. A new monograph.

    • CommentRowNumber12.
    • CommentAuthorTobyBartels
    • CommentTimeJul 9th 2010

    The heart should be nPOV, and the surface should be as friendly as possible to the mainstream mathematician or theoretical physicist, if we want people to join in usage and contribution.

    Yes, this is a good point.

    Let me explain why I edited the mathematics contents at all. (I usually ignore the contents listings.) I did not edit it to list evil (which was an afterthought) but to list constructive mathematics. And I did so because I received an email from a mathematician who wanted the nLab to be more user-friendly and suggested having constructive mathematics listed in the mathematics contents. But of course, while it was not a category theorist who wrote me, it was not exactly a mainstream mathematician either, but instead a constructivist!

    So I’ll agree with your point that the mathematics contents should be friendly to mainstream mathematicians, but it should also be friendly to more outré mathematicians, especially ones that would be likely to find the nPOV useful.

    • CommentRowNumber13.
    • CommentAuthorTodd_Trimble
    • CommentTimeJul 10th 2010
    • (edited Jul 10th 2010)

    Zoran, your remarks on “guiding principles” are interesting. Here are some reactions.

    To get one thing out of the way: how long you and others have known about the principle of “maximal categorical weakness” is irrelevant to this discussion (and I hope you’re not suggesting it’s new with anyone in particular). Much more importantly, how rigorously established this principle is also appears to me to be somewhat irrelevant. For it’s a truism that generally speaking, heuristics are not rigorous; that’s almost by definition. The same applies to your remark about how well-defined categorification is (and indeed it is notoriously ill-defined).

    I’m not sure I understand “recursion is a guiding principle”. You wrote

    I disagree that recursion is not a guiding principle; when creating proofs, ensuring finiteness of procedures and finding levels measuring the complexity is a much more widely used principle in building mathematical proofs and constructions.

    The principle that proofs are recursive, i.e., are constructed recursively starting from axioms and applying rules of deduction, is just the definition of (formal) proof. It doesn’t give a guide about how to go about constructing proofs for example. One can use tools of recursion theory to measure the proof-theoretic strength or complexity of formal systems, and maybe use the results of a recursive analysis as a guide in theory selection (thinking here of reverse mathematics for example); is that closer to what you meant? If so, maybe I understand a little better, but then what is the principle? Would it come down to saying something like, “in theory or concept-formation, one ought to examine issues of complexity, expressivity, definability, computability, maximal categorical weakness, etc.”? (Not much to disagree with there! But it fits my rough idea of “guiding principle” anyway.)

    Further discussion might be required to distinguish “method” from “guiding principle”, but the other examples you give are interesting and there’s a decent chance one can work each of them into enunciations of guiding principles. (But I won’t try here and now.)

    A small remark: categorification has historical roots dating farther back than Ehresmann. Think for example of Emmy Noether’s observation that numerical invariants like Betti numbers are shadows of deeper structural invariants like homology groups. (Edit: or, going much further back in parable form, there is the legendary shepherd from the Categorification paper.) I am not competent to judge how systematically Noether and others of that era considered such phenomena.

    • CommentRowNumber14.
    • CommentAuthorzskoda
    • CommentTimeJul 13th 2010

    proofs are recursive, i.e., are constructed recursively starting from axioms and applying rules of deduction, is just the definition of (formal) proof. It doesn’t give a guide about how to go about constructing proofs for example. One can use tools of recursion theory to measure the proof-theoretic strength or complexity of formal systems

    I did not mean only the recursiveness at the level of the formal proof, but the methods of controlling and gradually reducing some complexity parameter. For example, you take the Širšov-Bergmann diamond lemma to prove that something is a normal form. This is really high level, not the low level of writing the proof in terms of elementary terms.

    • CommentRowNumber15.
    • CommentAuthorTodd_Trimble
    • CommentTimeJul 13th 2010

    Well, I’m just a low-level kind of guy. ;-)

    For example you take the Širšov-Bergmann diamond lemma

    Is that what Americans call the Church-Rosser confluence property?

    • CommentRowNumber16.
    • CommentAuthorzskoda
    • CommentTimeJul 13th 2010
    • (edited Jul 13th 2010)

    Shame on me, but I am not competent to answer off-hand, though Church was my scientific grandfather. But after a look at Wikipedia, I would rather guess no. There is also a much simpler fact, Newman’s diamond lemma, quoted at the beginning of Bergmann’s article.

    • CommentRowNumber17.
    • CommentAuthorzskoda
    • CommentTimeJul 13th 2010
    • (edited Jul 13th 2010)

    It would of course be interesting to work out what the exact relationship is, and what the common generalization is. You would have to tell me what the relation is between the lambda calculus and universal algebra to get closer to the question. Bergmann’s phrasing is for associative algebras, while Širšov’s earlier result is more general. Bergmann quotes Širšov’s paper, but presumably did not notice that his main result was already contained in it.

    • CommentRowNumber18.
    • CommentAuthorTodd_Trimble
    • CommentTimeJul 14th 2010
    • (edited Jul 14th 2010)

    The “Church-Rosser” (or confluence) property was indeed originally applied to a standard normalization scheme in lambda calculus, but since that time it has come to mean something much more general. Here’s how I understand it (and I don’t claim everything I say is 100% standard).

    Suppose one has a presentation of an essentially algebraic structure, so that the elements of the structure are equivalence classes of well-formed terms. A normalization scheme is provided by a relation \rightsquigarrow on terms such that s \rightsquigarrow t implies s is equivalent to t (we think of t as a reduction of s), and any two terms in an equivalence class are connected by a zig-zag of instances of reduction. The normalization is Church-Rosser (or confluent) if the reflexive transitive closure of the reduction relation is the underlying order of a filtered poset (so that any span can be completed to a diamond). We say the scheme is strongly normalizing if every sequence of reductions terminates in finitely many steps; often, this is guaranteed by defining a suitable rank on terms such that reduction strictly lowers the rank.

    Under these conditions, every equivalence class of terms has a unique normal form to which every term is reducible in finitely many steps (a term is normal if it cannot be further reduced). This is a classic lemma. In the untyped lambda calculus, the standard reduction scheme is Church-Rosser but is not strongly normalizing (indeed, the classic example is the fixpoint combinator Y, which reduces only to itself, so that reduction never halts). On the other hand, many presentations of essentially algebraic structures (such as polynomial algebras, free Boolean algebras, various sequent calculi) do admit a confluent strongly normalizing reduction, and this in effect means that equivalence of terms is decidable: just check whether two terms have the same normal form.

    It wouldn’t surprise me if all this were discovered independently many times, but in America the term “Church-Rosser” is well understood to refer to this diamond property of reduction.
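    To make the last point concrete, here is a minimal sketch in Python (an editorial illustration, not part of the original discussion; the toy rule and all names are invented for the example). The single rewrite rule ba → ab on words in two letters is confluent and strongly normalizing (the number of b-before-a inversions strictly drops with each step), so equality of words modulo ba = ab is decided by comparing normal forms, exactly as described above.

    ```python
    # Toy confluent, strongly normalizing reduction: the single rewrite rule
    # "ba" -> "ab" on words in the letters a, b.  Each step removes exactly one
    # inversion (a b standing before an a), so the rank "number of inversions"
    # strictly drops and every reduction sequence terminates; the unique normal
    # form of a word is its sorted version.

    def reduce_once(word):
        """Apply ba -> ab at the leftmost redex, or return None if word is normal."""
        i = word.find("ba")
        if i == -1:
            return None
        return word[:i] + "ab" + word[i + 2:]

    def normal_form(word):
        """Reduce until no rule applies; termination is guaranteed by the rank."""
        while True:
            step = reduce_once(word)
            if step is None:
                return word
            word = step

    def equivalent(u, v):
        """Decide equality modulo ba = ab by comparing normal forms."""
        return normal_form(u) == normal_form(v)

    assert normal_form("babba") == "aabbb"
    assert equivalent("bab", "abb") and not equivalent("ab", "abb")
    ```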

    • CommentRowNumber19.
    • CommentAuthorzskoda
    • CommentTimeJul 14th 2010
    • (edited Jul 14th 2010)

    Now the basic question is what the term covers. Newman’s diamond lemma in combinatorics/graph theory is a simple statement about using the diamond property, while Bergman’s version has some fine points because of the refined setup of working with sums of monomials: the semigroup order respected by the reductions (monomial into polynomial) is on monomials and not on the sums of monomials. My impression is that Church-Rosser does not concern the subtlety of using a semigroup order on monomials in order to draw conclusions about normalizing the finite sums; the diamond property for both inclusion and overlap ambiguities is an ingredient, but not the full story.

    • CommentRowNumber20.
    • CommentAuthorzskoda
    • CommentTimeJul 14th 2010
    • (edited Jul 14th 2010)

    I have transferred a part of one of the appendices in my thesis to diamond lemma (zoranskoda), concerning Bergman’s diamond lemma.

    Bergman said in his paper:

    Our main result, Theorem 1.2, is an analog and strengthening of the above observations (Newman’s diamond lemma) for the case of associative rings, with reduction procedures of the form sketched earlier. It is self-contained and does not follow Newman’s graph theoretic formulation. The strengthening lies in the result that the analogs of conditions (i) and (ii) need only be verified for monomials, and in fact that (ii) need only be verified for “minimal nontrivial ambiguously reducible monomials”. That is, it suffices to check, for each monomial which can be written as ABC with either AB = W_\sigma, BC = W_\tau (\sigma, \tau \in S, B \neq 1) or ABC = W_\sigma, B = W_\tau (\sigma \neq \tau \in S), that the two expressions to which ABC reduces (in the first case, f_\sigma C and A f_\tau; in the second, f_\sigma and A f_\tau C) can be reduced to a common value.
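    As an editorial aside (not part of the original exchange): in the simplest special case where every reduction sends a monomial to a single monomial, the condition in the quote above can be checked mechanically. The Python sketch below (all names invented for the illustration) takes the system zy → yz, zx → xz, yx → xy, which presents the commutative polynomial ring k[x,y,z] inside the free associative algebra, and verifies that its one overlap ambiguity, the monomial zyx, resolves to a common value. Bergman’s actual lemma of course handles reductions of monomials into general linear combinations, which is exactly the subtlety stressed in #19 and #21.

    ```python
    # Checking the overlap-ambiguity condition in the simplest special case
    # where every reduction W_sigma -> f_sigma replaces a monomial by a single
    # monomial.  The rules below present the commutative polynomials k[x,y,z]
    # inside the free associative algebra k<x,y,z>: each rule sorts one
    # adjacent pair of letters.

    RULES = {"zy": "yz", "zx": "xz", "yx": "xy"}     # W_sigma -> f_sigma

    def normal_form(word):
        """Repeatedly apply the first applicable rule until none applies."""
        changed = True
        while changed:
            changed = False
            for lhs, rhs in RULES.items():
                i = word.find(lhs)
                if i != -1:
                    word = word[:i] + rhs + word[i + len(lhs):]
                    changed = True
                    break
        return word

    def overlap_ambiguities():
        """Yield (A, B, C) with AB and BC both left-hand sides of rules (B != 1)."""
        for w1 in RULES:
            for w2 in RULES:
                for k in range(1, min(len(w1), len(w2))):
                    if w1[-k:] == w2[:k]:
                        yield w1[:-k], w1[-k:], w2[k:]

    # The only overlap ambiguity here is A, B, C = "z", "y", "x", i.e. the word "zyx".
    for A, B, C in overlap_ambiguities():
        reduce_left_first  = RULES[A + B] + C        # resolve the redex AB first
        reduce_right_first = A + RULES[B + C]        # resolve the redex BC first
        assert normal_form(reduce_left_first) == normal_form(reduce_right_first)
        print(A + B + C, "resolves to", normal_form(reduce_left_first))   # zyx -> xyz
    ```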

    • CommentRowNumber21.
    • CommentAuthorzskoda
    • CommentTimeJul 14th 2010

    but in America the term “Church-Rosser” is well understood to refer to this diamond property of reduction

    In the American ring theory community one talks about Bergman’s lemma, meaning that each overlap or inclusion ambiguity of reductions from monomials to polynomials in a free associative algebra can be resolved by a diamond formed with the help of additional reductions. In commutative algebra there are related algorithms on so-called Gröbner bases (those people do not know of Bergman’s lemma), which are related but the logic and conditions are a bit different.

    • CommentRowNumber22.
    • CommentAuthorTodd_Trimble
    • CommentTimeJul 14th 2010
    • (edited Jul 14th 2010)

    Yes. As I say, “Church-Rosser” is by now understood to be a very general term, at least in the North American logic community. I guess by now we’re agreeing that we’re talking about basically the same thing, even though it might go by different names in particular cases where there are extra subtleties and refinements. (I didn’t recognize the name Bergmann or Bergman back in #14, but by now I guess you mean George Bergman at Berkeley.)

    To my mind, Mac Lane’s original method of proof of his coherence theorem is another case in point, except that here it’s a categorification of a very simple normalization scheme for terms in free monoids. In other words, the normalization s \rightsquigarrow t on binary magma terms which shifts parentheses from left to right isn’t just a relation, but rather the arrows are associativity isomorphisms; the diamonds aren’t just diamonds in a poset, they have to be commutative squares in the free monoidal category. Otherwise the overall strategy is essentially the same.

    One idea I had when I was first trying to define weak n-categories was, vaguely speaking, to try to categorify this into higher dimensions. Orient the cells of the associahedra so that you can interpret the associahedra as parity complexes, and the orientations are “arrows” which reduce a suitable rank. Coherence is proven with the help of the observation that the n-skeleta of all associahedra are n-connected.
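    To illustrate the un-categorified, 0-dimensional shadow of what #22 describes, here is a minimal editorial sketch in Python (the representation of terms and all helper names are invented): binary magma terms rewritten by the single move (u·v)·w ⇝ u·(v·w). The number of leaves sitting in left argument positions serves as a rank that strictly drops, so every term reduces in finitely many steps to its unique right-associated normal form; Mac Lane’s coherence theorem replaces this relation by associativity isomorphisms and the diamonds by commutative squares.

    ```python
    # The 0-dimensional normalization underlying Mac Lane's coherence theorem:
    # binary magma terms, rewritten by the single move (u.v).w ~> u.(v.w).
    # Terms are represented as nested 2-tuples with strings as leaves.

    def leaves(t):
        return 1 if isinstance(t, str) else leaves(t[0]) + leaves(t[1])

    def rank(t):
        """Total number of leaves sitting inside left arguments.  The move
        (u.v).w ~> u.(v.w) strictly lowers it, so reduction always terminates."""
        if isinstance(t, str):
            return 0
        left, right = t
        return leaves(left) + rank(left) + rank(right)

    def reduce_once(t):
        """Apply the reassociation move at one redex, or return None if t is normal."""
        if isinstance(t, str):
            return None
        left, right = t
        if isinstance(left, tuple):                   # ((u, v), w)  ~>  (u, (v, w))
            u, v = left
            return (u, (v, right))
        step = reduce_once(right)                     # left is a leaf: recurse right
        return None if step is None else (left, step)

    def normal_form(t):
        while True:
            step = reduce_once(t)
            if step is None:
                return t
            t = step

    t = ((("a", "b"), "c"), "d")
    assert normal_form(t) == ("a", ("b", ("c", "d")))
    assert rank(t) > rank(reduce_once(t))             # the rank strictly drops
    ```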

    • CommentRowNumber23.
    • CommentAuthorzskoda
    • CommentTimeJul 14th 2010

    I am sorry for creating the confusion with Bergmann instead of Bergman. The thing is that I remember that one of my heroes has a double n (it was Bargmann, with his paper on the Fock-Bargmann space and the related coherent states, which I also used in the thesis; funny enough, they are related to Bargman domains in complex analysis, named after another Bargman, with an “a” and a single n).

    I did hint in 16 that it is all related to the combinatorics of a common, simpler result which I know as Newman’s diamond lemma, quoted at the beginning of Bergman’s Advances article, but my impression was and remains that it is not the same statement. I know that Mac Lane’s coherence theorem can be obtained by the type of reasoning found in the theory of rewriting rules (the latter I know of because of a period when I intensively studied compiler writing), and some time ago I entered the reference to an amazing recent article of Tibor Beke into the nLab, e.g. into the entry rewriting.

    Still I think we should be very careful with the subtleties of this algorithmic field and distinguish versions of similar reasoning. For example, in my survey

    • Noncommutative localization in noncommutative geometry, London Math. Society Lecture Note Series 330, ed. A. Ranicki; pp. 220–313, math.QA/0403276.

    I wrote a counterexample (in the section on practical criteria) to the proof of the Ore property by induction on generators of the algebra and on multiplicative generators of the supposed Ore set (by the way, the proof also uses Bergman’s diamond lemma in inferring the structure of the counterexample considered). This method had been widely used without a proof, and one of the appendices in my thesis has a wrong result which I did not use in the bulk of the thesis, though I had intended to at a crucial place. I had instead a corrected result, where additional control of the size of the new elements generated from the partial Ore condition is taken into account (there is also one version there which needs a correction, thanks to a Spanish PhD student who found that one of my statements in the section is too strong).

    • CommentRowNumber24.
    • CommentAuthorzskoda
    • CommentTimeJul 14th 2010
    • (edited Jul 14th 2010)

    Not to give a wrong impression: I would be happy if the history turned out to be older and if the results could be put into logical order, but I insist that we understand all the subtleties before any hasty merging.

    Edit: I created the person entry George Bergman.

    • CommentRowNumber25.
    • CommentAuthorDavid_Corfield
    • CommentTimeJul 14th 2010

    Zoran:

    Today I looked at some Oxford companion monograph on the philosophy of mathematics: there are tens of items in the index about Dedekind (let’s philosophize about numbers and names; so many chapters and schools about the two), but no mention of Grothendieck in the monograph. A new monograph.

    Sad, isn’t it, that philosophers looking at the mathematics of the past 50 years don’t get invited to write. Not that there are too many of them. Colin McLarty is best on Grothendieck: The Rising Sea: Grothendieck on Simplicity and Generality.

    • CommentRowNumber26.
    • CommentAuthorTobyBartels
    • CommentTimeJul 15th 2010

    The link above should be The Rising Sea: Grothendieck on Simplicity and Generality (missing http://).

    • CommentRowNumber27.
    • CommentAuthorzskoda
    • CommentTimeJul 15th 2010

    I mentioned the experience quoted in 25 in the June 14, 2010 entry on books in my (usually inactive) blog.