    • CommentRowNumber1.
    • CommentAuthorTobyBartels
    • CommentTimeDec 22nd 2010

    I put a complete definition at linear logic.

    • CommentRowNumber2.
    • CommentAuthorMike Shulman
    • CommentTimeDec 23rd 2010

    Do you know where the pronunciations “of course” and “why not” for the linear exponentials come from?

    • CommentRowNumber3.
    • CommentAuthorTim_Porter
    • CommentTimeDec 23rd 2010

    Do you mean who used them first (I am not sure), or why they are used? The latter is the resource-sensitive nature of LL. It would be nice to have an entry for that. It is on my list of TTD (<- things to do) to expand the entry on Petri nets to try to explain at an elementary level how the transparently simple model of Petri nets leads to LL, since the higher dimensional automata of Goubault and Mimram’s paper (i.e. labelled cubical sets with a tiny bit of extra structure) generalise Petri nets, and the dream would be to find new interpretations of HDAs in resource-sensitive situations, i.e. operational research from the nPOV. (<- That would be fun!)

    • CommentRowNumber4.
    • CommentAuthorTodd_Trimble
    • CommentTimeDec 23rd 2010

    Do you know where the pronunciations “of course” and “why not” for the linear exponentials come from?

    I would have assumed they came from Girard. His original 1987 paper might have information on this.

    • CommentRowNumber5.
    • CommentAuthorMike Shulman
    • CommentTimeDec 23rd 2010

    What do “of course” and “why not” have to do with resource-sensitivity? I thought “!A” meant “an arbitrarily reusable version of A” – what is “of course” about that?

    • CommentRowNumber6.
    • CommentAuthorTobyBartels
    • CommentTimeDec 24th 2010
    • (edited Dec 24th 2010)

    Of course! You must accept whatever number of $A$ that I give you.

    Why not? You can take whatever number of $A$ that you wish.

    Yes, it’s still not a very good fit. And I don’t know if that’s where it came from. But that’s how I interpret it.

    • CommentRowNumber7.
    • CommentAuthorTobyBartels
    • CommentTimeDec 24th 2010

    Following Girard in Proofs and Types, I’ve made the link to ‘coherent spaces’ point to coherence space, so that ‘the topological kind’ can be at coherent space.

    In any case, we’ll need disambiguation notes when we create these pages.

    • CommentRowNumber8.
    • CommentAuthorTobyBartels
    • CommentTimeDec 24th 2010

    More stuff added, including game semantics.

    • CommentRowNumber9.
    • CommentAuthorMike Shulman
    • CommentTimeDec 24th 2010

    Part of the problem is that in natural language (at least in my idiolect) there’s not much difference between “of course” and “why not”. They’re certainly not “dual” in any sense I can think of.

    • CommentRowNumber10.
    • CommentAuthorTobyBartels
    • CommentTimeDec 24th 2010
    • (edited Dec 24th 2010)

    They don’t mean the same thing in my idiolect. While I’m unsure how well they fit exponential operators at all, the difference between them fits the difference between these operators fairly well. It’s a question of who’s in control.

    With ‘of course!’, I have no choice; with ‘why not?’ I do. (If they are said to me.)

    • CommentRowNumber11.
    • CommentAuthorTodd_Trimble
    • CommentTimeDec 27th 2010
    • (edited Dec 27th 2010)

    Here is an attempt to get at the duality:

    $!X$ = “Of course $X$” could mean “(at any time) there is every reason for item $X$ to be available”

    $?X$ = “Why not $X$” could mean “(at any time) there is no reason for item $X$ not to be available” = $\neg ! \neg X$

    The “at any time” is an attempt to get at the modal nature of these operators (according to Girard).
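
    For reference, the duality being glossed here is the standard De Morgan duality between the exponentials:

    $(!X)^\perp \equiv ?(X^\perp), \qquad (?X)^\perp \equiv !(X^\perp), \qquad \text{so} \qquad ?X \equiv (!(X^\perp))^\perp,$

    which is the $\neg ! \neg X$ above.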

    • CommentRowNumber12.
    • CommentAuthorTim_Porter
    • CommentTimeDec 27th 2010

    I was just chatting with Phil Scott and asked him the question from no. 2 above. He says that Girard has a longish discussion of ’bien sûr’ and ’pourquoi pas’ in the original papers. There is an interpretation of $!X$ as a Xerox machine, but he makes the distinction between the machine and the copies it produces! Phil says he thinks of ’why not’ as a sink, but I am not sure why.

    • CommentRowNumber13.
    • CommentAuthorMike Shulman
    • CommentTimeDec 27th 2010

    @Toby: In what context are you imagining “of course” and “why not” being said to you?

    @Todd: Wouldn’t the negation of “there is every reason for X not to be available” be “there is some reason for X to be available”?

    @Tim: Do “bien sûr” and “pourquoi pas” mean exactly the same thing in French that “of course” and “why not” do in English?

    • CommentRowNumber14.
    • CommentAuthorTodd_Trimble
    • CommentTimeDec 27th 2010

    Mike, the “every” was a rhetorical flourish, not a logical operator. Let’s then excise the word.

    • CommentRowNumber15.
    • CommentAuthorTobyBartels
    • CommentTimeDec 27th 2010
    • (edited Dec 27th 2010)

    @ Mike

    In what context are you imagining “of course” and “why not” being said to you?

    Something like in response to my question “Do I have a[nother] copy of $A$?”

    This doesn’t really fit the nature of the exponential operators. If we’re in $!A$, and I ask whether I have another copy of $A$, the answer is not always Yes. If the answer is Yes, then this answer may be phrased as “Of course!”, since it is not due to my choice that the answer is Yes. On the other hand, the answer might be No (or “not yet”), in which case it’s really more like “Of course not!”. If instead we’re in $?A$, and I ask whether I have another copy of $A$, the best answer is “It’s up to you.”. Answering “Why not?”, again, has the flaw of suggesting Yes rather than No. However, the final decision is mine, which is the important part.

    That’s why I wrote

    While I’m unsure how well they fit exponential operators at all, the difference between them fits the difference between these operators fairly well.


    Do “bien sûr” and “pourquoi pas” mean exactly the same thing in French that “of course” and “why not” do in English?

    Close enough that dictionaries translate them so, almost exclusively each way; more than that I can’t say.

    • CommentRowNumber16.
    • CommentAuthorTim_Porter
    • CommentTimeDec 28th 2010

    Do “bien sûr” and “pourquoi pas” mean exactly the same thing in French that “of course” and “why not” do in English?

    Essentially yes, as far as I know.

    If we’re in $!A$, and I ask whether I have another copy of $A$, the answer is not always Yes. If the answer is Yes, then this answer may be phrased as “Of course!”, since it is not due to my choice that the answer is Yes.

    If Girard’s Xerox machine analogy is to be used, perhaps the key word to look at may be ’available’. Phil told me that Girard’s paper is on the web. I think he means here, but I cannot find the Xerox machine analogy. (Search does not work as it is a scanned copy.) There is mention of $!A$ meaning that you have a copy of $A$ stored for possible use, and some discussion of the meaning of this with regard to ’finite’ or ’infinite’ storage, if I understand what he means. There is further discussion here and a lot of other stuff on Girard’s homepage.

    • CommentRowNumber17.
    • CommentAuthorMike Shulman
    • CommentTimeDec 29th 2010

    @Todd: Sorry if it seems like I’m being dense or persnickety, but I really am perpetually confused about what these things are supposed to mean. If we remove “every” from your statement then $!X$ becomes “there is reason for $X$ to be available.” Is it correct to interpret that as an existential quantifier, “there is some reason for $X$ to be available”? Then the double negation would indeed look like “there is no reason for $X$ to not be available,” as long as plain “$X$” is regarded as interchangeable with “$X$ is available” (and in particular “$X$ is not available” is to be identified with “not-$X$ is available,” which seems odd if $X$ is a proposition, but I suppose if $X$ is a resource then the latter version is nonsense? Although in that case it is unclear to me what double negation is supposed to mean: “the non-availability of $X$ is not available”?).

    @Toby: when you say “we’re in !A” should I take that to mean that !A is part of the context and we are deducing consequences from it? If so, then I would think the answer to “do I have A?” would always be yes, since !A ⊢ A is an axiom. I must be misunderstanding something still.
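
    For reference, one standard two-sided presentation of the rules for $!$ is the following ($!A \vdash A$ is then just the axiom $A \vdash A$ followed by dereliction):

    $\dfrac{\Gamma, A \vdash \Delta}{\Gamma, !A \vdash \Delta}\ (\text{dereliction}) \qquad \dfrac{\Gamma \vdash \Delta}{\Gamma, !A \vdash \Delta}\ (\text{weakening}) \qquad \dfrac{\Gamma, !A, !A \vdash \Delta}{\Gamma, !A \vdash \Delta}\ (\text{contraction}) \qquad \dfrac{!\Gamma \vdash A, ?\Delta}{!\Gamma \vdash !A, ?\Delta}\ (\text{promotion})$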

    • CommentRowNumber18.
    • CommentAuthorTodd_Trimble
    • CommentTimeDec 29th 2010

    Mike, I’m sorry I was confusing. I think I was overstretching a bit.

    What I had written was really an attempt to give natural-language translations of the phrases “of course” and “why not”. For example, I had written “there is every reason for X to be available”, and I meant that in the same way as when you hear someone start a sentence with “there is every reason to believe…” – certainly one doesn’t mean “every” here with its usual logical meaning. Try again reading what I had written in back in #11, excising instead all the mathematical symbolism, and read my comment as a possible attempt to translate some Girardese into natural language, still at a pre-mathematical level.

    I did insert some mathematical symbolism, as an afterthought, thinking what I had written could possibly be mathematicized, which is what you’re doing now: trying to make mathematical sense of what I had written. I don’t think you’re being persnickety; I think what you’re showing is that what I had written doesn’t yet hold up to mathematical analysis. So please take what I had written as still half-baked. I’ll have to think about it some more before I attempt a more mathematical translation.

    It’s possible too that “of course” and “why not” (or “bien sûr” and “pourquoi pas” as the case may be) are just plain ineptly chosen for Girard’s purposes, or (put more nicely) are anyway his best attempt at a colloquial approximation of the underlying mathematics.

    • CommentRowNumber19.
    • CommentAuthorMike Shulman
    • CommentTimeDec 29th 2010

    I see, thanks!

    • CommentRowNumber20.
    • CommentAuthorTobyBartels
    • CommentTimeDec 30th 2010

    @ Mike

    when you say “we’re in !A” should I take that to mean that !A is part of the context and we are deducing consequences from it?

    No, I meant if $!A$ is the statement whose meaning we are trying to understand. I tend to think of linear logic always in the empty context, by moving everything over to the right-hand side of the turnstile (using negation). Also, by how many $A$s we ‘have’ I meant how many $A$s are in the statement as we understand it, but this was dangerous; if I really want to get into the game semantics (where, at least on the right-hand side, $!$ and other conjunctions are not under my control, while $?$ and other disjunctions are), then I should have used a phrase like ‘I need’ or ‘is required’ rather than ‘I have’. I was thinking very syntactically when I wrote ‘have’, thinking about the marks on the paper.

    However, if you want to think in terms of contexts instead, then you certainly can; thanks to the magic of linear logic, everything works exactly the same way, only backwards. And we can still use game semantics, in which case ‘have’ is a fine word for things on the left-hand side. The only difference is that who is in control switches. So on the left-hand side, it is my opponent who asks ‘Do you have a[nother] copy of $A$?’ and I who answer ‘Of course [not]!’.

    So looking at $!A \vdash A$, this is a valid sequent because, when my opponent asks whether I have another copy of $A$, I first answer that I do, and use it to complete my requirement to find a copy of $A$. Thereafter, I answer No.

    Equivalently, I can turn this into the empty-context, single-proposition sequent $\vdash ?A^\perp \parr A$. This is valid (so $?A^\perp \parr A$ is a proposition that I can prove, or a game that I can win), because now when I ask whether I need a copy of $A^\perp$, the reply is ‘Why not?’, and I decide (first) that I do. Since a need for $A^\perp$ is the same thing as the right to demand a copy of $A$, I use this demand to get a copy of $A$, and then¹ use that to fulfil my requirement to provide a copy of $A$ (which requirement is on the right side of $\parr$).


    1. Note that $\parr$ leaves the order of play up to me. In contrast, $\nvdash ?A^\perp \otimes A$, because I might be forced to produce $A$ before I can get one. Of course, $\nvdash !A^\perp \parr A$ also, since I might not get an $A$ or might get too many. 
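
    For completeness, the validity of $\vdash ?A^\perp \parr A$ is also visible purely syntactically; in the one-sided sequent calculus it is a three-step derivation:

    $\vdash A^\perp, A$ (axiom), $\qquad$ then $\vdash ?A^\perp, A$ (dereliction), $\qquad$ then $\vdash ?A^\perp \parr A$ ($\parr$-rule).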

    • CommentRowNumber21.
    • CommentAuthorMike Shulman
    • CommentTimeJan 1st 2011

    Ah, I see, you meant “in” in the sense of game semantics: “the state of play is …”.

    I think that maybe I get it a little bit now. “Why not?” (or something more expressive like “oh, why not?”) is an (affirmative) answer that I might give to a question like “shall we play (another) game of A?” in a context where I could perfectly well have chosen not to. Similarly, if the state of play is ?A, then I’m sort of continually being asked “shall we play (another) game of A?”. But it doesn’t quite fit because a response of “why not?” means to me that I am saying yes, even though I could have said no, while when the state of play is ?A, I haven’t answered yet (right?).

    I think I can also see “of course!” as an (affirmative) answer that I would give to “shall we play (another) game of A?” when I didn’t really have any choice, but was just being asked for politeness’ sake — something more like the “of course, sir!” that a butler might say when the master of the house asks him to do something. Similarly if the state of play is !A, then the opponent might ask me to play a game of A at any point, and I would have to agree.

    The description of game semantics on the nLab page is a bit vague about the exponentials, though. Do the “several ways” produce honestly different game semantics? If so, is there a “most common” or “easiest” one that we could explain fully, and then mention the existence of variants?

    • CommentRowNumber22.
    • CommentAuthorTim_Porter
    • CommentTimeJan 1st 2011
    • (edited Jan 1st 2011)

    @Mike The resource-based semantics gives ’Of course’ not as you suggest but more ’it is there whenever you need it.’

    • CommentRowNumber23.
    • CommentAuthorTim_Porter
    • CommentTimeJan 1st 2011
    • (edited Jan 1st 2011)

    I did not stay up late last night but I seem to be seeing the converse of the three types of mathematicians joke in the Game Semantics:

    At any given moment in a game, exactly one of these three situations obtains: it is our turn, it is their turn, we have won, or they have won

    That makes four unless I am dreaming.

    • CommentRowNumber24.
    • CommentAuthorTobyBartels
    • CommentTimeJan 2nd 2011
    • (edited Jan 2nd 2011)

    @ Tim:

    That makes four unless I am dreaming.

    Yes, that used to be three situations (it is our turn, it is their turn, the game has ended), with the last later broken into two. I decided that it would be better to give all four results at once, especially since they correspond to the four constants in linear logic. Fixed.

    The resource-based semantics gives ’Of course’ not as you suggest but more ’it is there whenever you need it.’

    This fits phrasing the game semantics of $?A$ pretty well too; another game is there whenever you want it.

    Given any two random semantics of linear logic, they are as likely to fit together dually as to fit together exactly.

    @ Mike:

    The description of game semantics on the nLab page is a bit vague about the exponentials, though. Do the “several ways” produce honestly different game semantics? If so, is there a “most common” or “easiest” one that we could explain fully, and then mention the existence of variants?

    As I understand it, they are honestly different, although I’m not sure how much. I don’t know if any is most common or easiest. I hope to find Blass (1992) at the library, so if his version makes sense, I’ll probably write about it as standard. So far, I’ve only read secondary sources that cite Blass. (I need to figure out which ones were helpful and add those to the references.)

    • CommentRowNumber25.
    • CommentAuthorMike Shulman
    • CommentTimeMay 7th 2012

    I added a bit of stuff to linear logic: some mention in the “Idea” section of interpretations other than resource semantics, some comments about different terminology and notation used in the related field of relevant logic, and a brief discussion of polycategories as an alternative approach to categorical semantics (which greatly clarifies to my mind the mysterious behavior of $\parr$). Unfortunately I don’t have time to write more than a stub at polycategory at the moment.

    • CommentRowNumber26.
    • CommentAuthorSridharRamesh
    • CommentTimeMay 7th 2012

    A question about polycategories: are they necessarily symmetric (in that the order of objects in the domain or codomain list doesn’t matter)? If not, I’m not sure I understand very well what their composition operations are supposed to be, from the given example.

    • CommentRowNumber27.
    • CommentAuthorMike Shulman
    • CommentTimeMay 7th 2012

    Good question! I think by default they are symmetric. There may be a non-symmetric version, but I don’t know what the right way to phrase it would be – what sorts of composition you would allow. Maybe only “planar” ones (which would disallow the example I gave)?
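
    To sketch the composition operation in question (symmetric case; the notation $g \circ_A f$ is just for illustration here): given $f : \Gamma \to \Delta_1, A, \Delta_2$ and $g : \Gamma_1, A, \Gamma_2 \to \Delta$, there is a composite along the single object $A$,

    $g \circ_A f : \Gamma_1, \Gamma, \Gamma_2 \to \Delta_1, \Delta, \Delta_2,$

    and that is the only kind of composition available: always along exactly one object, never along several at once.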

    • CommentRowNumber28.
    • CommentAuthorMike Shulman
    • CommentTimeMay 24th 2012

    Toby, is there a reason you used colons to separate formulas in contexts at linear logic? Most people use commas, and using colons makes it difficult to add variables and terms (usually denoted by $x:A$, of course).

    • CommentRowNumber29.
    • CommentAuthorTobyBartels
    • CommentTimeMay 28th 2012
    • (edited May 28th 2012)

    I have no idea. It might ultimately derive from programming notation for lists, but if so, it’s not being used correctly anyway. I’ve changed it to commas.

    However, typing really should be “$x\colon A$” rather than “$x : A$”.

    • CommentRowNumber30.
    • CommentAuthorMike Shulman
    • CommentTimeMay 29th 2012

    However, typing really should be “$x\colon A$” rather than “$x : A$”.

    I’ve been using \colon for typing judgments instead of : ever since you convinced me that was the right thing to do. IIRC the argument was that we use \colon for functions, $f\colon A\to B$, which is a particular sort of typing judgment, ergo, we should use \colon for all typing judgments.

    At Swansea, however, Jamie Vicary took me to task for this, saying something along the lines of that the spacing in my typing judgments was ugly. And indeed, when I stepped back and cleared my mind of the dogma that “all typing judgments should be \colon” I found that usually, I did actually prefer the look of :. This is especially true when either the type or the term is a large expression; \colon looks okay when both are small (e.g. the term is a variable and the type is $A$ or $A\times B$ or $A\to B$), but when the type or, especially, the term is quite long, then I think it looks clearer to have more space around the colon.

    Do you have a good reason for using \colon rather than :?

    • CommentRowNumber31.
    • CommentAuthorZhen Lin
    • CommentTimeMay 29th 2012

    In ordinary TeX, \colon is a \mathpunct, so it gets a space after but not before, whereas : is a \mathrel, so it gets spaces before and after. I think \mathpunct-style spacing for typing declarations of functions used to be traditional, but nowadays I mostly see \mathrel-style spacing. (On the other hand, Sheaves in geometry and logic seems to exclusively use \mathpunct-style spacing, even for set formation $\{ \cdots \colon \cdots \}$!)
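
    A minimal LaTeX file to compare the two spacings (standard TeX behavior, nothing nLab-specific):

        \documentclass{article}
        \begin{document}
        % \colon has class \mathpunct: a thin space after, none before
        $f\colon A \to B$

        % : has class \mathrel: medium space on both sides
        $f : A \to B$

        % \mathrel{:} forces \mathrel spacing explicitly, same as plain :
        $f \mathrel{:} A \to B$
        \end{document}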

    • CommentRowNumber32.
    • CommentAuthorTobyBartels
    • CommentTimeMay 30th 2012

    @ Mike: Pace Zhen Lin, maybe I’m just old fashioned. I ultimately rely on the authority of Knuth for typing assertions of functions, but maybe that is old fashioned too. The only thing that I’m really confident about is that the colon in ‘$f\colon A \to B$’ is the same colon as in ‘$x\colon A$’.

    I do like that the mathpunct colon has only this meaning (AFAIK) in contrast to the mathrel colon with various other meanings, but in this case I wouldn’t expect conflict.

    • CommentRowNumber33.
    • CommentAuthorMike Shulman
    • CommentTimeMay 30th 2012

    Okay, thanks for the thoughts. In that case I’ll probably start switching back to the mathrel colon for typing judgments.

    • CommentRowNumber34.
    • CommentAuthorTobyBartels
    • CommentTimeJun 2nd 2012

    Now that someone’s made a deliberate choice to use mathrel colons for type declarations (including for function types), I should probably stop correcting them to mathpunct colons.

    • CommentRowNumber35.
    • CommentAuthorUrs
    • CommentTimeJan 7th 2014

    I have added to linear logic at the beginning a quick paragraph pointing out that under thinking of linear logic as quantum logic, the linearity of the logic is the no-cloning theorem of quantum physics.

    I put this at the top of the list of subsections of the Idea-section, because I feel this gives by and large the best idea of what’s actually going on.
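
    Concretely, the point is that linear logic has no general diagonal: $A \nvdash A \otimes A$ for arbitrary $A$ (no cloning), while $!A \vdash !A \otimes !A$ does hold, by contraction.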

    • CommentRowNumber36.
    • CommentAuthorTim_Porter
    • CommentTimeJan 7th 2014

    I felt the ideas section started with a deep plunge so have added a sentence introducing the section. I like the fact that the ideas section has several parts, as this indicates the complex interactions between the intuitions coming from the various parts.

    • CommentRowNumber37.
    • CommentAuthorTodd_Trimble
    • CommentTimeFeb 7th 2014

    In linear logic, with regard to calling propositions $A$ and $B$ “equivalent” if $A \vdash B$ and $B \vdash A$ are derivable, I thought it made sense to rename this “propositional equivalence”. I also added a remark on a more refined notion of Lambek-equivalence where we pay attention to the categorical semantics of sequent proofs, and that there was a more refined notion of equivalence between propositions in terms of (specified) isomorphisms $S(A) \cong S(B)$ under semantics $S$. I also mentioned that all the propositional equivalences listed below the remark (coercions involving the negation operator - is ’coercion’ the right word here?) were in fact instances of this stronger notion of Lambek-equivalence.

    I also added a query box about saying string diagrams ’are’ proof nets. I feel it is not at all clear what this really means, and that we should get it sorted out. Let the query box remain as a goad for us to do this. I would like, for one thing, a reference for where this is clearly explained, if such exists.

    • CommentRowNumber38.
    • CommentAuthorUrs
    • CommentTimeFeb 7th 2014
    • (edited Feb 7th 2014)

    about saying string diagrams ’are’ proof nets.

    Todd, we just discussed this in another thread. I am fine with your complaint that the word “are” here is inaccurate. There is no further reference I’d have up my sleeve that would say this and which we haven’t already talked about. Please, just remove the “are” as we did elsewhere, and replace it by something weaker.

    That said, and with all respect for being super-accurate in nLab entries even where they just point to other entries, let me express one feeling: I think if there are two fields of mathematics which are so very similar and nevertheless coexist without many researchers in one field even knowing about the other – almost like those parallel universes that people imagine are “just a millimeter apart” – then I feel the first priority is to stress the similarity, and the technical fine detail might wait for a second for the important broad message to come through. There are formal Definition-environments on the nLab to serve the purpose of indicating that what follows is a formal statement. Not every line in the Idea-sections and the Related-entries section can be or even should be formal like that. Before the reader new to the topic should know all the differences between proof nets and string diagrams, I think the priority is to first highlight the so very cool fact that these apparently independent developments are in fact essentially the same. Up to some difference that is rather more attributable to the history of humans conceiving of the concept, than of the platonic concept itself.

    But, sure, we should also have detailed discussion, as you already started to provide, of all that fine detail that makes the concepts be a bit different.

    • CommentRowNumber39.
    • CommentAuthorTodd_Trimble
    • CommentTimeFeb 7th 2014

    Okay, sounds good Urs. I’ll make a modification. If anyone knows of other places where something similar was said, I can make modifications there as well.

    I think we mostly see eye to eye, then, but let me express just one counter-feeling: that besides highlighting very cool similarities between different research areas, we should probably also beware of exaggerating similarities. Not because I’m a pedant or a stickler, but because people should also be able to come to the nLab with a feeling of confidence, and exaggerations tend to erode confidence. (It would be fine IMO if one were to say something like, “Exaggerating only slightly, we could say that…” – because not everyone will be quick or knowledgeable enough to delineate the formal from the informal – the nLab is really quite a mix you know, with the formally correct not always identified as such.)

    • CommentRowNumber40.
    • CommentAuthorTobyBartels
    • CommentTimeFeb 8th 2014

    Todd, can you give an example of two propositions in linear logic that are propositionally equivalent but not Lambek-equivalent?

    • CommentRowNumber41.
    • CommentAuthorNoam_Zeilberger
    • CommentTimeFeb 8th 2014

    You have $A \oplus A \dashv\vdash A \dashv\vdash A \& A$ for any $A$, but in general these are not isomorphisms.

    • CommentRowNumber42.
    • CommentAuthorTodd_Trimble
    • CommentTimeFeb 8th 2014

    Noam just gave two. If you want just MLL, then one has $((A \multimap 1) \multimap 1) \multimap 1 \vdash A \multimap 1$ and $A \multimap 1 \vdash ((A \multimap 1) \multimap 1) \multimap 1$ (the first is interpreted as the dual into $1$ of a double dual embedding; the second as a double dual embedding applied to a dual $A \multimap 1$) whose composite is not the identity on the triple dual.
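
    Spelled out semantically (a sketch, writing $A^* := A \multimap 1$ and $d_A : A \to A^{**}$ for the double-dual map): the two sequents are interpreted by $(d_A)^* : A^{***} \to A^*$ and $d_{A^*} : A^* \to A^{***}$. The composite $(d_A)^* \circ d_{A^*}$ is the identity on $A^*$ (the triple-dual lemma), but the composite $d_{A^*} \circ (d_A)^*$ on $A^{***}$ is in general only an idempotent, not the identity.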

    • CommentRowNumber43.
    • CommentAuthorUrs
    • CommentTimeFeb 8th 2014
    • (edited Feb 8th 2014)

    Todd, suppose Melliès were to put the following paragraph from the introduction of his article (pdf) into the nLab. Which bits of it would you criticize for being misleadingly imprecise?

    My interest was reinforced by a discussion with Yves Lafont, who revealed to me that multiplicative proof-nets, and more generally, his own notion of interaction nets [32] are specific instances of a graphical notation invented by Roger Penrose [44, 45] to manipulate morphisms in monoidal categories; and that this notation is itself connected to the works by Jean Bénabou on bicategories [4], by Ross Street on computads [49], and by Albert Burroni on polygraphs and higher-dimensional rewriting [13]. Moreover, André Joyal and Ross Street published at about the same time two remarkable papers [26, 27] devoted to braided monoidal categories and string diagrams.

    Since multiplicative proof-nets are instances of string diagrams…

    He goes on to say he wants to talk about the non-multiplicative fragment, too. But by what “we” (it was Mike in revision 8, see the last paragraph of the greenish bits) have at string diagram, “we” were already happy to call that more general notion “string diagram”, too.

    • CommentRowNumber44.
    • CommentAuthorTodd_Trimble
    • CommentTimeFeb 8th 2014
    • (edited Feb 8th 2014)

    I would simply ask him to explain what he means exactly. I said in #37 that I wasn’t clear on what this should mean exactly. But I’m not saying there are insurmountable problems.

    Okay… now that I’ve had a little time to reflect on this, I’d say that probably the tricky or potentially misleading part is that there are two tensor products $\otimes$ and $\parr$ that have to interact in a controlled way, and the fact of the matter is that this implies proof nets are not composed in the same way that string diagrams (which involve one monoidal product) are composed. The difference is that proof nets have to be hooked together one string at a time (see the specific form of the cut rule in linear logic). IIRC, this is actually explored in some detail in the paper by Blute, Cockett, Seely, Trimble. So, you could say proof nets ’are’ string diagrams, but with a really major caveat.
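
    The cut rule in question cuts exactly one formula at a time:

    $\dfrac{\Gamma \vdash A, \Delta \qquad \Gamma', A \vdash \Delta'}{\Gamma, \Gamma' \vdash \Delta, \Delta'}\ (\text{cut})$

    so composing two proof nets plugs a single wire, whereas general string-diagram composition may plug several wires at once.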

    So sorry I didn’t mention this before. My only excuse is that this stuff was really current in my head about 20 years ago, and it takes a little while in some cases to recall what I should have mentioned to you earlier. I can make the mathematical point above more explicit in the various nLab articles that mention “proof nets \leftrightarrow string diagrams”, but I’d be obliged if you could tell me which articles those are (or not – I can probably find them all myself in time, no biggie).

    Edit: I think that my slowness is also accounted for by the fact that I usually don’t think in terms of two monoidal products $\otimes$ and $\parr$, but in terms of $\otimes$ and $\multimap$, which I find more natural and intuitive.

    • CommentRowNumber45.
    • CommentAuthorUrs
    • CommentTimeFeb 8th 2014
    • (edited Feb 8th 2014)

    Regarding the two tensor products: is that not taken care of just by realizing that the “full fragment” of linear logic is interpreted in what are actually bi-monoidal categories?

    There are plain monoidal categories, and string diagrams for them. Then there are monoidal categories with various extra structures (closed structure, an endofunctor, a second tensor product, etc.). It seems to be common practice, reflected in the old nLab entry on string diagrams, to call the graphical syntax for all these cases still “string diagrams”, even if that extra structure leads to the “strings” themselves being accompanied by further decorations or by boxes, etc.

    Is that part of your complaint? Do you suggest one should not speak of “string diagram” anymore in these more general cases?

    Regarding editing the nLab entries: I’d think that instead of adding a detailed technical discussion to each and every entry which mentions the close relation between proof nets and string diagrams, there should be one single such discussion in the entry on proof nets, in a section “Properties – Relation to string diagrams”.

    This way, any reader being pointed to that relation, either now or in the future from other entries which may not even exist yet, will surf to proof net and find it all explained there.

    • CommentRowNumber46.
    • CommentAuthorTodd_Trimble
    • CommentTimeFeb 8th 2014
    • (edited Feb 8th 2014)

    Bimonoidal categories meaning rig categories? Rig categories are a bit different (the distributive law is non-linear).

    There is a relevant fragment of MLL called “weakly distributive categories”, originally introduced by Cockett and Seely, which has the two products $\otimes$ and $\parr$ (but no $\neg$ in the definition). Here ordinary distributivity (of conjunction over disjunction) is weakened to a mere monoidal strength

    $(A \parr B) \otimes C \to A \parr (B \otimes C)$

    The paper by Blute, Cockett, Seely, Trimble explores proof nets for this fragment and for full MLL.
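
    For the record, in the symmetric case there is a mirror-image strength as well (which of the two is called ‘left’ or ‘right’ varies between papers):

    $A \otimes (B \parr C) \to (A \otimes B) \parr C$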

    by further decorations or by boxes

    Not sure which boxes you meant here. We had recently been talking about boxes for $!$. I just took a look at string diagram to see what was said there exactly that would be more immediately relevant to what we’re talking about now with $\otimes$ and $\parr$ (not a whole lot). There is some reference to material on John Baez’s page, and I seem to remember some probably related discussion at the Café some years ago which I should probably revisit, but which involved $\otimes$ and $\multimap$ I guess, and may have involved Mike Stay. There was also a reference to a paper by François Lamarche on “essential nets” which may or may not be immediately relevant to what we’re discussing now; I’d have to check to be sure. The phrase “string diagram” does not occur in his paper.

    Anyway, to answer your question

    Do you suggest one should not speak of “string diagram” anymore in these more general cases?

    my current position is that it might be okay to do this, but it’s potentially pretty confusing, because these matters are quite sensitive to details of presentation. In other words, there are several possibilities for how one might ’translate’ proof nets into the language of string diagrams (ah! maybe that’s what you meant about semantics in string diagrams before) – one might use $\parr$ and $\otimes$ as ’primitive’, or $\multimap$ and $\otimes$ as primitive, and just how one handles the details is a little bit different, depending. In one of these approaches, maybe we compose one string at a time; in the other, the only primitive monoidal product in sight is $\otimes$ and there it’s okay to compose multiple strings at a time, but bits of the diagram have to be cordoned off in some way, and in none of these approaches is the translation particularly straightforward.

    So to respond to you more definitely: while it would be my preference to make a distinction between proof net and string diagram (because I think it would be clearer that way), I would be okay with referring to proof nets as “string diagrams” of a sort as you would prefer, provided that the sense is made definite. Of course, others should feel free to chime in on this as well.

    there should be one single such discussion

    Yep, that makes sense to me too.

    • CommentRowNumber47.
    • CommentAuthorTodd_Trimble
    • CommentTimeFeb 8th 2014

    Okay, I’m looking again at the Baez reference. I think in addition to the course notes in the quantum computation seminar (it’s Week 5 that is most immediately relevant), there is also the Rosetta Stone paper itself, where diagrams for closed symmetric monoidal categories are given in terms of suitable string diagrams with clasps and bubbles (pages 30-33). I think it’s fair to say that the presentations in both those places are pretty sketchy, but my strong feeling is that it all definitely ’works’, and that one can write down the precise rules with a little effort. So tentatively, I can live with making a statement that proof nets, at least for smc categories and probably beyond, can be translated into the language of string diagrams with bubbles, and maybe everyone’s happy. How does that sound to you?

    I’m not clear yet on whether this formalism matches the formalism of Lamarche; I’d have to look more closely at his paper. Is this something you’ve already looked at closely?

    • CommentRowNumber48.
    • CommentAuthorTobyBartels
    • CommentTimeFeb 9th 2014

    Thanks for the example, Noam!

    Todd, it is a bit of a coincidence that you can write ‘It should be observed that all equivalences $A \equiv B$ listed below are in fact Lambek equivalences.’, since I might well have listed the idempotence of the additive connectives but didn't.

    • CommentRowNumber49.
    • CommentAuthorTodd_Trimble
    • CommentTimeFeb 9th 2014

    Maybe your taste in equivalences is instinctively good, Toby. :-)

    • CommentRowNumber50.
    • CommentAuthorNoam_Zeilberger
    • CommentTimeFeb 9th 2014

    Re: #43 and #44, I don’t want to put words into Paul-André’s mouth, but he is well aware that proof nets are not quite the same thing as string diagrams. As I understand it, his aim is rather to highlight the similarities in order to better understand the differences. That’s how I read this passage for example:

    My ambition in writing this elementary tutorial is to demonstrate in a few pictures that categorical semantics is also of a diagrammatic nature. Proof-nets were invented by a genial mind, but they remain an ad’hoc and slightly autarchic artefact of proof-theory. On the other hand, string diagrams flourished in the middle of algebra. Categorical semantics is precisely here to connect the two subjects, with benefits on both sides: in logic and computer science, as well as in categorical algebra.

    With that said, a big part of Melliès’ research is precisely about how to get two monoidal products to “interact in a controlled way”, which also involves moving beyond the approach of classical linear logic. If you want to get a better understanding of his views, I recommend his more recent paper, A micrological study of negation.

    • CommentRowNumber51.
    • CommentAuthorUrs
    • CommentTimeFeb 9th 2014

    Hm, that passage seems to say that string diagrams are categorical semantics for proof nets.

    • CommentRowNumber52.
    • CommentAuthorNoam_Zeilberger
    • CommentTimeFeb 9th 2014

    I’m not sure where you read that, and I doubt he would put it that way. Here’s a passage from the introduction to his “Gallery” which perhaps gets the point across better.

    One main ambition of my work is to understand how formal proofs should be written down. This simple question is related to a myriad of fascinating problems at the interface of proof theory, games semantics, n-dimensional algebra, and mathematical physics.

    For instance, I like to think that « game semantics » is not just another « semantics » for proofs. I prefer to see it as a « syntax ». Of course, this « game syntax » is a new kind of syntax, which does not look like the previous ones – especially if one thinks of Frege’s ideography, Gentzen’s sequent calculus, or even Girard’s proof-nets.

    This new brand of diagrammatic syntax is at the same time a « combinatorics » of proofs, akin to the familiar structure of topological knots in mathematics. The wonderful thing is that we have at our disposal, today, a large amount of tools coming from algebra and mathematical physics, to analyze that kind of diagrammatic situation.

    • CommentRowNumber53.
    • CommentAuthorUrs
    • CommentTimeFeb 9th 2014
    • (edited Feb 9th 2014)

    Todd, re #47

    How does that sound to you?

    Sounds good to me. Only that I’ll tend to allow myself to drop the “with bubbles” from “string diagrams with bubbles”.

    I had tried to ask this in #45 already and I am not sure if your #46 was meant as the answer: I was wondering if all the discussion we had was really just about specifying fragments. You seemed to have said in #44 that a (or the?) difference between proof nets and string diagrams is that proof nets have a second tensor product. But to me the graphical calculus for a monoidal category that is equipped with a second tensor product (and possibly further structure, too) is still a “string diagram”. (And not just to me, it seems, when I look around in the literature. Naturally so, I would say, because even with all decorations and labels and boxes and clasps and whatnot, at the core there are strings being drawn.)

    But the same attitude that makes one keep in mind different “fragments” of the theory of monoidal categories when speaking about string diagrams is at work when we say “linear logic” and keep in mind that this need not necessarily mean Girard’s original full set of axioms, but might just mean a small fragment of it. In fact you were the one to stress this to me a few weeks back.

    This understood (and it would seem to me that this is uncontroversial and common understanding, but correct me if I am wrong) then if there is a difference between proof nets and string diagrams that is in the concept instead of just in the convention of bringing it to paper, then – hope that’s not a shock now – I haven’t grasped that yet.

    Imagine we make contact with an alien life form. Turns out they are doing maths, but it looks really different. They use flashing lights for definitions and dance their proofs. But while it looks really different, the concepts of maths don’t vary through the physical universe, and after a while we figure out that when they dance in circles then they are doing linear algebra. We’d be well advised to conclude that “dancing in circles this way IS linear algebra” instead of declaring that there are two different fields of mathematics here. Instead there are two different ways of expressing a single concept.

    That’s what seems to me to be happening with linear logic and its proof nets on the one hand and monoidal category theory and its string diagrams on the other. It’s part of that big great insight which you recently pointed out to me goes back to Lambek and his student Szabo, “computational trinitarianism” if one likes. It’s a great thing, and that’s why Melliès draws plenty of triangular diagrams in his introduction to highlight how this “trinitarianism” is at work. In fact in his introduction he attributes this trinitarianism to

    • Joachim Lambek, Phil Scott, Introduction to Higher Order Categorical Logic, Cambridge Studies in Advanced Mathematics Vol. 7, Cambridge University Press, 1986.

    Anyway, you know this – you taught me much of what goes into this. But this to answer your question above, as to how this sounds to me: this sounds good to me! :-)

    • CommentRowNumber54.
    • CommentAuthorUrs
    • CommentTimeFeb 9th 2014
    • (edited Feb 9th 2014)

    Noam, regarding

    I’m not sure where you read that,

    In the quote that you provide in #50. How is this not what it says?

    But I’ll not follow up further on this, and if you disagree I’ll just leave it at that – lest we start sounding like theologians fighting over text exegesis. I am concerned about this to the extent that Todd complains about edits I make on the nLab, but now that we seem to have that sorted out, I think I need to turn my attention to other issues.

    • CommentRowNumber55.
    • CommentAuthorNoam_Zeilberger
    • CommentTimeFeb 9th 2014

    Urs, you are right that this is kind of theological and not really worth debating, but just to clarify, if by “string diagrams are categorical semantics for proof nets” you mean

    String diagrams and proof nets are roughly the same ’idea’, namely, a diagrammatic language for morphisms/proofs. This connection is useful because string diagrams have a well-understood categorical semantics in monoidal categories—although conversely there are also some aspects of proof nets/linear logic which don’t quite translate to traditional string diagrams, and it is an interesting question of how to properly account for them.

    …then I think that you and Melliès are in agreement! (This is also what I understood as Todd’s position.)

    • CommentRowNumber56.
    • CommentAuthorTodd_Trimble
    • CommentTimeFeb 9th 2014

    Okay, Urs (#53), glad it sounds good to you! I’ll get to work on it in a minute.

    Well, to sum up what seems to be your position colloquially, “if it looks like a string diagram, and quacks like a string diagram, then…” But, to me, it’s not a question of “fragments” (by the way, taking a fragment of monoidal category theory per se sounds like centipede mathematics – not much meat there to begin with).

    The actual ways in which one manipulates stringy-looking diagrams can differ quite radically. The context I have in mind right now is the one from BCST (Blute-Cockett-Seely-Trimble) where there are two tensor products connected by a weak distributive law. Now the diagrams from that paper sure look an awful lot like traditional string diagrams, and have some very nice features: the $\parr$ switches and $\otimes$ switches are even treated something like the constituents of a tensor scheme/computad! That I find very nice indeed. The all-important difference though is in the rules for composition. They are simply not composed as string diagrams are. So they really do look like string diagrams, but operationally they’re really not the same.

    The dancing aliens analogy reminded me of how there are books out there that like to make analogies between say quantum field theory and Buddhist philosophy (The Tao of Physics, The Dancing Wu Li Masters, etc.). When I was younger, my mom, knowing of my twin interests in yoga, Buddhism, etc. on one hand and mathematics and science on the other, would sometimes buy that kind of book for me for my birthday. And I used to enjoy reading the kinds of unifying visions on offer there. But I suppose the intellectual pitfalls of this sort of thinking are clear enough. (Also there is often an implicit suggestion in books by Fritjof Capra and the like that if only we westerners had been open sooner to the embracing holistic wisdom of Taoism, etc., that we would have seen or arrived at the insights of modern physics much sooner – that on some level they already knew all that stuff.)

    Well, let’s not bicker about this any more. You have too much to do, Urs – and frankly I am (as is probably anyone who pays attention to the nLab) in awe of all that you do here: both your powers of unification (not of the Capra variety obviously!) and your speed of production. I can understand why you might be annoyed by my taking up your time at the nForum with relatively piddling stuff. So, let me instead get back to work, and do my own little part here. :-)

    • CommentRowNumber57.
    • CommentAuthorDavid_Corfield
    • CommentTimeFeb 10th 2014

    the so very cool fact that these apparently independent developments are in fact essentially the same. Up to some difference that is rather more attributable to the history of humans conceiving of the concept, than of the platonic concept itself.

    Go with Hegel and you can close that gap. The concept itself is not a static thing, but a developing thing, along some logically unfolding path. This path is enacted in the course of human history, albeit in a less than direct way. Lakatos at his most Hegelian wrote

    The Hegelian language, which I use here, would I think, generally be capable of describing the various developments in mathematics. (It has, however, its dangers as well as its attractions.) The Hegelian conception of heuristic which underlies the language is roughly this. Mathematical activity is human activity. Certain aspects of this activity - as of any human activity - can be studied by psychology, others by history. Heuristic is not primarily interested in these aspects. But mathematical activity produces mathematics. Mathematics, this product of human activity, ‘alienates itself’ from the human activity which has been producing it. It becomes a living, growing organism, that acquires a certain autonomy from the activity which has produced it; it develops its own autonomous laws of growth, its own dialectic. The genuine creative mathematician is just a personification, an incarnation of these laws which can only realise themselves in human action. Their incarnation, however, is rarely perfect. The activity of human mathematicians, as it appears in history, is only a fumbling realisation of the wonderful dialectic of mathematical ideas. But any mathematician, if he has talent, spark, genius, communicates with, feels the sweep of, and obeys this dialectic of ideas.

    Now heuristic is concerned with the autonomous dialectic of mathematics and not with its history, though it can study its subject only through the study of history and through the rational reconstruction of history. (Lakatos, Proofs and Refutations 1976, 145-6)

    • CommentRowNumber58.
    • CommentAuthorUrs
    • CommentTimeJul 12th 2014
    • (edited Jul 12th 2014)

    As you may have seen, Daniel Murfet has a new article on linear logic

    • Daniel Murfet, Logic and linear algebra: an introduction (arXiv:1407.2650)

    As the title says, it’s an introduction with maybe little news for the experts, but it does contain comments on some more modern concepts that one rarely sees in other texts on linear logic, such as for instance derived categories.

    Indeed, Daniel Murfet has a background in topological quantum field theory, Landau-Ginzburg models and stuff. By coincidence he had come out last February with a suggestion relating that to linear logic in arXiv:1402.4541 around the same time when I had written “Quantization via linear homotopy types”, and that’s how we got into contact about this. (In his latest article he remarks that in my text a discussion of cut elimination is missing.)

    In this spirit, it seems, is also the plan of a conference at ESI next year “Higher Topological Quantum Field Theory and Categorical Quantum Mechanics” (announcement pdf).

    Anyway. I haven’t yet had time to read the article in full. But I added pointers to it to the References at linear logic and at geometry of interaction.