Good news: I finished a drafty set of rules for $\otimes$, $1$, and $\multimap$ with tensor projections. Don’t set your hopes too high; it’s kind of a mess. Now I just have to figure out how to typeset this on a new page…

]]>Right. I was thinking about whether it’d be convenient to make it not sugar. Not convenient enough for me.

The $(x_{(1)},x_{(3)})$ could be sugar for (the same thing as) $(x_{(1)},x_{(3)} | \epsilon(x_{(2)}))$. Sugar doesn’t need to have good metatheoretical properties.

]]>Yes, but the latter is just sugar for $(x_{(1)(1)},x_{(1)(2)},x_{(2)})$ or $(x_{(1)},x_{(2)(1)},x_{(2)(2)})$, and I also didn’t try to make that desugaring precise.
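(Made precise, under my gloss of the notation: the two candidate desugarings correspond to the two ways of iterating the comultiplication $\delta$, which coassociativity identifies:

$(\delta \otimes \mathrm{id}_C)\circ\delta = (\mathrm{id}_C \otimes \delta)\circ\delta \;:\; C \to C\otimes C\otimes C.)$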

]]>Odd: if $C$ is a comonoid, in the context $x:C$, you’d allow $(x_{(1)},x_{(2)})$ and $(x_{(1)},x_{(2)},x_{(3)})$ but not $(x_{(1)},x_{(3)})$, right? Makes type checking yet more complicated. I’m not going to handle that.

]]>As I always say to my calculus students, the more you practice it, the better you’ll get at it, and the more you’ll like it.

]]>I see it clearly now. I was guessing they were equal, I just wasn’t sure. You see? This is why I don’t like category theory much. I’m bad at it. :)

I’m going to put off the additive connectives for now. I have a plan, but the bookkeeping is more complicated, so I should work out the details of MILL first.

]]>By definition, $[g_1\circ f, g_2\circ f]$ is the unique morphism that produces $g_1\circ f$ and $g_2\circ f$ when composed with the two projections $\pi_1$ and $\pi_2$ respectively. But $[g_1,g_2]\circ f$ also has this property, since $[g_1,g_2]$ produces (by definition) $g_1$ and $g_2$ when composed with $\pi_1$ and $\pi_2$ respectively, and composition is associative. Therefore, they are equal.
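In symbols, for $i = 1, 2$:

$\pi_i \circ ([g_1,g_2]\circ f) = (\pi_i\circ[g_1,g_2])\circ f = g_i\circ f,$

so $[g_1,g_2]\circ f$ satisfies the defining property of $[g_1\circ f,\,g_2\circ f]$, and uniqueness gives the equality.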

]]>Probably? If I knew I wouldn’t have to ask. This is $[g_1,g_2] \circ f$ versus $[g_1 \circ f,g_2 \circ f]$. Where brackets are pairing for $\&$, $f:A \to B_1 \otimes B_2$, yada yada. I know they’re the same in a cartesian category where $\otimes = \& = \times$, but I don’t know why, in categorical terms.

So you’re saying they should be equal. OK.

]]>Maybe I’m misunderstanding your notation, but doesn’t that equality simply follow from the naturality of the universal property of the cartesian product?

]]>Actually, I’m not sure about the semantics of a similar example, where the generator can be inside or outside the pair:

$g_1:(B_1,B_2) \to C \qquad g_2:(B_1,B_2) \to D$

$[g_1(f_{(1)}(x),f_{(2)}(x)),g_2(f_{(1)}(x),f_{(2)}(x))]\;:\;C \& D$

Dispensing with tensor projections, this could be written in two ways:

$let\;(b_1,b_2) = f(x)\;in\;[g_1(b_1,b_2),g_2(b_1,b_2)]$

$[let\;(b_1,b_2) = f(x)\;in\;g_1(b_1,b_2),\;let\;(b_1,b_2) = f(x)\;in\;g_2(b_1,b_2)]$

Are these actually equal? Maybe sometimes implicit $\otimes$-elim is just plain wrong, not tricky.

]]>For your entertainment, I’ll tell you about a tricky situation I ran into involving your tensor projections and generators, combined with $\&$-intro.

For both examples,

$x:A \qquad f:A \to (B_1,B_2)$

Example 1:

$g_1:B_1 \to C \qquad g_2:B_1 \to D \qquad h:B_2 \to E$

$([g_1(f_{(1)}(x)),g_2(f_{(1)}(x))],h(f_{(2)}(x)))\;:\;(C \& D) \otimes E$

Example 2:

$g:(B_1,B_2) \to C \qquad h:A \to D$

$[g(f_{(1)}(x),f_{(2)}(x)),h(x)]\;:\;C \& D$

The problem is whether you use the generator rule on $f$ inside or outside the $\&$-intro. In example 1, you have to use it outside; in example 2, you have to use it inside the left component. But processing both examples left to right, the $f$ is first encountered in the left component, because the $\otimes$-elim is implicit.

I’m not sure if I’m going to handle this.

I guess there’s basically the same problem with $\oplus$-elim.

]]>Well, in principle both would be nice to have – one for simplicity of exposition, the other for convenience of practical use. I would say start with whichever is easier for you.

]]>To handle lollipop (internal-hom), do you want unidirectional inference with annotations on lambdas, or bidirectional typing with an optional annotation/cut rule? And do you want to require eta-long terms?

]]>Then that’s not stable under substitution.

]]>How do you distinguish implicitly between weakening a variable $x$ and weakening the value $f(x)$?

Good point. I guess by not implicitly doing anything else, like call $f$. So you’d still need explicit counit.

In other words $x:C \vdash () = (| \epsilon(x)) \neq (| \epsilon(f(x)))\;:\;()$.

]]>I think that semantically, having unnatural contraction and weakening is equivalent to saying that every type is equipped with a specified cocommutative comultiplication. So there’s some overlap in the intended semantics, but my system is more flexible because it doesn’t require any or all types to have comultiplications, let alone cocommutative ones.
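(Spelled out, in my paraphrase: each type $A$ would be interpreted with chosen maps

$\delta_A : A \to A \otimes A \qquad \epsilon_A : A \to I \qquad \sigma_{A,A}\circ\delta_A = \delta_A,$

where $\sigma$ is the symmetry, with $\delta_A$ counital and coassociative.)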

I can’t imagine how implicit weakening could be unnatural. How do you distinguish implicitly between weakening a variable $x$ and weakening the value $f(x)$?

I’d be rather interested if you can think of a good way to add internal-homs to my system, but I’m not planning to spend any time on such an extension myself. (I am, however, planning to run a group at the ACT 2020 summer school aiming to extend this system to monoidal bicategories.)

]]>It doesn’t have tensor projections, like your system does. The non-natural contraction seems different from implicit comultiplication because the copies aren’t distinguished. So I guess there’s not really much to learn for extending your system?

I suppose it’s a thought to have completely implicit, non-natural weakening, instead of your counit notation. Does that really work? What do you think of that style? Oh wait, that seems ambiguous with your sugar for multiple comultiplications. Or maybe not, because of the counital and coassociative axioms!

I got distracted from extending your system. I tried looking up a ready-made solution but didn’t find it. I think I could sit down and work out something probably right, but I haven’t. I’m not that excited about it; I can’t tell if you are.

]]>I just read Hasegawa’s thesis, in which among other things he gives a multiple-conclusion type theory where both commas represent the same tensor product. (Actually he writes the right-hand commas as $\otimes$, but he considers them to be definitionally associative, so they act like a comma.) His setup allows contraction and weakening, but not naturally: semantically it’s like a linear/nonlinear adjunction but without the right adjoint. He also has a sort of multiple-domain, multiple-codomain internal-hom, although it’s somewhat different because it always lands in the nonlinear world.

]]>I think unique derivations could be regained with a different style of rules motivated by type checking. That’s what I’m planning. There’s a nice way to do linear type checking by passing a state that keeps track of which variables were used. I think it can be modified to handle tensor projections too.
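The state-passing idea can be sketched like this (a minimal, hypothetical illustration in Python; the term encoding and function names are mine, not part of the system under discussion):

```python
# Hypothetical sketch: linear usage checking by threading a state (the set
# of variables consumed so far) through the term, left to right.
# Terms: ("var", x) | ("pair", s, t) | ("app", f, t).

def check(term, used):
    """Return the set of variables consumed after checking `term`,
    given the set `used` already consumed; reject non-linear use."""
    tag = term[0]
    if tag == "var":
        x = term[1]
        if x in used:
            raise TypeError(f"variable {x!r} used more than once")
        return used | {x}
    if tag == "pair":  # tensor intro: both components thread one state
        return check(term[2], check(term[1], used))
    if tag == "app":   # a generator applied to its argument
        return check(term[2], used)
    raise ValueError(f"unknown term tag {tag!r}")

# (x, f(y)) is linear; (x, x) would raise TypeError.
check(("pair", ("var", "x"), ("app", "f", ("var", "y"))), set())
```

The point of threading the state rather than splitting the context up front is that it fixes a deterministic order of variable consumption, which is what would need to be adapted to decide where an implicit $\otimes$-elim fires for a tensor projection.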

]]>Hmmm… I suppose you would want to do that. Well, I guess you could try to incorporate it in the generator rule somehow… but probably that rule is already about as overextended as possible, and the best thing to do would be to retreat from terms-have-unique-derivations. Maybe it would work…

]]>How so?

]]>