added pointer (here and at *calculus of constructions*) to:

- Zhaohui Luo,
*Computation and Reasoning – A Type Theory for Computer Science*, Clarendon Press (1994) [ISBN:9780198538356, pdf]

fixed author name and added more hyperlinks for this item

- Robert Constable,
*The Triumph of Types: Creating a Logic of Computational Reality*, lecture at *Types, Semantics and Verification*, Oregon (2011) [pdf, ConstableTriumphOfTypes.pdf:file]

I noticed that the list of references here was lacking some glue where it jumped from Russell right to identity types. I have now filled in (here) a pointer to:

- Per Martin-Löf,
*Constructive Mathematics and Computer Programming*, Studies in Logic and the Foundations of Mathematics Volume 104, 1982, Pages 153-175 (doi:10.1016/S0049-237X(09)70189-2)

But there is lots of room left for the eager type theorist to further expand and beautify the discussion of the literature in this entry…

Probably, but I don’t know where.

I see, thanks. That’s indeed recognizable, with some parsing effort. Did he ever state it in a more streamlined form?

Page 94, “I-elimination”.

Where does he state it clearly? In “*An intuitionistic theory of types: predicative part*” it appears in mere prose in Sec 1.7, and in some baroque form on p. 109.

Martin-Löf’s version isn’t recognizable?

I have added a pointer to:

- Bengt Nordström, Kent Petersson, Jan M. Smith, §8.1 of:
*Programming in Martin-Löf’s Type Theory*, Oxford University Press (1990) [webpage, pdf, NordstromPeterssonSmith-TypeTheory.pdf:file]

which may be the earliest reference I have seen that actually states the J-rule in recognizable form.
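For reference, here is the rule in question in roughly modern notation (my transcription, not a quotation from any of the texts above):

$$\frac{p \,\colon\, Id_A(a,b) \qquad x \colon A \,\vdash\, c(x) \,\colon\, C(x,x,refl(x))}{J(c,p) \,\colon\, C(a,b,p)}$$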

added pointer to:

- Harley Eades,
*Type Theory and Applications*, 2012 (pdf)

OK, I put it up at tensor projection. Please take a look and tell me what you think.

Good news: I finished a drafty set of rules for $\otimes$, $1$, and $\multimap$ with tensor projections. Don’t set your hopes too high; it’s kind of a mess. Now I just have to figure out how to typeset this on a new page…

Right. I was thinking about whether it’d be convenient to make it not sugar. Not convenient enough for me.

The $(x_{(1)},x_{(3)})$ could be sugar for (the same thing as) $(x_{(1)},x_{(3)} | \epsilon(x_{(2)}))$. Sugar doesn’t need to have good metatheoretical properties.

Yes, but the latter is just sugar for $(x_{(1)(1)},x_{(1)(2)},x_{(2)})$ or $(x_{(1)},x_{(2)(1)},x_{(2)(2)})$, and I also didn’t try to make that desugaring precise.
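If it helps to make the desugaring concrete: in the cartesian case, where every object is canonically a comonoid (duplication as comultiplication, discarding as counit), the two nestings and the middle-dropping projection can be sketched in Haskell. All names here are illustrative, not from any proposed syntax:

```haskell
-- Cartesian sketch: every type is a comonoid with duplication and discarding.
comult :: a -> (a, a)
comult x = (x, x)

counit :: a -> ()
counit _ = ()

-- The two candidate desugarings of (x_(1), x_(2), x_(3)):
leftNested :: a -> ((a, a), a)
leftNested x = let (y, z) = comult x in (comult y, z)

rightNested :: a -> (a, (a, a))
rightNested x = let (y, z) = comult x in (y, comult z)

-- (x_(1), x_(3)), read as (x_(1), x_(3) | eps(x_(2))): discard the middle copy.
skipMiddle :: a -> (a, a)
skipMiddle x = let (x1, (x2, x3)) = rightNested x in counit x2 `seq` (x1, x3)

main :: IO ()
main = print (skipMiddle (7 :: Int))
```

Coassociativity is exactly what makes the two nestings agree after flattening; in a genuinely linear setting the `counit` step is the part that has to be made explicit.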

Odd: if $C$ is a comonoid, in the context $x:C$, you’d allow $(x_{(1)},x_{(2)})$ and $(x_{(1)},x_{(2)},x_{(3)})$ but not $(x_{(1)},x_{(3)})$, right? Makes type checking yet more complicated. I’m not going to handle that.

As I always say to my calculus students, the more you practice it, the better you’ll get at it, and the more you’ll like it.

I see it clearly now. I was guessing they were equal, I just wasn’t sure. You see? This is why I don’t like category theory much. I’m bad at it. :)

I’m going to put off the additive connectives at first. I have a plan, but the bookkeeping is more complicated, so I should work out the details of MILL first.

By definition, $[g_1\circ f, g_2\circ f]$ is the unique morphism that produces $g_1\circ f$ and $g_2\circ f$ when composed with the two projections $\pi_1$ and $\pi_2$ respectively. But $[g_1,g_2]\circ f$ also has this property, since $[g_1,g_2]$ produces (by definition) $g_1$ and $g_2$ when composed with $\pi_1$ and $\pi_2$ respectively, and composition is associative. Therefore, they are equal.
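In the cartesian case this fusion law can also be checked concretely. In Haskell, `(&&&)` from `Control.Arrow` plays the role of the pairing bracket; this is a sanity check on sample inputs, not a proof:

```haskell
import Control.Arrow ((&&&))

-- Pairing fusion law: (g1 &&& g2) . f should equal (g1 . f) &&& (g2 . f).
f :: Int -> Int
f = (* 2)

g1, g2 :: Int -> Int
g1 = (+ 1)
g2 = subtract 3

lhs, rhs :: Int -> (Int, Int)
lhs = (g1 &&& g2) . f        -- [g1, g2] . f
rhs = (g1 . f) &&& (g2 . f)  -- [g1 . f, g2 . f]

main :: IO ()
main = print (all (\x -> lhs x == rhs x) [-10 .. 10])
```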

Probably? If I knew I wouldn’t have to ask. This is $[g_1,g_2] \circ f$ versus $[g_1 \circ f,g_2 \circ f]$. Where brackets are pairing for $\&$, $f:A \to B_1 \otimes B_2$, yada yada. I know they’re the same in a cartesian category where $\otimes = \& = \times$, but I don’t know why, in categorical terms.

So you’re saying they should be equal. OK.

Maybe I’m misunderstanding your notation, but doesn’t that equality simply follow from the naturality of the universal property of the cartesian product?

Actually, I’m not sure about the semantics of a similar example, where the generator can be inside or outside the pair:

$g_1:(B_1,B_2) \to C \qquad g_2:(B_1,B_2) \to D$

$[g_1(f_{(1)}(x),f_{(2)}(x)),g_2(f_{(1)}(x),f_{(2)}(x))]\;:\;C \& D$

Dispensing with tensor projections, this could be written in two ways:

$let\;(b_1,b_2) = f(x)\;in\;[g_1(b_1,b_2),g_2(b_1,b_2)]$

$[let\;(b_1,b_2) = f(x)\;in\;g_1(b_1,b_2),\;let\;(b_1,b_2) = f(x)\;in\;g_2(b_1,b_2)]$

Are these actually equal? Maybe sometimes implicit $\otimes$-elim is just plain wrong, not tricky.
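For what it’s worth, in the cartesian (nonlinear) reading the two desugarings do agree, since duplicating $f(x)$ is harmless there; the subtlety is entirely in the linear reading. A quick Haskell sanity check of the cartesian case, with all names illustrative:

```haskell
-- A sample f : A -> (B1, B2) and the two g's over the pair.
f :: Int -> (Int, Int)
f x = (x + 1, x * 2)

g1, g2 :: (Int, Int) -> Int
g1 (a, b) = a + b
g2 (a, b) = a * b

-- Desugaring with the tensor-elim outside the &-intro pair:
outer :: Int -> (Int, Int)
outer x = let (b1, b2) = f x in (g1 (b1, b2), g2 (b1, b2))

-- Desugaring with a separate tensor-elim inside each component:
inner :: Int -> (Int, Int)
inner x = ( let (b1, b2) = f x in g1 (b1, b2)
          , let (b1, b2) = f x in g2 (b1, b2) )

main :: IO ()
main = print (all (\x -> outer x == inner x) [-5 .. 5])
```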

For your entertainment, I’ll tell you about a tricky situation I ran into involving your tensor projections and generators, combined with $\&$-intro.

For both examples,

$x:A \qquad f:A \to (B_1,B_2)$

Example 1:

$g_1:B_1 \to C \qquad g_2:B_1 \to D \qquad h:B_2 \to E$

$([g_1(f_{(1)}(x)),g_2(f_{(1)}(x))],h(f_{(2)}(x)))\;:\;(C \& D) \otimes E$

Example 2:

$g:(B_1,B_2) \to C \qquad h:A \to D$

$[g(f_{(1)}(x),f_{(2)}(x)),h(x)]\;:\;C \& D$

The problem is whether you use the generator rule on $f$ inside or outside the $\&$-intro. In example 1, you have to use it outside; in example 2, you have to use it inside the left component. But processing both examples left to right, the $f$ is first encountered in the left component, because the $\otimes$-elim is implicit.

I’m not sure if I’m going to handle this.

I guess there’s basically the same problem with $\oplus$-elim.

Well, in principle both would be nice to have – one for simplicity of exposition, the other for convenience of practical use. I would say start with whichever is easier for you.

To handle lollipop (internal-hom), do you want unidirectional inference with annotations on lambdas, or bidirectional typing with an optional annotation/cut rule? Do you want to require eta-long terms?
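To make the second option concrete, here is a minimal bidirectional checker sketch in Haskell. It is deliberately nonlinear (no context splitting, which is the real work for $\multimap$), and all names are illustrative, not a proposed implementation:

```haskell
-- Types: base types and lollipop (modelled here as a plain function space).
data Ty = Base String | Lolli Ty Ty deriving (Eq, Show)

-- Terms: annotations (Ann) are the switch from checking back to synthesis.
data Tm = Var String | Lam String Tm | App Tm Tm | Ann Tm Ty

type Ctx = [(String, Ty)]

-- Synthesis (inference) mode.
synth :: Ctx -> Tm -> Maybe Ty
synth ctx (Var x)   = lookup x ctx
synth ctx (Ann t a) = if check ctx t a then Just a else Nothing
synth ctx (App t u) = case synth ctx t of
  Just (Lolli a b) | check ctx u a -> Just b
  _                                -> Nothing
synth _   (Lam _ _) = Nothing  -- unannotated lambdas only check

-- Checking mode.
check :: Ctx -> Tm -> Ty -> Bool
check ctx (Lam x t) (Lolli a b) = check ((x, a) : ctx) t b
check ctx t a                   = synth ctx t == Just a

main :: IO ()
main = print (check [] (Lam "x" (Var "x")) (Lolli (Base "A") (Base "A")))
```

The annotation rule is the cut: lambdas never synthesize, so eta-long terms need no annotations at all, while redexes need one at the head.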
