Hi there. After reading Awodey’s book (Category Theory) and getting interested in type theory, I’ve started to look at the arithmetic content I teach my pupils (French high school) in a new way. For example, we are working on gcd and lcm. After proving Bézout’s identity, we came across what is called Gauss’s lemma in France: $a | bc$ and $a \wedge b = 1$ yield $a|c$. From Awodey’s book, I went on to interpret the divisibility relation as logical implication and the gcd as conjunction. But then I had to work my way towards a logical interpretation of the lemma. Along the way, I found a categorical proof of the lemma, and I would like to share it. I couldn’t find much echo on MathOverflow, hence I came here to look for some help, especially with the interpretation in terms of logic.
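(As a quick sanity check of the statement, here is a small Python sketch; the helper name is mine, just for illustration. It verifies the lemma exhaustively on a small range.)

```python
from math import gcd

def gauss_lemma_holds(a: int, b: int, c: int) -> bool:
    """Gauss's lemma: if a | bc and gcd(a, b) == 1, then a | c."""
    if (b * c) % a == 0 and gcd(a, b) == 1:
        return c % a == 0
    return True  # hypotheses not met: the implication holds vacuously

# Brute-force verification on a small range of naturals
assert all(gauss_lemma_holds(a, b, c)
           for a in range(1, 30)
           for b in range(30)
           for c in range(30))
```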
Elementary arithmetic and logic are not exactly the same thing but they share some common abstract structure.
The algebraic structure underlying classical propositional logic is the Boolean algebra. If you have some set of atomic propositions $\mathcal{A}=\{a,b,c,...\}$, you can build formulas using the conjunction $\wedge$, the disjunction $\vee$, the negation $\neg$, the false $\bot$ and the true $\top$. Write $X(\mathcal{A})$ for the set of all these formulas. Define an equivalence relation $\sim$ on $X(\mathcal{A})$ by $A \sim B$ iff for every assignment $\mathcal{D}:\mathcal{A}\rightarrow \{0,1\}$ (which specifies for each atom whether it is considered true or false), when you extend $\mathcal{D}$ to a function $X(\mathcal{A}) \rightarrow \{0,1\}$ using the usual meaning of the logical connectives, you have $\mathcal{D}(A) = \mathcal{D}(B)$. Then $X(\mathcal{A})/\sim$ is a Boolean algebra. A Boolean algebra can be defined as a complemented distributive lattice; I let you check the exact definition. The poset structure of the lattice is given by saying that $[A] \le [B]$ iff for every $\mathcal{D}$, whenever $\mathcal{D}(A)=1$, $\mathcal{D}(B)=1$.
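(This equivalence-by-assignments is easy to test mechanically; here is a hedged Python sketch, encoding formulas as functions of an assignment — the names `f1`, `f2`, `equivalent` are mine:)

```python
from itertools import product

# Two formulas over atoms a, b (represented as functions of an assignment dict)
f1 = lambda v: not (v['a'] and v['b'])          # ¬(a ∧ b)
f2 = lambda v: (not v['a']) or (not v['b'])     # ¬a ∨ ¬b  (De Morgan)

def equivalent(f, g, atoms=('a', 'b', 'c')):
    """f ~ g iff f and g agree under every assignment D: atoms -> {0, 1}."""
    return all(f(dict(zip(atoms, bits))) == g(dict(zip(atoms, bits)))
               for bits in product([False, True], repeat=len(atoms)))

assert equivalent(f1, f2)   # same class in X(A)/~
```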
Let’s say that elementary arithmetic is about the divisibility relation on natural numbers. Then you obtain a distributive lattice whose poset structure is given by the divisibility relation: $\wedge$ is the gcd, $\vee$ the lcm, $\bot$ is 1 and $\top$ is 0. It is not a Boolean algebra, because nothing plays the role of the negation. There is a Wikipedia entry on this lattice under the title “Division lattice”.
Now, you can’t do a lot with only this lattice because you also need the addition and the multiplication! So you have to combine the structure of a commutative semiring (i.e. a commutative ring without negatives) and that of a distributive lattice. Chapter 21 of the book “Semirings and their Applications” (Golan) is about “lattice-ordered semirings”, but the compatibility between the semiring structure and the lattice structure that he asks for is not the one we want. I’m not sure whether ours has a name. It would be very interesting to have an abstract characterization of the set of natural numbers as a commutative semiring which is also a distributive lattice in a compatible way, i.e. it would be nice to find all the properties you need to add to “commutative semiring + distributive lattice” to be sure that you obtain the natural numbers with their usual semiring and lattice structures.
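(One compatibility law that does hold in $\mathbb{N}$, and that any such axiomatization would presumably include, is that multiplication distributes over both gcd and lcm. A quick Python check:)

```python
from math import gcd, lcm

# Multiplication distributes over the lattice operations of (N, |)
for a in range(0, 25):
    for b in range(25):
        for c in range(25):
            assert a * gcd(b, c) == gcd(a * b, a * c)
            assert a * lcm(b, c) == lcm(a * b, a * c)
```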
Then you could maybe build a kind of sequent calculus where the assertions you prove are of the form $(a.b) | c$, for instance. You could maybe write a proof with hypotheses $a|bc$ and $a \wedge b=1$ and conclusion (root) $a|c$, in the same way that you prove $A \wedge B \vdash A$ with the sequent calculus of classical propositional logic (starting from no hypotheses). You could maybe combine some of the inference rules of intuitionistic propositional logic with some of differential linear logic (for the multiplication and addition). I’m just giving you my ideas. I don’t think there is anything like an “interpretation of elementary arithmetic in terms of logic” today, but you could maybe build a logical system to prove facts about elementary arithmetic in a way close to how you make proofs in classical propositional logic.
I hope this is not too confusing; I mentioned a lot of stuff, some of which is not very well-known (like differential linear logic), some definitions about posets, the notion of a semiring, and the end consists of my ideas about how we could answer your question satisfyingly. I’m interested in logic and algebra, and I think there are interesting ideas to explore at the intersection of the two. In the same vein, differential linear logic was about combining logic and differentiation, and it is a very interesting logical system. Your question is quite interesting. To answer it, we could try to characterize the combination of algebraic and order-theoretic structure on the natural numbers and then build a logical system as I sketched. I don’t want to work on this right now, but maybe I’ll think more about it one day.
By the way, what is your categorical proof of Gauss’s lemma?
Hi J-B. Thanks for your answer! So that you can see my proof, I uploaded a screenshot here: diagrammatic proof for Gauss’s Lemma. The multiplication $bc$ is defined as the pushout of the diagram $b \leftarrow 1 \rightarrow c$. gcd and lcm fit in a commutative diagram as depicted there. I’m pretty much convinced by my proof; I’d like your advice. I’m still lacking the logical interpretation and will follow the hints you gave in your comment. For the addition $b + c$, I was looking towards the pullback of the diagram $b \rightarrow 0 \leftarrow c$. I have not yet tried to write Bézout’s identity or the like.
There is not really any logical interpretation, I think… I was speaking about building a “logic” (more precisely, a kind of sequent calculus) specialized in proving facts of elementary arithmetic, because such a thing does not exist. Elementary arithmetic is not logic in any usual sense. (If you want, you can use the theory of Peano arithmetic in first-order classical logic, but this is very old-fashioned and not very category-theoretic. Moreover, it is not a logic per se but a theory “one level above” the logical level. These kinds of theories are just a formalization of how we do mathematics classically.)
Your diagrammatic proof is something very different from what I have in my head (although if my idea could be realized, I would be interested to see how the “proof” of Gauss’s lemma in that system would be related to your diagrammatic proof). And it is very nice that you found this proof! If you want to get a feeling for what I’m thinking about, you should first learn about sequent calculus.
I’m going to dig into this sequent calculus idea of yours and come back soon. See you
What category do you have in mind, smeden, where the multiplication $bc$ is the pushout of $b \leftarrow 1 \rightarrow c$, as you say? Because this will not be the case in the category I thought you were working in (the preorder of natural numbers in which there is a unique morphism from $m$ to $n$ just in case $m$ divides $n$, and otherwise no morphism). The pushout in that category coincides with the coproduct (as in any preorder), and is given by $\mathrm{lcm}(b, c)$.
Similarly, the pullback of $b \rightarrow 0 \leftarrow c$ in that category will not be $b + c$, as you suggest. The pullback will be $\gcd(b, c)$.
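(The universal property of the gcd as the product/pullback in this preorder can be checked directly on a small range; a Python sketch, where `divides` is my helper:)

```python
from math import gcd

def divides(m: int, n: int) -> bool:
    """m | n in N, with the convention that everything divides 0."""
    return n == 0 if m == 0 else n % m == 0

N = range(0, 30)
for b in N:
    for c in N:
        g = gcd(b, c)
        # g is a cone over b and c: g | b and g | c ...
        assert divides(g, b) and divides(g, c)
        # ... and the universal one: every common divisor d divides g
        assert all(divides(d, g) for d in N if divides(d, b) and divides(d, c))
```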
I also don’t understand your description of pulling g back along f in the diagrammatic proof you propose. I do not believe you are using the terminology of pullbacks in the standard way in that argument, because g and f do not have a common codomain.
@SridharRamesh You are right, I talked nonsense about the pushout. There is no such thing. We can fit $1 \to b \to bc$ and $1 \to c \to bc$ together in a commutative square, but there is nothing like the universal property a pushout would give. Indeed, in general, given a common multiple $d$ of $b$ and $c$, there will be no arrow from $bc$ to $d$ (arrow meaning “divides”). There is one only when $b \wedge c = 1$. As for the so-called pullback of $g$ along… you’re right, I went too fast.
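(This coprimality condition can be checked numerically: when $b \wedge c = 1$, $bc = \mathrm{lcm}(b,c)$ divides every common multiple $d$, and otherwise it can fail. A Python sketch:)

```python
from math import gcd

# When gcd(b, c) == 1, bc divides every common multiple d of b and c
for b in range(1, 20):
    for c in range(1, 20):
        if gcd(b, c) == 1:
            for d in range(1, 400):
                if d % b == 0 and d % c == 0:
                    assert d % (b * c) == 0

# Counterexample when b and c are not coprime: b = c = 2, d = 2
assert 2 % (2 * 2) != 0   # d = 2 is a common multiple of 2 and 2, yet 4 does not divide 2
```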
I will try to save my diagrammatic proof though, since I still believe there is something meaningful to write down.
Following J-B’s recommendations, I worked on a sequent calculus for the division lattice. Here is what I came up with: the intros and elims for divides, gcd and lcm should be about right. Please tell me what you think.
For the multiplication, written $\otimes$ in the picture, I’m not sure what the right path would be, particularly for the elim rule. The two rules $\otimes E1$ and $\otimes E2$ do not seem strong enough to allow the derivation of a result as simple as the simplification rule $a\otimes b \to a \otimes c \vdash b \to c$ (which I need in my proof of Gauss’s lemma). Would it be as bad as assuming the conclusion of what I want to prove if I turned the simplification rule itself into an elim rule for $\otimes$? I lack experience in the design of sequent calculi and need some feedback and hints.
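(Whatever the right rule is, the simplification principle itself is certainly valid in $(\mathbb{N},|)$; a quick Python check on a small range:)

```python
# Cancellation in (N, |): for a >= 1, if ab | ac then b | c
for a in range(1, 15):
    for b in range(1, 15):
        for c in range(0, 15):
            if (a * c) % (a * b) == 0:   # ab | ac
                assert c % b == 0        # then b | c
```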
Thx for your comments
I’m not very used to intro/elimination rules. When you use these, you’re not really doing sequent calculus but natural deduction. In sequent calculus you only have introduction rules, a right and a left one for every connective. And you have the cut rule, which is interpreted by composition in categorical semantics (i.e. when you interpret a proof of $\Gamma \vdash \Delta$ as a morphism $[\Gamma] \rightarrow [\Delta]$). I’m going to try to write some rules and share a picture (I’m a bit too lazy to write it in LaTeX now, or more seriously, I don’t have a lot of time right now!). I’m glad that you want to work on this. If we do something good, it could maybe become a nice little paper.
And I want to add: it’s ok if your diagrammatic proof is not valid. I have not really tried to understand it but anyway, your idea of thinking about the links between logic and elementary arithmetic and how to prove Gauss’s lemma in a categorical or logical way is really good in my opinion.
Right now, I’m a bit confused about how we want to interpret a sequent $\Gamma \vdash \Delta$. Do we want $a_1,...,a_n \vdash b_1,...,b_p$ to be interpreted as $a_1...a_n\,|\,b_1...b_p$, or as $\gcd(a_1,...,a_n)\,|\,\mathrm{lcm}(b_1,...,b_p)$, etc.? There are maybe several possibilities. Intuitively, I would go with the multiplication, because it looks simpler.
Also, I must say that even if I prefer using the cut, using elimination/intro rules is probably good too, and we could maybe write the potential system in both versions.
Some more stuff: I’m wondering whether the poset $(\mathbb{N},|)$ is a symmetric (strict) monoidal category with tensor product given by the product of natural numbers and monoidal unit the number 1; the symmetry is the identity. The lcm is a coproduct, the gcd a product, 1 is the initial object, 0 is the terminal object. So it looks like we’re in a (multiplicative additive) linear logic situation: a symmetric monoidal category with finite products and coproducts. We would have $A \otimes B = A \parr B$, but now I’m thinking about how a negation could make sense. Maybe we could interpret $\neg A$ as the inverse of $A$… In this case, it would be a logic of positive rational scalars, and divisibility would still make sense. You just have to say that $\frac{a}{b}$ divides $\frac{c}{d}$ if the prime factorization $p_1...p_n r_1^{-1}...r_p^{-1}$ of $\frac{a}{b}$ appears in the prime factorization of $\frac{c}{d}$…
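(This prime-exponent condition amounts to saying that $\frac{a}{b}$ divides $\frac{c}{d}$ iff the quotient $\frac{c}{d} / \frac{a}{b}$ is a positive integer. A hedged Python sketch for positive rationals, with `divides_q` my own name:)

```python
from fractions import Fraction

def divides_q(x: Fraction, y: Fraction) -> bool:
    """x | y in the positive rationals: every prime occurs in y/x with a
    nonnegative exponent, i.e. y/x is a positive integer."""
    q = y / x
    return q.denominator == 1

assert divides_q(Fraction(2, 3), Fraction(4, 3))      # (4/3)/(2/3) = 2
assert not divides_q(Fraction(4, 3), Fraction(2, 3))  # (2/3)/(4/3) = 1/2
assert divides_q(Fraction(1, 6), Fraction(1, 2))      # (1/2)/(1/6) = 3
```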
I agree with $\otimes$ as a multiplication. Now I think that we should interpret $a_1,...,a_n \vdash b_1,...,b_p$ as $a_1 \otimes ... \otimes a_n \vdash b_1 \parr ... \parr b_p$, where $\otimes$ and $\parr$ are both the multiplication, but the negation $A^\perp$ is interpreted as $A^{-1}$. The De Morgan law then makes sense: $A \parr B$, i.e. $AB$, is the same as $(A^\perp \otimes B^\perp)^{\perp}$, i.e. $(A^{-1}B^{-1})^{-1}$. (Sorry, sometimes I use upper-case letters and sometimes lower-case letters, but they mean the same thing. Upper-case letters just look more like logic and lower-case letters more like arithmetic, so I’m not sure which ones to use.)
Ok, now I think that we have just discovered an arithmetic interpretation of multiplicative additive linear logic. So we can start with these rules, verify that they are valid, and maybe add new ones. Contrary to what I thought, it is not differential linear logic, because in differential linear logic the product is equal to the coproduct and we can sum proofs. But you can’t sum $a | b$ and $c | d$: it doesn’t imply $(a+c)|(b+d)$…
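(A concrete counterexample to summing divisibilities, in Python:)

```python
# a | b and c | d do not give (a + c) | (b + d)
a, b, c, d = 1, 2, 1, 3
assert b % a == 0 and d % c == 0   # 1 | 2 and 1 | 3
assert (b + d) % (a + c) != 0      # but 2 does not divide 5
```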
Here are the rules of Multiplicative Linear Logic (MLL): rules without the cut rule, cut rule. (I’m extracting them from what I wrote on the nlab in the entry differential linear logic)
To obtain Multiplicative Additive Linear Logic (MALL), you must add these rules (for the products and coproducts; I extracted them from the Wikipedia entry on Sequent calculus), and also the rules for the terminal object and the initial object (I will add them later).
But the additives are not like in linear logic, argh, because the additive De Morgan rule is not verified: $(A^\perp \wedge B^\perp)^\perp \neq A \vee B$…
Gauss’s lemma should then be a proof with the following two hypotheses and the following conclusion: type of a proof of Gauss’s lemma
I realized something really cool ahah. I think our category of proofs would be compact closed, because $a \otimes b = a \parr b$. It follows that it is a traced symmetric monoidal category, and the trace, which starts with a proof (i.e. a morphism from $a \otimes x$ to $b \otimes x$) of $a \otimes x \vdash b \otimes x$ (remember that $\vdash$ and $|$ are the same thing) and gives a proof of $a \vdash b$, is the simplification by $x$ on each side! Your simplification rule is a trace ahah. But I must tell you that I interpret $\vdash$ as divisibility and I don’t use $\rightarrow$, so the proof of simplification would be of this type for me.
Now, I’m trying to see which inference rules we need beyond those of MALL to write a proof of Gauss’s lemma mmh. I’m trying to reproduce the proof from Wikipedia and it’s very challenging ahah. We’re going to need quite a few inference rules ahah. (It should not be too much of a surprise, because arithmetic is more complicated than, say, classical or intuitionistic propositional logic.)
I’m sorry but I should not spend too much time on this. You can send me a mail at jeanbaptiste [dot] vienney [at] gmail [dot] com. We could speak more about this from May; right now I must work on my courses at the beginning of my PhD. But I think all this looks fascinating, and we will work on it and write a paper. Something with a title like “A sequent calculus for elementary arithmetic”.
I will need a little while to process all the data you’ve just provided… But it looks promising! I’m in for a paper if we succeed in building the thing. See you
Btw 1: do you know of a way to exchange more freely and share bits of LaTeX diagrams? I was thinking we might start a) a Discord server? b) a GitHub repository? Btw 2: would it be nice to formalize what we are doing using a proof assistant?
And the data I’m providing is not even very clear, I’m sorry… If you want, I can help you learn the needed background and let you try to build the thing by yourself, by telling you: read that, verify that, try that, etc.
1: Yes, sure. We can make a project on Overleaf. It’s perfect to write a LaTeX document with several people. We can talk on the Zulip Category Theory Server. It’s perfect to chat, including LaTeX in the messages. I just need your email address to invite you. By the way, we could speak French on Zulip :)
2: I’ve never used a proof assistant… And formalizing facts about a sequent calculus (like cut elimination) looks challenging… It is not as if you were working in a very classical setting, like, say, topological spaces or finite groups. So I think it would be a bit of a nightmare. But if you wanted to do it, you could probably take inspiration from the following paper: Formalized meta-theory of sequent calculi for linear logics.
Jean-Baptiste, I am very interested in your idea of a sequent calculus for elementary arithmetic! However, I think it is a bit unnatural to consider multiplicative linear logic as a base when you give the same semantics to $\otimes$ and $\parr$.
Perhaps $a_1,\dots,a_n\vdash b_1,\dots,b_p$ should be interpreted as $\gcd(a_1,\dots,a_n)\mid\mathrm{lcm}(b_1,\dots,b_p)$. Note that $(\mathbf N, \gcd, 0)$ is a symmetric monoidal category (the unit for gcd is 0, since $\gcd(a,0)=a$).
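(A quick Python check that gcd is associative and commutative, and that its unit is 0 rather than 1, since $\gcd(a,0)=a$ while $\gcd(a,1)=1$:)

```python
from math import gcd

for a in range(0, 30):
    assert gcd(a, 0) == a            # 0 is the unit for gcd
    for b in range(0, 30):
        assert gcd(a, b) == gcd(b, a)
        for c in range(0, 30):
            assert gcd(gcd(a, b), c) == gcd(a, gcd(b, c))
```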
I would be happy to collaborate with you all on this topic if you are interested (I am French too ☺).
Ok, you can collaborate with us :) . I just need your email address and I will invite you on Zulip and Overleaf.
I’m not sure it would be exactly a model of multiplicative linear logic. But I think that the poset $(\mathbb{N},|)$ is: a symmetric monoidal category with tensor the multiplication and unit 1, with finite products (the gcd, with terminal object 0) and finite coproducts (the lcm, with initial object 1).
So it is almost a model of MALL, modulo that we don’t have a negation.
Yeah, $(\mathbb{N},\gcd,0)$ is a symmetric monoidal category, but only because gcd is a categorical product and 0 a terminal object, so it is a cartesian monoidal category.
Awesome, victor.sannier [at] caramail.fr
Done, I invited you both on Zulip and Overleaf.
smeden, you can send me an email or write your email address here whenever you see this and I’ll invite you as well.
My mail: stefmeden[at]gmail.com. Btw, do you have a good reference for a refresher on linear logic? (I’m reading Di Cosmo’s Linear Logic Primer for the moment.) It’s really funny. On one hand I’m telling myself: “This Gauss lemma is so trivial, you can express it in set theory…”, but then arithmetic opens the box: Gauss’s lemma is obtained from Bézout’s theorem ($\exists (u,v),\ au+bv = 1 \Rightarrow a \wedge b = 1$). So addition comes into play. How? Maybe just by saying that $\neg (b | a)$ if the remainder of the euclidean division of $a$ by $b$ is not 0: you have to add something to a multiple of $b$ to reach $a$… So I guess we have things to dig into, yes.
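(For what it’s worth, the Bézout coefficients can be computed by the extended Euclidean algorithm; a Python sketch, with `ext_gcd` my own name:)

```python
def ext_gcd(a: int, b: int):
    """Extended Euclid: returns (g, u, v) with a*u + b*v == g == gcd(a, b)."""
    if b == 0:
        return a, 1, 0
    g, u, v = ext_gcd(b, a % b)
    return g, v, u - (a // b) * v

g, u, v = ext_gcd(12, 35)
assert g == 1 and 12 * u + 35 * v == 1   # Bézout identity for a coprime pair
```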
I would recommend “Categorical semantics of linear logic” by Paul-André Melliès. I have no idea how to understand addition in this context… I have a similar feeling to yours. It’s surprising how quickly you’re in unknown waters when you try to completely forget about set theory and work only with logic/category theory/order theory/algebra. I think completely eliminating set theory from any subject works wonders. In my mind, translating a subject into sequent calculus is one way of achieving this. The issue is that mathematicians have been saying “Abstract nonsense!” for a long time whenever you were not trying to prove something that looks very difficult. Today, computer science plays a big role in putting us on the right path, I think.