In light of Mike and Dan Licata’s paper perhaps we should have a page on ’adjoint logic’
I wonder if there’s a way of getting at temporal logic without the need for a directed type. I mean, something temporal seems to happen when you have two adjunctions between four operators, which are dual pairs in some sense. So if $F$ and $G$ are ’dual’ in the sense that $G\varphi = \neg F \neg \varphi$, and likewise $P$ and $H$ with $H\varphi = \neg P \neg \varphi$. And if also $P \dashv G$ and $F \dashv H$, then they can be interpreted as:
$F\varphi$: “$\varphi$ will be true at some Future time”,
$G\varphi$: “$\varphi$ is always Going to be true”,
$P\varphi$: “$\varphi$ was true at some Past time” and
$H\varphi$: “$\varphi$ Has always been true”.
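(For later reference, here is the same thing in standard Kripke-style notation over a time order $(T, \lt)$; the satisfaction clauses are my gloss rather than anything fixed above:
$$ t \vDash F\varphi \iff \exists t' \gt t.\; t' \vDash \varphi, \qquad t \vDash P\varphi \iff \exists t' \lt t.\; t' \vDash \varphi, $$
$$ t \vDash G\varphi \iff \forall t' \gt t.\; t' \vDash \varphi, \qquad t \vDash H\varphi \iff \forall t' \lt t.\; t' \vDash \varphi, $$
with $G = \neg F \neg$ and $H = \neg P \neg$, and where $P \dashv G$ means that $P\psi \to \varphi$ is valid iff $\psi \to G\varphi$ is, and dually $F \dashv H$ means that $F\psi \to \varphi$ is valid iff $\psi \to H\varphi$ is.)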
It is claimed here that
the author showed that an adjoint pair of modal operators gives rise to a temporal logic.
Of course, I’d be wanting to think about general types not just propositions, and then avoiding reliance on $\neg$, just as our version of possibility and necessity doesn’t rely on negation.
“Future Semiconditionally Modified Subinverted Plagal Past Subjunctive Intentional”
Sorry…
If we have an internal category $C$, with $s, t \colon C_1 \to C_0$ its source and target maps, then on the slice category over $C_0$ there are two induced monad/comonad adjoint pairs, $\sum_s t^* \dashv \prod_t s^*$ and $\sum_t s^* \dashv \prod_s t^*$, whose algebras/coalgebras are contravariant and covariant internal diagrams on $C$, respectively.
I think that’s an excellent point regarding modalities, the combination of David’s question and Mike’s reply.
For bystanders, if any, I’ll just make explicit what Mike left implicit and mention that to make this come out as intuitively expected, we’d take the internal category “$C$” to be an internal poset. Its objects would then be interpreted as the instants of time, its morphisms as the possible translations to the future, and an “internal diagram” over this would hence be thought of as an assignment of a type of states to each instant and a map from past states to future states for each time translation from one instant to the next.
Somebody should turn this into an nLab entry…
So if the two arrows $s, t \colon C_1 \to C_0$ mark the beginning, $s$, and end, $t$, of an interval, we base change along $t$ and then take dependent product and sum along $s$, and vice versa?
Let’s see, because
That by itself is the “primary integral transform”, if you wish, through the graph underlying the category $C$. Then the composition operation in the category gives the (co-)monad structure.
Right, and $\sum_s t^* \varphi$ means “there is some interval beginning now and such that $\varphi$ is true at its end”, i.e. $F\varphi$; while $\prod_t s^* \varphi$ means “for all intervals ending now, $\varphi$ is true at their beginning”, i.e. $H\varphi$.
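(Spelling this out in one display, with $s, t \colon C_1 \to C_0$ the beginning and end maps as above; the packaging is mine:
$$ F\varphi = \sum_s t^*\varphi, \qquad G\varphi = \prod_s t^*\varphi, \qquad P\varphi = \sum_t s^*\varphi, \qquad H\varphi = \prod_t s^*\varphi, $$
and since $\sum_s \dashv s^* \dashv \prod_s$ and $\sum_t \dashv t^* \dashv \prod_t$, the composites give
$$ F = \sum_s t^* \;\dashv\; \prod_t s^* = H, \qquad P = \sum_t s^* \;\dashv\; \prod_s t^* = G, $$
recovering the adjunctions $F \dashv H$ and $P \dashv G$ from the start of the thread.)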
I’ll add some of this to an entry for ’temporal logic’. Just to be clear, what needs to be said about $C$? It’s not only an internal category, but an internal poset?
In more abstract terms, whenever we have two adjoint triples (aka adjoints between adjoints) between the same two categories, say $L_1 \dashv M_1 \dashv R_1$ and $L_2 \dashv M_2 \dashv R_2$, we’ll have $L_1 M_2 \dashv R_2 M_1$ and $L_2 M_1 \dashv R_1 M_2$. So that’s already generating a modal logic of sorts? Then adding detail to the two triples brings me closer to temporal logic? Also to consider are conditions on time, such as branching futures, dense orderings, unboundedness, and so on.
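(One way to see this, just composing the two hom-isomorphisms; a sketch, with $M_1, M_2 \colon \mathcal{A} \to \mathcal{B}$ and $L_i, R_i \colon \mathcal{B} \to \mathcal{A}$:
$$ \mathcal{A}(L_1 M_2 X, Y) \;\simeq\; \mathcal{B}(M_2 X, M_1 Y) \;\simeq\; \mathcal{A}(X, R_2 M_1 Y), $$
so $L_1 M_2 \dashv R_2 M_1$, and symmetrically $L_2 M_1 \dashv R_1 M_2$. In the temporal case these are exactly $F \dashv H$ and $P \dashv G$.)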
Naturally, I’d want to push the full type picture, i.e., see it working not just for propositions. There’s something importantly radical, it seems to me, in the ’propositions as some types’ story. Philosophy almost always focuses on propositions. As you may have noticed, I like to take every opportunity to think about what happens in the general situation.
I added this section to temporal logic. I guess we needn’t have our (better) treatments placed down the page, after what others have written, such as at necessity and possibility.
Aside from queries in #8, I didn’t know what to say about the point Urs makes in #6.
It’s not only an internal category, but an internal poset?
I guess. Parallel unequal arrows would mean there would be more than one “time interval” with the same endpoints — not exactly the way we usually think about time. (-:
In more abstract terms, whenever we have two adjoint triples (aka adjoints between adjoints) between the same two categories, say $L_1 \dashv M_1 \dashv R_1$ and $L_2 \dashv M_2 \dashv R_2$, we’ll have $L_1 M_2 \dashv R_2 M_1$ and $L_2 M_1 \dashv R_1 M_2$.
Interesting! Yes, that’s true. Of course in that situation we also have the more basic adjoint modalities $L_1 M_1 \dashv R_1 M_1$ and $L_2 M_2 \dashv R_2 M_2$. The meaning of those temporally is less clear – maybe they degenerate on mere propositions?
And there’s also the identity-assigning map $e \colon C_0 \to C_1$, generating another adjoint triple $\sum_e \dashv e^* \dashv \prod_e$ in the other direction…
Interesting you mention $e$, as a chain of thought had led me to Urs’s monads/comonads and constant paths. Let’s see how it went.
I was thinking it’s odd that despite ’Always’ and ’Necessarily’ being so similar, they’re seemingly produced rather differently. The former by base change and then dependent product in the slice over $C_0$; the latter by dependent product then base change for $W \to \ast$ (or its relative version, in what we called the local case).
But then of course we can make the latter more like the former by thinking of the two projections $p_1, p_2 \colon W \times W \to W$, the other way of rendering an equivalence relation.
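(To check that the two renderings agree, for a predicate $\varphi$ on the type $W$ of worlds, with $p_1, p_2 \colon W \times W \to W$ the projections and $w \colon W \to \ast$ the terminal map; this is only me unwinding the definitions pointwise:
$$ \big(\prod_{p_1} p_2^*\varphi\big)(u) = \forall v.\, \varphi(v) = \big(w^* \prod_w \varphi\big)(u), \qquad \big(\sum_{p_1} p_2^*\varphi\big)(u) = \exists v.\, \varphi(v) = \big(w^* \sum_w \varphi\big)(u), $$
so necessity and possibility can equally be written as pull-push along the two projections of $W \times W \rightrightarrows W$, the chaotic equivalence relation on $W$.)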
So then, given the parallel between possibility/necessity and the jet comonad structure, shouldn’t we have a similar rewriting of the latter? I.e., instead of $i^* \prod_i$ for $i \colon X \to \Im X$, why not $\prod_{p_1} p_2^*$ for the two projections out of the infinitesimal neighborhood of the diagonal of $X \times X$?
Returning to time, given $C_1 \rightrightarrows C_0$ we can give it extra features, such as being a poset; isn’t there an advantage to doing things this way? I mean, presumably there’s no analogue of the map $W \to \ast$ or $X \to \Im X$ that will do.
So $\Delta \colon W \to W \times W$ gives rise to the triple $\sum_\Delta \dashv \Delta^* \dashv \prod_\Delta$ (for predicates): $\sum_\Delta \varphi$ is “$\varphi$ on the diagonal, false otherwise”, $\prod_\Delta \varphi$ is “$\varphi$ on the diagonal, true otherwise”.
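(Unwinding the definitions, for a predicate $\varphi$ on $W$; this is just me spelling it out:
$$ \big(\sum_\Delta \varphi\big)(x,y) = (x = y) \wedge \varphi(x), \qquad \big(\prod_\Delta \varphi\big)(x,y) = (x = y) \to \varphi(x), $$
while $\Delta^*$ in the middle restricts a binary predicate to the diagonal.)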
Perhaps that doesn’t generate any interesting new monads/comonads as it would involve composing two dependent sums, two base changes or two dependent products.
According to the SEP: temporal logic article, two typical approaches are the ’instant based’ and the ’interval based’ models of time.
Balbiani, P., Goranko, V. and Sciavicco, G., 2011, “Two-sorted Point-Interval Temporal Logics”, in Proc. of the 7th International Workshop on Methods for Modalities (M4M7) (Electronic Notes in Theoretical Computer Science, Volume 278), pp. 31–45 (pdf)
is given as an example of treating both at once.
With time linearly ordered, it seems there are 13 different relations possible between intervals; these can be used to generate different modalities.
It seems, according to the SEP, also worth expressing ’since’ and ’until’:
$\varphi\, S\, \psi$: “$\varphi$ has been true since a time when $\psi$ was true”.
$\varphi\, U\, \psi$: “$\varphi$ will be true until a time when $\psi$ is true”.
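(For comparison, the usual Kripke-style clauses over a time order $(T, \lt)$ as before; conventions about whether the endpoints are included vary between authors:
$$ t \vDash \varphi\, S\, \psi \;\iff\; \exists t' \lt t.\; \big( t' \vDash \psi \;\wedge\; \forall t''.\; (t' \lt t'' \lt t \to t'' \vDash \varphi) \big), $$
$$ t \vDash \varphi\, U\, \psi \;\iff\; \exists t' \gt t.\; \big( t' \vDash \psi \;\wedge\; \forall t''.\; (t \lt t'' \lt t' \to t'' \vDash \varphi) \big). $$
)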
Do we have the resources? For ’since’, we pick out intervals ending now where $\psi$ held at the beginning. So we need that, for one of these, all subintervals with the same beginning have $\varphi$ hold at the end (or all subintervals with the same end have $\varphi$ hold at the beginning).
So we need that, for one of these, all subintervals with the same beginning have $\varphi$ hold at the end
All subintervals with the same beginning, or just all subintervals with end in the past? I.e. does “since” make a claim about the future too?
I.e. does “since” make a claim about the future too?
There may well be different ’since’s if we look closely. I guess in the case
I’ve been unhappy since you left me
expressed at the present time, this means that there’s an earlier time, or rather an interval from then to now, where you left me at that earlier time and I have been unhappy throughout the interval. This last condition might be translated as: all subintervals have ends where I’m unhappy.
I don’t see that I’m committing myself to any claim about my future happiness, though it would be rather odd if on uttering the sentence I suddenly became happy. But who knows the powerful effect of articulating one’s emotions.
Or is your thought that it just needs one interval throughout which I’m unhappy, beginning in the past when I’m left and extending into the future? But then this has a subinterval curtailed on the right at now.
Ah, I think this is where we have to use the fact that $C$ is a category, so that in addition to the two projections we have the composition map $c \colon C_1 \times_{C_0} C_1 \to C_1$. Now $\varphi\, S\, \psi$ can be
given by a pull-push through the object of composable pairs (note that the two evident “midpoint” maps $t \circ \mathrm{pr}_1$ and $s \circ \mathrm{pr}_2$ to $C_0$ agree). That is, there is a subinterval ending now such that $\psi$ was true at its beginning and $\varphi$ was true at all points inside it.
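(A guess at writing that down explicitly, with my notation $m = t \circ \mathrm{pr}_1 = s \circ \mathrm{pr}_2 \colon C_1 \times_{C_0} C_1 \to C_0$ for the midpoint map just mentioned; one candidate is
$$ \varphi\, S\, \psi \;=\; \sum_t \big( s^*\psi \,\wedge\, \prod_c m^*\varphi \big), $$
i.e. there is an interval ending now whose beginning satisfied $\psi$, and for every way of chopping it into two composable pieces, $\varphi$ holds at the chopping point. The dual ’until’ would then be $\varphi\, U\, \psi = \sum_s \big( t^*\psi \wedge \prod_c m^*\varphi \big)$.)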
We could have this temporal logic business cracked in a couple of weeks!
Here is a quick stub entry: adjoint logic.
I added the contents of #18, and its dual ’until’.
But perhaps we also use ’until’ in the sense where the condition may not happen, e.g., ’You are grounded until you say sorry’.
On another theme, I wonder how far we can go in specifying the features of time that logicians have considered, such as density, continuity, and (non)branching. Density, expressed as the condition that every interval decomposes into two subintervals, puts me in mind of Freyd’s construction of the real interval as a terminal coalgebra.
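(For what it’s worth, in the instant-based setting density is standardly captured by the axiom scheme
$$ F\varphi \to F F\varphi \qquad (\text{equivalently } P\varphi \to P P\varphi), $$
whose frame condition is $\forall x, y.\, \big(x \lt y \to \exists z.\, x \lt z \lt y\big)$; how best to say “every interval chops in two” directly in the interval-style setup here is then the question.)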
(There’s probably some work to do in tidying up whether intervals can be points, whether the future includes the present, whether $\varphi\, S\, \psi$ holds now if $\psi$ holds now, etc.)
In the current setup, with $C$ a category (so in particular having identity maps, i.e. reflexivity in the case of a poset) I think it follows that intervals can be points, the future does include the present, and $\varphi\, S\, \psi$ holds now if $\varphi$ and $\psi$ both hold now (in general, as defined above it requires $\varphi$ to still hold at the instant when $\psi$ becomes true). But if we want to change that, we could let $C_1$ be the $\lt$-intervals instead of the $\leq$-ones.
It probably does make sense to let $C_1$ be the $\lt$-intervals. Otherwise $\varphi$ could be true now and at no other instant, but we’d have $F\varphi$, which is supposed to be “$\varphi$ will be true at some Future time”.
But then there are no identity morphisms in the category, right?
Not sure this is helpful, but the adverb ’now’ seems like it should be involved with identity morphisms.
I was suddenly reminded of the issue alluded to at FQFT
In basic quantum mechanics one also demands that $U(0) = \mathrm{id}$.
(While this looks like the most innocent condition, this has technical subtleties for genuine quantum field theory, to deal with which however there exist established tools.)
Where’s that spelled out? Something about cobordisms and cylinders of different lengths, wasn’t it? So an identity cobordism should lead to an identity linear operator between states, yet we need time evolution depending on length. (Or something like that.)
This is the issue dealt with here:
Segal’s definition of a conformal field theory discusses “functors” for which the domain is not a category. There is an associative notion of composition for arrows, but there are no identities.
Does the adverb “now” actually mean anything in temporal logic? I mean, does “$\varphi$ holds now” mean anything different from just “$\varphi$”?
So if we use the $\lt$-intervals, then yes, we have only a semicategory, which in particular implies that our “modalities” are not monads/comonads, since they lack units/counits. Which is maybe not a huge problem, but it means we can’t axiomatize them in type theory the way we have for all other modalities and comodalities — although we couldn’t do that anyway if we’re not propositionally truncating, since then they wouldn’t be idempotent. I guess if we encode them by breaking them down into $\sum$’s and $\prod$’s and $(-)^*$’s using adjoint logic, that wouldn’t be a problem.
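(To see exactly where identities enter, a quick pointwise check: the counit of the would-be comonad $G = \prod_s t^*$ would have to be a map
$$ \big(\prod_s t^* \varphi\big)(x) \;=\; \big(\forall i.\; s(i) = x \to \varphi(t(i))\big) \;\longrightarrow\; \varphi(x), $$
given by instantiating at the identity interval $i = e(x)$; with only $\lt$-intervals there is no $e(x)$, so no counit $G\varphi \to \varphi$, and likewise no unit $\varphi \to F\varphi$. The multiplications and comultiplications only need composition, which the semicategory still has.)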
Does one generally have, for instance, $G\varphi \to \varphi$ in temporal logic?
Does one generally have, for instance, $G\varphi \to \varphi$ in temporal logic?
Not as far as I’m aware. However, looking around some more, we find Robert Goldblatt in Logics of Time and Computation:
The most common practice in temporal logic is to regard time as an irreflexive ordering, so that “henceforth”, meaning “at all future times”, does not refer to the present moment. On the other hand, the Greek philosopher Diodorus proposed that the necessary be identified with that which is now and will always be the case. This suggests a temporal interpretation of $\Box$ that is naturally formalised by using reflexive orderings. The same interpretation is adopted in the logic of concurrent programs to be discussed in §9. (p. 44)
I suppose we could always write $\varphi \wedge G\varphi$ if that’s what we mean. Although that might not be compatible with a constructive viewpoint on the temporal ordering. One could I guess also go crazy and have both the reflexive and irreflexive orderings hanging around, inducing two different families of modalities.
I don’t know if “now” is used in temporal logic, but I envisaged it to denote something that is true at a particular time, and not elsewhen. Something like “It is 12 o’clock now”. Doesn’t “$\varphi$”, with no context, mean something that is “globally” true?
Re #30, that route has been taken. See here in a handbook on temporal logic in AI. It’s the difference between the box and the box with plus inside.
Re #31, this is all in the context of the slice over the object of instants. You can base change a timeless proposition to one which is time-dependent but constant, e.g., $(1+1 = 2)(t)$ is always true. Timed propositions can be stated to hold now and at no other time by something like $\varphi \wedge \neg P\varphi \wedge \neg F\varphi$.
Re #31, it’s just like how in modal logic, “$\varphi$” unadorned means that $\varphi$ is true at the current world (or, more precisely, the “truth value” of unadorned $\varphi$ is the set of worlds in which it’s true), not that it’s true in all worlds.
And even in natural language, if I say just “it is 12 o’clock”, what I mean is that it’s 12 o’clock now, not that it’s always 12 o’clock.
@David, your link takes me to a page unavailable for viewing or I have reached my viewing limit for this book.
Well, it’s just saying there are box modalities
Got it, thanks. Sounds like quite a menagerie!
OK, thanks for clarifying :-)
Added pointer to the nice slides that now go with Licata-Shulman.
In view of the Lawverian and so Hegelian roots of adjoint logic, and of our adjoint representation of time above, it’s perhaps interesting to read what R. G. Collingwood, much influenced by Hegel, has to say about time.
In Some Perplexities about Time: With an Attempted Solution, Proceedings of the Aristotelian Society, New Series, Vol. 26 (1925–1926), pp. 135–150, he is trying to find another way to think of time other than
“spatially,” as something existing totum simul
or as
a ceaseless flow in which the present, perpetually changing, sloughs off its states into the abyss of the past and acquires new states from the abyss of the future.
His solution is as follows:
The ideal is that which is thought, but not thought as real or existing; and in this class fall the future, which is possible but not necessary, and the past, which is necessary but not possible. The real is the present, conceived not as a mathematical point between the present and the past, but as the union of present and past in a duration or permanence that is at the same time change: the possible parting with its unnecessariness and the necessary parting with its impossibility in an actuality which is at once possible and necessary, not (like the abstract mathematical present) neither. Within this present there are, as really as you like, two elements (necessity and possibility), each of which taken singly or in isolation characterizes a being which is not real but ideal—the past and future respectively. Thus the past as past and the future as future do not exist at all, but are purely ideal; the past as living in the present and the future as germinating in the present are wholly real and indeed are just the present itself. It is because of the presence of these two elements in the present (not merely psychologically or illusorily, as in the doctrine of the specious present) that the present is a concrete and changing reality and not an empty mathematical point.
That which is ideal is for a mind, and has no other being except to be an object of mind. But the ideal and the real are not mutually exclusive. A thing may be ideal and also real. An example of this would be a duty, which is absolutely real in spite of the fact that it only exists for mind. But some things are merely ideal, and under this head fall the past and the future; unlike a duty, which exists only for thought but, for thought, really does exist, they have being for thought, but, even for thought, have no existence. Hence, if there were no mind, there would at any given moment be no past and no future; there would only be a present in which the past survived transformed and in which the future was present in germ. The past as past and the future as future, in contradistinction from their fusion in the present, have being for mind and only so. We do call the past, as such, into being by recollecting and by thinking historically; but we do this by disentangling it out of the present in which it actually exists, transformed, and re-transforming it in thought into what it was. Hence time, as succession of past, present and future, really has its being totum simul for the thought of a spectator, and this justifies its “spatialized” presentation as a line of which we can see the whole at once; it also justifies, so far as they go, subjectivist views of time like that of Kant. But time, as the ceaseless change of the present, is “transcendentally real,” and the logical presupposition of any thought whatever; and this justifies the “pure flux” view of time and its treatment in philosophies like that of Bergson and Mr. Alexander.
But this conception, though the only one I can discover which gives any hope of escape from my perplexities about time, is only open to a logic which conceives the real as a synthesis of opposites and a metaphysic which has abandoned the hopeless attempt to think of all objects of thought as existent. If we must regard the real as a collocation of elements each of which is real by itself and in its own right, we must give up the solution which I have attempted to sketch and find another, if we can.
Plenty of resonances there, such as adjunctions as ’synthesis of opposites’ and maybe a whiff of the kinematics/dynamics distinction of higher category theory and physics.
By the way, Bergson was supposedly instrumental in Einstein not receiving a second Nobel prize for relativity theory, the spatialization of time being evidently wrong.
So Bergson views time as a topological point (maybe infinitesimally thickened)?
I began a section on ’computation tree logic’ at temporal logic. I’ll add some more detail later. It seems a good candidate for adjoint logic treatment.
There is a Curry-Howard correspondence between LTL and reactive programming. This is similar to the temporal semantics in the topos of trees (presheaves over ω). http://www.itu.dk/~mogel/papers/lics2011.pdf
I can try to add more references later.
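In the meantime, to fix intuitions, here is a toy Haskell sketch of the discrete-time semantics of a few of these operators (my own finite-horizon caricature, not code from the linked paper; the names TProp, alwaysUpTo, sinceBack and the example predicates are all made up here):

```haskell
-- Toy, finite-horizon semantics for a few temporal operators over discrete time.
-- (A caricature for intuition only; all names here are made up for this sketch.)

type Time  = Integer
type TProp = Time -> Bool          -- a time-dependent proposition

-- "always going to be" / "will be at some future time", reflexive convention,
-- checked only up to a finite horizon since we can't quantify over all instants.
alwaysUpTo, eventuallyUpTo :: Time -> TProp -> TProp
alwaysUpTo     horizon p t = all p [t .. t + horizon]
eventuallyUpTo horizon p t = any p [t .. t + horizon]

-- "p has held since a (strictly past) time when q held", with finite look-back.
sinceBack :: Time -> TProp -> TProp -> TProp
sinceBack back p q t =
  or [ q t' && all p [t' + 1 .. t - 1] | t' <- [t - back .. t - 1] ]

-- Example: "the light has been off since the switch was pressed".
pressed, lightOff :: TProp
pressed  t = t == 3
lightOff t = t >= 3

main :: IO ()
main = print (sinceBack 10 lightOff pressed 7)   -- True
```

Of course this only checks truth values over a finite window; the Curry-Howard reading instead takes evidence-carrying types indexed by time, as in the papers linked above.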
Interesting! So we’re in this neck of the woods described by Neel Krishnaswami:
Nick Benton and I have argued for programming explicitly with streams in our paper Ultrametric Semantics of Reactive Programs. Subsequently, Alan Jeffrey suggested using LTL as a type system in his paper LTL types FRP, an observation that Wolfgang Jeltsch also made in his paper Towards a Common Categorical Semantics for Linear-Time Temporal Logic and Functional Reactive Programming.
The difference between the view Nick and I take, and the one that Alan and Wolfgang take is best understood (IMO) by comparing the construction given in Birkedal et al.’s First steps in synthetic guarded domain theory: step-indexing in the topos of trees with Alan’s paper. The topos of trees (presheaves over the natural numbers ordered by size) is very similar to category of ultrametric spaces Nick and I used, but much easier to compare with Alan’s category (presheaves over a discrete category of time), since these are both presheaf categories.
If you’re interested in futures specifically for concurrency, then it might be a better idea to look at CTL rather than LTL, though. AFAIK, that’s presently unexplored territory!
Has anyone done a Curry-Howard correspondence for CTL or even CTL*?
What we were doing above (in section 5 of temporal logic) seems to point in a very similar direction to Jeffrey’s LTL types FRP.
Reactive types there are our $C_0$-dependent types. Perhaps we were doing better. Jeffrey writes:
The implementation described here is based on interval types, rather than reactive types. This might be a better match to an interval temporal logic rather than LTL, which would admit the “chop” modality
That ’chopping’ of an interval is what Mike is addressing in #18.
Here’s a recent paper.
Patrick Schultz, David I. Spivak, Temporal Type Theory: A topos-theoretic approach to systems and behavior, https://arxiv.org/abs/1710.10258
It is an application of Sh(Int) to hybrid systems. Int is the interval domain.
They relate it to discrete Conduché fibrations; Martin-Löf has also been lecturing on such interval type theories. I am not sure whether there is a public record of the latter. In any case, I suspect Jeffrey’s work to be related to such discrete Conduché fibrations.
With the development from unary (the adjoint logic of #1) to simple, and on to the hotly anticipated dependent, type 2-theories, what’s a good way to organise this material for nLab purposes? Is there any future for the term ’adjoint logic’ (something of a pleonasm if we follow Lawvere: “we may more precisely identify logic with this scheme of interlocking adjoints”)?
So the 2-category of modes for the temporal logic we developed above in this discussion can be formulated as a superposed pair of walking adjunctions, generating the cross-pair monads of #8.
If one were to look to depict a type 2-theory diagrammatically, I guess you’d have a sheet for each mode, on which the types associated with the mode can be written. In the unary case, we could have the adjoint functors transferring types between sheets.
One of these days, we’ll find that Peirce was ahead of his time again, with his tinctured gamma graphs: “the tinctures enable Peirce to distinguish between different modes of modalities” (Peirce’s Search for a Graphical Modal Logic, p. 159)
Yeah, I don’t really like the term “adjoint logic” either.
Re my Peirce comment in #47, slides like 191 of Dan Licata’s HiM talk are suggestive.