A friend of mine just posted the following quote to his Facebook page, and it raised the question in my mind of whether there is a categorical way to describe entropy. It seems like there could be, and it might offer some improvement on existing descriptions, which can be rather disparate. Does anyone know if such a thing exists?
The quote: “A mathematician is a device for turning coffee into theorems” - Alfréd Rényi
It should be possible to define the entropy of a category (in a way similar to cardinality). So then, the trick, as always, is to start with an interesting category. Since entropy is, in a way, a counting procedure, it is probably related to decategorification somehow.
Considering the links with partition functions/generating functions, there may be some sort of way to approach it via species (I know that is just a pointer to elsewhere, but perhaps someone will fill it in). This is just a random guess, so it may not be appropriate, but learning about species surely does one’s category-theoretic muscles good :-)
@Eric: That’s a huge stretch. The cardinality of a category is a generalization of the cardinality of a set. Sets in general don’t have a notion of entropy, so I’m not sure how your reasoning applies.
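(For reference, since “cardinality of a category” came up: what I have in mind is Leinster’s Euler characteristic of a finite category, which reduces to ordinary cardinality on discrete categories. A minimal sketch of the definition, from memory, so check Leinster’s paper for the precise conditions: a weighting on a finite category $A$ is a function $k : \operatorname{ob}(A) \to \mathbb{Q}$ such that

$$\sum_{b} |A(a,b)|\, k_b = 1 \quad \text{for all objects } a, \qquad\qquad \chi(A) = \sum_{a} k_a,$$

defined whenever a weighting and a coweighting both exist. On a discrete category every $k_a = 1$, so $\chi$ is just the number of objects, i.e. the cardinality of the underlying set.)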
Did you ever hear of brainstorming?
Ian, this is probably a good place to start:
I understand that you were brainstorming, but I was explaining why it didn’t make sense to me. It’s not like I said “wow, you’re a big idiot!”
Sets in general don’t have a notion of entropy
They do if they have some kind of structure to them. For example, the sets that underlie permutation groups can be assigned an entropy, since, in one of its most general forms (as envisaged by Shannon), entropy is just another way to count the number of configurations that something can have. (There’s a widely believed myth that it’s all based on probabilities, but even Shannon admitted it didn’t have to be.)
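To make the counting picture concrete, here’s a quick Python sketch (my own illustration, nothing more): for a uniform distribution over $W$ configurations, the Shannon formula collapses to $\log_2 W$, i.e. to pure counting.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum_i p_i * log2(p_i), with 0 log 0 := 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A system with W = 8 equally likely configurations:
W = 8
print(shannon_entropy([1 / W] * W))  # 3.0 bits
print(math.log2(W))                  # 3.0 -- same thing: entropy as counting

# A biased distribution carries less entropy than the raw count suggests:
print(shannon_entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits
```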
@Eric: Cool! Thanks for the references.
Edit: Just glanced at the first and I think this runs along the same lines I was thinking (i.e. in keeping with Shannon’s very general notion).
Lawvere’s State Categories, Closed Categories, and the Existence of Semi-Continuous Entropy Functions is worth consulting.
Thanks David!
Hi expixpi,
it seems that you want to communicate some original thoughts or ongoing research. So I suppose it would help if you could point to a document where the ideas you are sketching are laid out. Then people could have a look and react, if reaction is what you are after.
For an answer to the original question, “Is there a categorical way to describe entropy”, try:
It’s a description of Shannon entropy as the unique functor with certain properties.
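Roughly (this is from memory, so see the paper for the precise hypotheses): assign to every measure-preserving map $f : (X, p) \to (Y, q)$ of finite probability spaces a number $F(f) \geq 0$, thought of as the information lost along $f$, such that

$$F(g \circ f) = F(g) + F(f), \qquad F\big(\lambda f \oplus (1-\lambda)\, g\big) = \lambda\, F(f) + (1-\lambda)\, F(g),$$

and $F$ depends continuously on $f$. Then $F(f) = c\,\big(H(p) - H(q)\big)$ for some constant $c \geq 0$, where $H$ is Shannon entropy.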
This thread caught my attention because Tobias Fritz and I are busy finishing up a similar (but, it turned out, harder to prove) characterization of relative entropy. Right now this is a draft, but in a few weeks I hope it’ll be done!
Now, back to what I was going to do…
By the way, the original question at the top of this thread is from May 2010, almost four years ago, and the OP is not around here anymore. If you do want to inform him of your articles, then you may have to contact him by email.
John, re #14, I keep meaning to get back to my Bayesian roots. Your morphisms in FinStat seem to be equivalent to Cencov’s Markov morphisms, see e.g. Lebanon’s paper, section 4.
Hmm, looking back I guess there are some differences. Markov embeddings, I think, are just your $f$, but between the sets $X$ and $Y$, without mention of the distributions $p$ and $q$. On each set there sits the space of distributions, and the only metric on this space preserved by all Markov embeddings is the Fisher metric. There’s a whole family of distances/divergences compatible with that metric, so you must be invoking some stronger constraint, Zhu 1997 (pdf).
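For concreteness, the metric Cencov’s theorem singles out (unique up to a constant factor among Riemannian metrics on the simplex of distributions invariant under all Markov embeddings, if I’m stating it correctly) can be written, on the open simplex over a finite set, as

$$g_p(u, v) = \sum_{x} \frac{u(x)\, v(x)}{p(x)},$$

for tangent vectors $u, v$, i.e. functions with $\sum_x u(x) = 0$.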
TonyJones,
thanks for the links. I just looked at them briefly. In both cases it seems to me that they use the term “entropy” mainly because they need some word for it.
Regarding “entropy” and “category”: I am never quite sure what people are after when they wish to combine these two words. The original question in this thread could just as well have been “Is there a categorical way to describe chocolate?”, and I would equally be hoping for the question to be made more specific. In the end, whenever you come up with a decent definition of anything, category theory will likely help you think about it. That’s the whole point of category theory.
Regarding abstract characterizations of entropy: out of the plethora of articles that discuss this, two are now listed at entropy – References – Axiomatic characterizations.
The reference
there gives a characterization that is dead simple: it observes that to characterize the von Neumann entropy of density matrices it is sufficient to assume that the functional takes larger values on larger systems, and specifically takes $n$ times its value on $n$ copies of a single system.
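Spelled out (my paraphrase of those two axioms; see the reference for the exact statement): one asks for a functional $S$ on density matrices that is monotone under enlarging the system and satisfies the extensivity condition

$$S\big(\rho^{\otimes n}\big) = n\, S(\rho),$$

and this is enough to force, up to normalization, the usual von Neumann entropy $S(\rho) = -\operatorname{tr}(\rho \log \rho)$; for instance on the maximally mixed state in dimension $d$ it gives $S = \log d$.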
It would seem that if you have an ambient category with a minimum of structure that allows one to speak of probability distributions/density matrices in the first place, then these simplistic axioms can likely also be formulated in that category.
For instance, density matrices may be neatly formalized in suitable dagger-monoidal categories, in a flavor of linear logic. To state these simple axioms there for an entropy function on the space of density matrices, it seems all one needs to require in addition is some object playing the role of the real line, in which the entropy is supposed to take values. One needs just that it carries a linear order and contains the natural numbers.
Maybe you are looking for synthetic complexity theory.
But, as I said by email, that will need a bit more than just the observation that exponential objects are a categorification of, yes, exponentials, and hence that there are “logarithm objects”. That’s clear. Now one might ask what a useful formulation, in this sense, of an expression for the entropy in the form $S = -\sum_i p_i \log p_i$ might be. Sure. But you still need to do that, I think.
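(As a trivial numerical footnote, the logarithm-like behaviour that any such categorified formulation would need to reproduce is additivity under independent products, $H(p \otimes q) = H(p) + H(q)$. A throwaway Python check, purely for illustration:)

```python
import math
from itertools import product

def H(probs):
    """Shannon entropy in bits, with the 0 log 0 := 0 convention."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

p = [0.5, 0.3, 0.2]
q = [0.7, 0.3]
joint = [a * b for a, b in product(p, q)]  # independent product distribution

print(H(joint))      # ~2.3668
print(H(p) + H(q))   # same value: entropy turns tensor products into sums
```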