Wrote a bunch of stuff on determinant. Just because I felt like it. But I’ve run out of steam to do more on it now.
If I had the courage I would put in a bit on the link with alg. K-theory. … but I don’t at the moment!
I rewrote (and improved) some of what I had written before on determinant, now including a discussion of the Cayley-Hamilton theorem, and a perhaps surprising consequence that for a finitely generated module over a commutative ring, a surjective endomorphism is necessarily an isomorphism. (I knew this for free and even finitely generated projective modules, but not for finitely generated modules generally before last night.)
As Tom Leinster knows, this pertains to a current discussion at MO, which in large part circles around his Eventual Image posts at the Café.
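For the record, the fact mentioned above follows from the standard determinant-trick argument; here is a sketch (the usual Cayley–Hamilton/Nakayama-style proof, not necessarily the exact wording on the page):

```latex
% Sketch of the determinant-trick proof (standard argument; details may
% differ from the nLab page).
Let $M$ be a finitely generated module over a commutative ring $R$ and let
$f \colon M \to M$ be surjective. Make $M$ an $R[t]$-module by
$t \cdot m = f(m)$. Surjectivity says $M = I M$ for the ideal
$I = (t) \subseteq R[t]$, so by the Cayley--Hamilton/determinant trick there
is $g(t) \in R[t]$ with
\[
  \big(1 + t\, g(t)\big)\, m = 0 \quad \text{for all } m \in M .
\]
Unwinding, $m = f\big({-g(f)}(m)\big)$ for all $m$, i.e.
$f \circ \big({-g(f)}\big) = \mathrm{id}_M$; since $-g(f)$ is a polynomial in
$f$ it commutes with $f$, so it is a two-sided inverse and $f$ is an
isomorphism.
```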
I split off characteristic polynomial (with the proof of the Cayley-Hamilton theorem) from determinant which grew too big. Added more links to related entries at determinant and at matrix.
In the beginning of determinant, I included a brief description of alternating powers in terms of superalgebra (in view of a proof which needed improving (-: ).
Regarding the description in the Idea section, Max Kelly defines the determinant using this characterisation in his algebra textbook, specifying that it is the unique alternating multilinear map taking the identity matrix to 1.
So does Apostol, in his linear algebra and multivariable calculus book.
Re #6 and #7: right. IMO it’s really the only sensible way to define it. And those (alternating multilinearity, det of identity is 1) are in turn more or less direct consequences of the change of volume idea from the real case, which is the idea that should first be explained to undergraduates.
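To illustrate why those two axioms suffice: row operations reduce any matrix to the identity, and the effect of each operation on det is forced by alternating multilinearity, so det(I) = 1 pins down the value. A minimal sketch in Python (exact arithmetic via fractions; the function and its name are mine, not from any of the texts mentioned):

```python
# A minimal sketch: Gaussian elimination computes det using only the axioms.
from fractions import Fraction

def det_by_axioms(M):
    """Row swaps flip the sign (alternating); scaling a row scales det
    (multilinearity); adding a multiple of one row to another changes
    nothing; det(I) = 1 anchors the final value."""
    M = [[Fraction(v) for v in row] for row in M]
    n, d = len(M), Fraction(1)
    for j in range(n):
        pivot = next((i for i in range(j, n) if M[i][j] != 0), None)
        if pivot is None:
            return Fraction(0)          # dependent columns: det vanishes
        if pivot != j:
            M[j], M[pivot] = M[pivot], M[j]
            d = -d                      # swap: sign flip
        d *= M[j][j]                    # record the row scaling
        M[j] = [v / M[j][j] for v in M[j]]
        for i in range(n):
            if i != j:
                M[i] = [a - M[i][j] * b for a, b in zip(M[i], M[j])]
    return d                            # M is now the identity

assert det_by_axioms([[2, 1], [1, 3]]) == 5
assert det_by_axioms([[1, 2], [2, 4]]) == 0
```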
Added a section As a polynomial in traces of powers
Looks interesting! It should be possible to decipher this formula species-theoretically, but I can’t see their appendix (paywall).
I’ll send you a copy of the appendix.
But I would expect that the result should have a simple derivation that has occurred to others. On PlanetMath here they suggest to transform to diagonal matrices and then invoke elementary symmetric polynomials.
Being lazy, I haven’t tried to make that a proof, and Wikipedia here suggests that somewhat unlikely particle physics article as their only reference for a proof. But this must be in some algebra textbook, too, one would hope.
I’ll send you a copy of the appendix.
Thanks! Got it.
On PlanetMath here they suggest to transform to diagonal matrices and then invoke elementary symmetric polynomials.
That kind of thing seems eminently sensible. I seem to dimly recall a six-author paper on algebraic combinatorics with names like Krob, Schützenberger, I.M. Gelfand that might have this type of result, either explicitly or in camouflage. Here it is. I’m vaguely remembering the various change-of-basis formulas for bases of symmetric polynomials, which go by names like Newton’s formula (around page 8 of 111 in the pdf), very reminiscent of what appears on the PlanetMath page. I feel this shouldn’t be too hard to track down.
Ah, so right. Let me see how to at least get started on this. For polynomial variables $x_1, \ldots, x_n$, introduce generating functions for the elementary symmetric functions
$\sum_{k \geq 0} \sigma_k(x_1, \ldots, x_n) t^k \coloneqq \prod_{i=1}^n (1 + x_i t)$

which I’ll abbreviate to $\sigma(x, t)$. Take the logarithmic derivative wrt $t$ of $\sigma(x, -t) = \prod_{i=1}^n (1 - x_i t)$ to get
$\frac{\frac{d}{d t} \sigma(x, -t)}{\sigma(x, -t)} = \sum_{i=1}^n \frac{-x_i}{1 - x_i t} = -\sum_{k \geq 0} \left(\sum_{i=1}^n x_i^{k+1}\right) t^k$

where you see the power sums $p_k(x_1, \ldots, x_n) = x_1^k + \ldots + x_n^k$ appearing on the right, which will figure into traces of powers (of eigenvalues, etc.).
Multiplying both sides by $\sigma(x, -t)$ and matching coefficients, one arrives at the Newton identity
$k\sigma_k(x_1, \ldots, x_n) = \sum_{i=1}^k (-1)^{i-1} \sigma_{k-i}(x_1, \ldots, x_n) p_i(x_1, \ldots, x_n)$

mentioned in the Wikipedia article. This allows one to solve for the $\sigma_i$ recursively in terms of the $p_i$, and then you just want the formula for $\sigma_n$. I’m sure there’s an elegant species way to view this (possibly connected with a nice nForum comment the other day on species of connected types and their connection with logarithms). I sense we’re getting closer…
Edit: That nice comment was actually the remark of the last bullet point here that was recently added by Abdelmalek Abdesselam.
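The recursion coming out of the Newton identity above is easy to carry out mechanically. A sketch in Python (helper names are mine; exact arithmetic so the divisions by $k$ stay rational):

```python
# Recover the sigma_k from the power sums p_i via Newton's identity:
#   k * sigma_k = sum_{i=1}^{k} (-1)^(i-1) * sigma_{k-i} * p_i
from fractions import Fraction

def sigmas_from_power_sums(p):
    """p[i] holds p_i for i = 1..n (p[0] is unused);
    returns [sigma_0, ..., sigma_n]."""
    n = len(p) - 1
    sig = [Fraction(1)]                 # sigma_0 = 1
    for k in range(1, n + 1):
        s = sum((-1) ** (i - 1) * sig[k - i] * p[i] for i in range(1, k + 1))
        sig.append(s / k)
    return sig

# For x = (2, 3, 5): p_1 = 10, p_2 = 38, p_3 = 160, and sigma_3 = 2*3*5 = 30.
assert sigmas_from_power_sums([0, 10, 38, 160]) == [1, 10, 31, 30]
```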
Thanks for the alert on that comment about connected partition functions. Have added a comment there.
Okay, my last comment ended up being pretty circuitous, but at least I picked up the spoor. Let me say it again differently.
$\array{ \sigma(x, t) & = & \prod_{i=1}^n (1 + x_i t) \\ & = & \exp\left(\sum_{i=1}^n \log(1 + x_i t)\right) \\ & = & \exp\left(\sum_{i=1}^n \sum_{k \geq 1} (-1)^{k+1} \frac{x_i^k}{k} t^k \right)\\ & = & \exp\left( \sum_{k \geq 1} (-1)^{k+1} \frac{p_k}{k} t^k\right) }$

where $p_k = x_1^k + \ldots + x_n^k$ corresponds to the trace of the $k^{th}$ power. In particular, matching the coefficients of $t^n$, the determinant corresponds to a gobbledygook formula
$x_1 x_2 \ldots x_n = \sigma_n(x) = \sum_{n = k_1 + 2k_2 + \ldots + n k_n} \prod_{i=1}^n \frac1{(k_i)!} \left(\frac{p_i}{i}\right)^{k_i} (-1)^{k_i+1}$

which is also in the Wikipedia article, and that’s about all the energy I have for this now. Well, I guess I’ll summarize it like this:
$\det(A) = \sum_{n = k_1 + 2k_2 + \ldots + n k_n} \prod_{i=1}^n \frac1{(k_i)!} \left(\frac{tr(A^i)}{i}\right)^{k_i} (-1)^{k_i+1}$

Thanks, Todd!
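As a sanity check, the closing formula can be evaluated numerically. A sketch in Python (matrix and helper names are mine; exact fractions so the rational coefficients come out exactly):

```python
# Numeric check of det(A) as a polynomial in the traces tr(A^i).
from fractions import Fraction
from math import factorial

n = 3
A = [[Fraction(v) for v in row] for row in [[2, 1, 0], [1, 3, 1], [0, 1, 4]]]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# tr[i-1] = trace of A^i for i = 1, ..., n.
tr, P = [], A
for _ in range(n):
    tr.append(sum(P[i][i] for i in range(n)))
    P = matmul(P, A)

def det_from_traces(tr):
    """Sum over all (k_1, ..., k_n) with k_1 + 2 k_2 + ... + n k_n = n; the
    product runs over every i, so each k_i = 0 factor contributes (-1)^(0+1),
    exactly as in the formula above."""
    total = Fraction(0)
    def rec(i, rem, term):
        nonlocal total
        if i > n:
            if rem == 0:
                total += term
            return
        for k in range(rem // i + 1):
            factor = Fraction((-1) ** (k + 1), factorial(k)) * (tr[i - 1] / i) ** k
            rec(i + 1, rem - i * k, term * factor)
    rec(1, n, Fraction(1))
    return total

assert det_from_traces(tr) == 18   # cofactor expansion of A also gives 18
```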