
• CommentRowNumber1.
• CommentAuthorTodd_Trimble
• CommentTimeFeb 16th 2012

Wrote a bunch of stuff on determinant. Just because I felt like it. But I’ve run out of steam to do more on it now.

• CommentRowNumber2.
• CommentAuthorTim_Porter
• CommentTimeFeb 16th 2012

If I had the courage I would put in a bit on the link with algebraic K-theory … but I don’t have it at the moment!

• CommentRowNumber3.
• CommentAuthorTodd_Trimble
• CommentTimeFeb 18th 2012

I rewrote (and improved) some of what I had written before on determinant, now including a discussion of the Cayley-Hamilton theorem, and a perhaps surprising consequence that for a finitely generated module over a commutative ring, a surjective endomorphism is necessarily an isomorphism. (I knew this for free and even finitely generated projective modules, but not for finitely generated modules generally before last night.)

As Tom Leinster knows, this pertains to a current discussion at MO, which in large part circles around his Eventual Image posts at the Café.

• CommentRowNumber4.
• CommentAuthorzskoda
• CommentTimeNov 10th 2012

I split off characteristic polynomial (with the proof of the Cayley-Hamilton theorem) from determinant which grew too big. Added more links to related entries at determinant and at matrix.

• CommentRowNumber5.
• CommentAuthorTodd_Trimble
• CommentTimeAug 9th 2015

In the beginning of determinant, I included a brief description of alternating powers in terms of superalgebra (in view of a proof which needed improving (-: ).

• CommentRowNumber6.
• CommentAuthorDavidRoberts
• CommentTimeAug 10th 2015

Regarding the description in the Idea section, Max Kelly defines the determinant using this characterisation in his algebra textbook, specifying that it is the alternating multilinear map taking the identity matrix to 1.

• CommentRowNumber7.
• CommentAuthorMike Shulman
• CommentTimeAug 11th 2015

So does Apostol, in his linear algebra and multivariable calculus book.

• CommentRowNumber8.
• CommentAuthorTodd_Trimble
• CommentTimeAug 11th 2015

Re #6 and #7: right. IMO it’s really the only sensible way to define it. And those (alternating multilinearity, det of identity is 1) are in turn more or less direct consequences of the change of volume idea from the real case, which is the idea that should first be explained to undergraduates.
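
The three defining properties (multilinearity in the rows, alternating, determinant of the identity is 1) are easy to sanity-check numerically; a minimal NumPy sketch, where the matrix `A` and vector `v` are arbitrary test data:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
v = rng.standard_normal(3)
c = 2.5

# Multilinearity, one slot at a time: det is linear in the first row.
B = A.copy(); B[0] = A[0] + c * v
C = A.copy(); C[0] = v
assert np.isclose(np.linalg.det(B), np.linalg.det(A) + c * np.linalg.det(C))

# Alternating: swapping two rows negates the determinant.
assert np.isclose(np.linalg.det(A[[1, 0, 2]]), -np.linalg.det(A))

# Normalization: the identity matrix maps to 1.
assert np.isclose(np.linalg.det(np.eye(3)), 1.0)
```

Of course `numpy.linalg.det` computes via LU factorization rather than from these axioms; the check only illustrates that the axioms hold of the familiar determinant.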

• CommentRowNumber9.
• CommentAuthorUrs
• CommentTimeOct 20th 2018
• CommentRowNumber10.
• CommentAuthorTodd_Trimble
• CommentTimeOct 20th 2018

Looks interesting! It should be possible to decipher this formula species-theoretically, but I can’t see their appendix (paywall).

• CommentRowNumber11.
• CommentAuthorUrs
• CommentTimeOct 20th 2018

I’ll send you a copy of the appendix.

But I would expect that the result should have a simple derivation that has occurred to others. On PlanetMath here they suggest transforming to diagonal matrices and then invoking elementary symmetric polynomials.

Being lazy, I haven’t tried to make that a proof, and Wikipedia here suggests that somewhat unlikely particle physics article as their only reference for a proof. But this must be in some algebra textbook, too, one would hope.

• CommentRowNumber12.
• CommentAuthorTodd_Trimble
• CommentTimeOct 20th 2018

> I’ll send you a copy of the appendix.

Thanks! Got it.

> On PlanetMath here they suggest to transform to diagonal matrices and then invoke elementary symmetric polynomials.

That kind of thing seems eminently sensible. I seem to dimly recall a six-author paper on algebraic combinatorics with names like Krob, Schützenberger, I.M. Gelfand that might have this type of result, either explicitly or in camouflage. Here it is. I’m vaguely remembering the various change-of-basis formulas for bases of symmetric polynomials, which go by names like Newton’s formula (around page 8 of 111 in the pdf), very reminiscent of what appears on the PlanetMath page. I feel this shouldn’t be too hard to track down.

• CommentRowNumber13.
• CommentAuthorTodd_Trimble
• CommentTimeOct 20th 2018
• (edited Oct 20th 2018)

Ah, so right. Let me see how to at least get started on this. For polynomial variables $x_1, \ldots, x_n$, introduce the generating function for the elementary symmetric functions

$\sum_{k \geq 0} \sigma_k(x_1, \ldots, x_n) t^k \coloneqq \prod_{i=1}^n (1 + x_i t)$

which I’ll abbreviate to $\sigma(x, t)$. Take the logarithmic derivative wrt $t$ of $\sigma(x, -t) = \prod_{i=1}^n (1 - x_i t)$ to get

$\frac{\frac{d}{d t} \sigma(x, -t)}{\sigma(x, -t)} = \sum_{i=1}^n \frac{-x_i}{1 - x_i t} = -\sum_{k \geq 0} \left(\sum_{i=1}^n x_i^{k+1}\right) t^k$

where you see the sums of powers $p_k(x_1, \ldots,x_n) = x_1^k + \ldots + x_n^k$ appearing on the right, which will figure into traces of powers (of eigenvalues, etc.).

Multiplying both sides by $\sigma(x, -t)$ and matching coefficients, one arrives at the Newton identity

$k\sigma_k(x_1, \ldots, x_n) = \sum_{i=1}^k (-1)^{i-1} \sigma_{k-i}(x_1, \ldots, x_n) p_i(x_1, \ldots, x_n)$

mentioned in the Wikipedia article. This allows one to solve for the $\sigma_i$ recursively in terms of the $p_i$, and then you just want the formula for $\sigma_n$. I’m sure there’s an elegant species way to view this (possibly connected with a nice nForum comment the other day on species of connected types and their connection with logarithms). I sense we’re getting closer…

Edit: That nice comment was actually the remark of the last bullet point here that was recently added by Abdelmalek Abdesselam.
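
For what it’s worth, the identity is easy to machine-check for small $n$; a quick SymPy sketch of the Newton identity above (the helper names here are ad hoc):

```python
from itertools import combinations
import sympy as sp

n = 4
x = sp.symbols(f'x1:{n + 1}')  # (x1, x2, x3, x4)

def sigma(k):
    # elementary symmetric polynomial sigma_k(x_1, ..., x_n)
    return sp.Integer(1) if k == 0 else sum(sp.Mul(*c) for c in combinations(x, k))

def p(k):
    # power sum p_k = x_1^k + ... + x_n^k
    return sum(xi**k for xi in x)

# Newton's identity: k*sigma_k = sum_{i=1..k} (-1)^(i-1) sigma_{k-i} p_i
for k in range(1, n + 1):
    lhs = k * sigma(k)
    rhs = sum((-1)**(i - 1) * sigma(k - i) * p(i) for i in range(1, k + 1))
    assert sp.expand(lhs - rhs) == 0
```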

• CommentRowNumber14.
• CommentAuthorUrs
• CommentTimeOct 20th 2018

• CommentRowNumber15.
• CommentAuthorTodd_Trimble
• CommentTimeOct 20th 2018
• (edited Oct 20th 2018)

Okay, my last comment ended up being pretty circuitous, but at least I picked up the spoor. Let me say it again differently.

$\array{ \sigma(x, t) & = & \prod_{i=1}^n (1 + x_i t) \\ & = & \exp\left(\sum_{i=1}^n \log(1 + x_i t)\right) \\ & = & \exp\left(\sum_{i=1}^n \sum_{k \geq 1} (-1)^{k+1} \frac{x_i^k}{k} t^k \right)\\ & = & \exp\left( \sum_{k \geq 1} (-1)^{k+1} \frac{p_k}{k} t^k\right) }$

where $p_k = x_1^k + \ldots + x_n^k$ corresponds to the trace of the $k^{th}$ power. In particular, matching the coefficients of $t^n$, the determinant corresponds to a gobbledygook formula

$x_1 x_2 \ldots x_n = \sigma_n(x) = \sum_{n = k_1 + 2k_2 + \ldots + n k_n} \prod_{i=1}^n \frac1{(k_i)!} \left(\frac{p_i}{i}\right)^{k_i} (-1)^{k_i+1}$

which is also in the Wikipedia article, and that’s about all the energy I have for this now. Well, I guess I’ll summarize it like this:

$\det(A) = \sum_{n = k_1 + 2k_2 + \ldots + n k_n} \prod_{i=1}^n \frac1{(k_i)!} \left(\frac{tr(A^i)}{i}\right)^{k_i} (-1)^{k_i+1}$

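
As a sanity check, this trace formula can be evaluated directly and compared against a standard determinant routine; a small NumPy sketch (the test matrix is arbitrary):

```python
import numpy as np
from itertools import product
from math import factorial

def det_from_traces(A):
    # det(A) = sum over k_1 + 2k_2 + ... + n*k_n = n of
    #          prod_{i=1..n} (-1)^(k_i+1) / k_i! * (tr(A^i)/i)^k_i
    n = A.shape[0]
    # traces of powers tr(A^1), ..., tr(A^n)
    tr = [np.trace(np.linalg.matrix_power(A, i)) for i in range(1, n + 1)]
    total = 0.0
    # enumerate all tuples (k_1, ..., k_n) with sum i*k_i = n
    for ks in product(*(range(n // i + 1) for i in range(1, n + 1))):
        if sum(i * k for i, k in zip(range(1, n + 1), ks)) != n:
            continue
        term = 1.0
        for i, k in zip(range(1, n + 1), ks):
            # note: the factor for k_i = 0 is (-1)^1 = -1, so every i = 1..n contributes
            term *= (-1) ** (k + 1) / factorial(k) * (tr[i - 1] / i) ** k
        total += term
    return total

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])
assert np.isclose(det_from_traces(A), np.linalg.det(A))
```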
• CommentRowNumber16.
• CommentAuthorTodd_Trimble
• CommentTimeOct 21st 2018

Added a proof of the identity involving traces of powers of the matrix.

• CommentRowNumber17.
• CommentAuthorUrs
• CommentTimeOct 21st 2018

Thanks, Todd!

• CommentRowNumber18.
• CommentAuthorUrs
• CommentTimeMar 16th 2020
• (edited Mar 16th 2020)

added the expression of the determinant via Levi-Civita symbol and Einstein summation convention (here)

1. Added combinatorial factor in determinant formula (DeterminantInTermsOfLCSymbols)

Alex S Arvanitakis
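
The (single-epsilon) Levi-Civita expression can be transcribed directly as a signed sum over permutations; a sketch in NumPy, where `perm_sign` is an ad hoc helper computing the nonzero values of the Levi-Civita symbol:

```python
import numpy as np
from itertools import permutations

def perm_sign(perm):
    # sign via inversion count; these are the nonzero values of the Levi-Civita symbol
    inv = sum(perm[a] > perm[b]
              for a in range(len(perm)) for b in range(a + 1, len(perm)))
    return -1 if inv % 2 else 1

def det_levi_civita(A):
    # det(A) = epsilon_{j_1 ... j_n} A_{1 j_1} ... A_{n j_n}  (Einstein summation)
    n = A.shape[0]
    total = 0.0
    for perm in permutations(range(n)):
        term = float(perm_sign(perm))
        for i in range(n):
            term *= A[i, perm[i]]
        total += term
    return total

A = np.array([[1.0, 2.0], [3.0, 4.0]])
assert np.isclose(det_levi_civita(A), np.linalg.det(A))  # both are approximately -2.0
```

The double-epsilon form with the combinatorial factor $1/n!$ mentioned above would sum over pairs of permutations instead; the single-epsilon version used here avoids that normalization.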