I hope this is an appropriate use of the forum. I’d like to find out what proportion of mathematicians are aware of a certain fact, and I’m not feeling in the mood for asking at the Café, so I thought I’d try here.
Let $V$ and $W$ be finite-dimensional vector spaces over an algebraically closed field. Let $S \colon V \to W$ and $T \colon W \to V$ be linear maps. Then $T S$ and $S T$ have the same eigenvalues, with the same algebraic multiplicities, with the possible exception of 0.
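(For concreteness, here is a quick numerical sanity check of the statement, just a sketch using NumPy with arbitrarily chosen rectangular matrices: the eigenvalue lists of the two products agree except for an extra 0.)

```python
import numpy as np

# S: V -> W and T: W -> V, represented as a 3x2 and a 2x3 matrix respectively.
S = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [3.0, -1.0]])      # shape (3, 2): maps a 2-dimensional space into a 3-dimensional one
T = np.array([[2.0, 1.0, 0.0],
              [1.0, 0.0, 4.0]])  # shape (2, 3): maps back

TS = T @ S   # 2x2 operator on V
ST = S @ T   # 3x3 operator on W

# Same nonzero eigenvalues; ST has one extra eigenvalue 0 here since dim W > dim V.
print(np.sort_complex(np.linalg.eigvals(TS)))
print(np.sort_complex(np.linalg.eigvals(ST)))
```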
It’s not terribly hard to prove this fact. But the question is: did you already know this, off the top of your head?
(I can say something about why I’m asking, but I’d rather wait until I’ve got a few answers.)
Thanks!
I think I knew this once… last century! But can’t be sure.
I did.
I didn’t.
I knew that, at least I recall it now you mention it. I can’t remember where from. Maybe looking over a third year undergraduate essay my nephew wrote.
The first part (eigenvalues the same) yes, but not the second about the multiplicities (that part even surprises me!).
It’s sort of fun watching you pursue themes, Tom. You seem to be on something like a fixed-point kick these days. Or is that putting it too reductionistically? :-)
Todd, frankly I have little idea what I’m doing. But it’s nice to know I’m creating fun :-)
I might have heard that once before, but I certainly didn’t remember it.
I hope this is an appropriate use of the forum.
Absolutely.
As for your actual question, I knew this (or maybe I know that I know it, if you see what I mean) but the version that springs to mind concerns compact operators on a Banach space.
I can even remember how you prove it (in particular, I remember the proof - I’m not figuring out the proof for myself).
Thanks for your answers, all. Here’s the story. A few years ago I became aware of Sheldon Axler’s book Linear Algebra Done Right (which in fact prompted my very first Café post). Ever since then I’ve been thinking on and off about the “dynamical” approach, as I think of it, to linear algebra. There, a central role is played by the eventual kernel of an operator $T$, which is the union of the chain of subspaces
$$\ker(T) \subseteq \ker(T^2) \subseteq \ker(T^3) \subseteq \cdots.$$
In this approach, different questions suggest themselves and lots of things become easy. One of the things I learned was this fact about eigenvalues. It struck me that I hadn’t known this fact before, given how elementary it is. And when I’ve mentioned it to people in conversation since, I’ve discovered that others have been equally unaware. With the thought at the back of my mind that I might, some day, write something about this, I was wondering how widespread that ignorance was.
One interesting thing about this result, I think, is the “nonzero” condition. It suggests that the nonzero spectrum (I mean, the set of nonzero eigenvalues, with their algebraic multiplicities) might have some importance as an object in its own right. Write $\sigma^\times(T)$ for the nonzero spectrum of an operator $T$ on a finite-dimensional vector space. Then $\sigma^\times$ has the trace-like property $\sigma^\times(S T) = \sigma^\times(T S)$. Of course, the trace is the sum of the nonzero spectrum, so the identity $\operatorname{tr}(S T) = \operatorname{tr}(T S)$ follows, and everyone knows that identity. Maybe Mark or Andrew have some thoughts about all this; I’m aware that I don’t know how the story goes in more sophisticated functional-analytic settings.
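Spelled out, with $\sigma^\times$ as above and the sums taken with algebraic multiplicities:
$$\operatorname{tr}(S T) \;=\; \sum_{\lambda \in \sigma^\times(S T)} \lambda \;=\; \sum_{\lambda \in \sigma^\times(T S)} \lambda \;=\; \operatorname{tr}(T S).$$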
One reason I thought you might have been thinking about fixed-point theory is that it gives a really easy way of seeing the truth of this result. Namely, the result holds true for the eigenvalue 1, because we obviously have a linear isomorphism
$$\{w \in W : S T w = w\} \;\cong\; \{v \in V : T S v = v\},$$
but then the same result must hold true for any non-zero eigenvalue $\lambda$, by replacing $T$ say by $\lambda^{-1} T$.
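To spell out the isomorphism in the $\lambda = 1$ case: the two maps are just $T$ and $S$ themselves, restricted to the fixed subspaces, since
$$S T w = w \;\Rightarrow\; T S (T w) = T (S T w) = T w, \qquad T S v = v \;\Rightarrow\; S T (S v) = S (T S v) = S v,$$
and the restrictions are mutually inverse: $S(T w) = S T w = w$ and $T(S v) = T S v = v$ on those subspaces.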
Edit: It occurs to me that maybe I was misinterpreting what Tom is saying; perhaps by ‘algebraic multiplicity’, he means the dimension of the generalized eigenspace attached to the eigenvalue $\lambda$, not the dimension of the eigenspace itself. In other words, the dimension of the eventual kernel of $A - \lambda$, where $A$ is the operator in question.
I don’t think that I’ve ever heard this. Your explanation in terms of the nonzero spectrum is quite interesting.
Todd, yes, by the algebraic multiplicity of $A$ at $\lambda$ I did mean the dimension of the appropriate generalized eigenspace (what I called above the “eventual kernel” of $A - \lambda$). As I’m sure you know, this is the same as the power of $(x - \lambda)$ appearing in the characteristic polynomial of $A$.
(I was brought up to say geometric multiplicity for the dimension of $\ker(A - \lambda)$ and algebraic multiplicity for the quantity just described. Maybe that’s not universal. The inequality “geom mult $\leq$ alg mult” is obvious if you define the alg mult as the dimension of the generalized eigenspace, but not so obvious if you define it as the power in the characteristic polynomial.)
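(A small numerical check of that equivalence, a sketch using NumPy on a matrix chosen to have a repeated eigenvalue: the generalized eigenspace for $\lambda = 2$ is 3-dimensional, matching the factor $(x-2)^3$ in the characteristic polynomial.)

```python
import numpy as np

# A 4x4 operator: a 3x3 Jordan block with eigenvalue 2, plus a 1x1 block with eigenvalue 5.
# Its characteristic polynomial is (x - 2)^3 (x - 5).
A = np.array([[2.0, 1.0, 0.0, 0.0],
              [0.0, 2.0, 1.0, 0.0],
              [0.0, 0.0, 2.0, 0.0],
              [0.0, 0.0, 0.0, 5.0]])

lam = 2.0
n = A.shape[0]
B = A - lam * np.eye(n)

# Geometric multiplicity: dim ker(A - lam).
geom = n - np.linalg.matrix_rank(B)
# Algebraic multiplicity as dimension of the eventual kernel of A - lam,
# i.e. dim ker((A - lam)^n), since the chain of kernels has stabilised by the n-th power.
alg = n - np.linalg.matrix_rank(np.linalg.matrix_power(B, n))

print(geom, alg)   # expect 1 and 3
```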
But anyway, Todd’s proof needs to be adapted only slightly to get a proof of the fact I mentioned. We have the following linear maps between eventual kernels:
$$S \colon \ker^\infty(T S - \lambda) \to \ker^\infty(S T - \lambda), \qquad T \colon \ker^\infty(S T - \lambda) \to \ker^\infty(T S - \lambda),$$
where $\ker^\infty$ denotes the eventual kernel.
They’re not mutually inverse. However, it’s a general fact that $A$ acts as an automorphism of the subspace $\ker^\infty(A - \lambda)$ whenever $A$ is an operator and $\lambda \neq 0$. So $T S$ acts as an automorphism of the subspace $\ker^\infty(T S - \lambda)$, and similarly $S T$ acts as an automorphism of $\ker^\infty(S T - \lambda)$, as long as $\lambda \neq 0$. Hence both composites of the two maps in the display above are automorphisms, and it follows that the two maps themselves are invertible. Thus $\dim \ker^\infty(T S - \lambda) = \dim \ker^\infty(S T - \lambda)$.
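For completeness, the reason $S$ and $T$ do restrict to maps between these eventual kernels is the intertwining relation $S (T S - \lambda) = (S T - \lambda) S$, which gives
$$S\,(T S - \lambda)^n = (S T - \lambda)^n\, S \quad \text{for all } n \geq 0,$$
so $(T S - \lambda)^n v = 0$ implies $(S T - \lambda)^n (S v) = 0$; the argument for $T$ is symmetric.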
As Todd points out, a similar but slightly easier argument says that $T S$ and $S T$ also have the same nonzero eigenvalues with the same geometric multiplicities.
Yes, I had a similar proof. Taking $\lambda = 1$ for example, I observed that in the diagram you drew, we have that $T S$ (restricted to $\ker^\infty(T S - 1)$) is an automorphism because it’s of the form $1 + N$ where $N$ is nilpotent, and has an inverse given by a finite geometric series $1 - N + N^2 - \cdots$. Similarly $S T$ is an automorphism. Thus both $S$ and $T$ (restricted to these eventual kernels) are invertible. The case for general nonzero $\lambda$ follows from the case $\lambda = 1$ by the same trick I used before (replace $T$ by $\lambda^{-1} T$).
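Explicitly: if $N^k = 0$ then
$$(1 + N)\bigl(1 - N + N^2 - \cdots + (-1)^{k-1} N^{k-1}\bigr) = 1 + (-1)^{k-1} N^k = 1,$$
so $1 + N$ is invertible with that finite geometric series as its inverse.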
Strangely enough, I never had a course in linear algebra. So I may not know all the standard terminology. :-)
So the ‘algebraic’ multiplicity is just as geometric as the ‘geometric’ one!
@Toby: I want to rename it the dynamic multiplicity. The eventual kernel of an operator $T$ says something about the long-term behaviour of $T$ under iteration. For example, $\ker(T^n)$ is the same for all $n \geq \dim(V)$, writing $V$ for the space on which $T$ acts.
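(A quick numerical illustration of that stabilisation, just a sketch using NumPy with an arbitrary singular matrix: the kernel dimensions of the powers $T, T^2, T^3, \dots$ increase for a while and are constant from the $\dim(V)$-th power onwards.)

```python
import numpy as np

# An arbitrary 4x4 example whose kernel grows under iteration:
# a nilpotent 2x2 Jordan block glued to an invertible 2x2 block.
T = np.array([[0.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 1.0],
              [0.0, 0.0, 0.0, 2.0]])

n = T.shape[0]
dims = [n - np.linalg.matrix_rank(np.linalg.matrix_power(T, k)) for k in range(1, 2 * n + 1)]
print(dims)   # dim ker(T^k) for k = 1, ..., 2n; constant from k = n onwards (here already from k = 2: 1, 2, 2, ...)
```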
I’ve never understood this usage of ’algebraic’ and ’geometric’, anyway.
I think that it’s just that the algebraic multiplicity appears in the characteristic polynomial (which is algebraic) while the geometric multiplicity appears as the dimension of a space (which is geometric). So what I learnt today is that the algebraic multiplicity is also the dimension of an interesting space.