What exactly is the determinant, anyway?
by greenturtle3141, Nov 26, 2021, 11:19 AM
Reading Difficulty: 3-4/5
Prerequisites: You should know what $\det(A)$ denotes and how to compute it.
Preliminary notes:
- The discussion here generalizes to vector spaces over any field, but for simplicity's sake we'll pretend that the vector space is $\mathbb{R}^n$ throughout.
- Moreover, we'll take the standard basis of $\mathbb{R}^n$ for our matrix representations. If you don't understand what that means, worry not.
So what's a determinant...? We learn what it is and what it does sometimes, but what motivates looking at it?
Part 0: Matrices vs. Linear Transformations
Recall (or, learn right now) that a linear transformation is a function $T: \mathbb{R}^m \to \mathbb{R}^n$ that is... er, linear. That is:
- $T(\vec v + \vec w) = T(\vec v) + T(\vec w)$ for any vectors $\vec v, \vec w \in \mathbb{R}^m$, and
- $T(c\vec v) = c\,T(\vec v)$ for any vector $\vec v \in \mathbb{R}^m$ and scalar $c \in \mathbb{R}$.
Now the big question that linear algebra teachers forget to ask.
Q: What is the difference between matrices and linear transformations?
A: THEY'RE THE SAME. EXACT. THING.
Yes! That's the whole flippin point of matrices! They're LITERALLY FUNCTIONS. FUNCTIONS THAT TURN VECTORS INTO OTHER VECTORS.
Let me spend some time convincing you that this is true (caveat: only works in finite dimensional vector spaces because infinitely large matrices aren't nice).
Case Study 1: Multiplying by a column vector
Consider the matrix $A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}$. This can be viewed as a function from $\mathbb{R}^2$ to $\mathbb{R}^2$. How?
To figure out where the matrix takes the coordinates/vector $(x, y)$, all you have to do is compute $A\begin{pmatrix} x \\ y \end{pmatrix}$. If you remember your multiplication rules, this is easy:
$$A\begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} a & b \\ c & d \end{pmatrix}\begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} ax + by \\ cx + dy \end{pmatrix}$$
So the matrix may be viewed as a function $T: \mathbb{R}^2 \to \mathbb{R}^2$ defined by $T(x, y) = (ax + by, cx + dy)$. Hey, isn't that a linear transformation?
Case Study 2: Composition
From the above discussion, what sort of function is represented by an $n \times m$ matrix $A$? With some thinking, you should conclude that $A$ is a linear transformation from $\mathbb{R}^m$ to $\mathbb{R}^n$.
If you recall matrix multiplication rules, the product $AB$ is only valid if the number of columns of $A$ matches the number of rows of $B$. That is, $B$ needs to be an $m \times k$ matrix for some $k$. In this case, $B$ would be a function $\mathbb{R}^k \to \mathbb{R}^m$.
Ok cool, but I promised that matrices are the same thing as linear transformations. If we're supposed to view all matrices as linear transformations, what sort of function is $AB$ supposed to be?
Spoiler Alert: Matrix multiplication is just a composition of linear transformations! That is, if $A$ is viewed as a linear transformation $T_A: \mathbb{R}^m \to \mathbb{R}^n$ and $B$ is viewed as a linear transformation $T_B: \mathbb{R}^k \to \mathbb{R}^m$, then $AB$ corresponds to the composition $T_A \circ T_B$. This makes sense because $AB$ ends up being an $n \times k$ matrix. Now do you see why you have the weird dimension-matching rule for matrix multiplication? It's literally just to make sure that the range of $B$ will match the domain of $A$.
To summarize:
- Any matrix $A$ is literally just a linear transformation. This is because it's pretty easy to show that $A(\vec v + \vec w) = A\vec v + A\vec w$ and $A(c\vec v) = cA\vec v$.
- Conversely, any linear transformation $T$ also has a unique matrix representation. I leave this as an exercise!
As such, linear transformations and matrices are essentially the same thing, and hence should be treated as the same thing. This lets us discuss not only the determinants of matrices, but also the determinants of linear transformations. Indeed, this is ultimately what we're going to do.
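Part 0 in action: a quick sanity check in plain Python (the `apply` and `matmul` helpers are my own, not anything from the post), showing that applying $B$ and then $A$ is the same function as applying the single matrix $AB$.

```python
# A matrix is a function: it eats a vector and spits out a vector.
def apply(M, v):
    """Apply matrix M (a list of rows) to the vector v."""
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def matmul(A, B):
    """The matrix product AB."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 1]]
v = [5, -2]

# "Apply B, then apply A" is the same function as "apply the matrix AB":
assert apply(A, apply(B, v)) == apply(matmul(A, B), v)
```

Swap in any sizes you like, as long as the dimension-matching rule holds.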
Part 1: Bad Definitions
Definition 1: The determinant is a number associated with a matrix, computed by multiplying and adding a bunch of random numbers in it, and it's really special because you need to know it to pass the ACT math section. Concretely, it is given by:
$$\det A = \sum_{\sigma \in S_n} \operatorname{sgn}(\sigma) \prod_{i=1}^n a_{i,\sigma(i)}$$
And it can be computed more easily using expansion by minors. The determinant has some nice properties, just trust me I swear!
Verdict: This definition sucks. But a bunch of people teach it this way. Probably because they have no idea what it does either!
This is bad and it doesn't tell us ANYTHING about this weird number. Another major drawback is that it's really not obvious how to prove important properties such as $\det(AB) = \det(A)\det(B)$. Getting these properties requires a deeper understanding of the determinant... So much so that it requires a whole new definition.
Definition 2: Let $A$ be an $n \times n$ matrix. The key is to interpret $A$ visually as a linear transformation. $A$ "stretches" the unit cube by some amount.

(Image stolen from https://emoy.net/Determinant)
The determinant is just how many times bigger the cube gets in volume. In general, $A$ increases the volume of any shape by a factor of $\det A$.
Verdict: This is a better definition in the sense that it gives you an idea of why you'd ever care about it. In fact, it makes some of the properties of determinants make sense. For example, $\det(AB) = \det(A)\det(B)$ can be read as "If you take a shape and transform it under $B$ and then under $A$, then its volume is first multiplied by $\det(B)$ and then multiplied by $\det(A)$". The only issue is that this isn't quite rigorous and it doesn't give us a way to actually compute the determinant of some given matrix.
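To see Definition 2 numerically in the $2 \times 2$ case (the `shoelace` helper and the sample matrix are my own choices): the unit square maps to the parallelogram spanned by the columns, and its signed area comes out to exactly $ad - bc$.

```python
def shoelace(pts):
    """Signed area of a polygon (vertices in order) via the shoelace formula."""
    s = 0
    for i in range(len(pts)):
        x1, y1 = pts[i]
        x2, y2 = pts[(i + 1) % len(pts)]
        s += x1 * y2 - x2 * y1
    return s / 2

a, b, c, d = 2, 1, 0, 3               # the matrix [[a, b], [c, d]]
col1, col2 = (a, c), (b, d)           # images of e1 and e2

# The unit square (area 1) maps to the parallelogram 0 -> col1 -> col1+col2 -> col2.
image = [(0, 0), col1, (col1[0] + col2[0], col1[1] + col2[1]), col2]
assert shoelace(image) == a * d - b * c   # its area is exactly the determinant
```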
With that out of the way, we can begin sussing out the "true identity" of the determinant.
Part 2: Bilinear maps and Multilinear maps
We discussed what a linear map (or, linear transformation) was in Part 0. Now get ready for bilinear maps! It's nothing complicated, I promise.
Definition: A function $f$ from $\mathbb{R}^n \times \mathbb{R}^n$ to $\mathbb{R}$ is called bilinear if it is linear in each component. That is:
- (Linearity in the first component) $f(\vec v + \vec v', \vec w) = f(\vec v, \vec w) + f(\vec v', \vec w)$ and $f(c\vec v, \vec w) = c\,f(\vec v, \vec w)$
- (Linearity in the second component) $f(\vec v, \vec w + \vec w') = f(\vec v, \vec w) + f(\vec v, \vec w')$ and $f(\vec v, c\vec w) = c\,f(\vec v, \vec w)$
Test Your Understanding
- Is the map $f(x, y) = x + y$ bilinear? Answer: No. It's actually not linear in either component. For example, $f(2, 2) = 4$. If it were bilinear, it would be $f(2, 2) = 2 \cdot 2 \cdot f(1, 1) = 8$.
- Is the map $f: \mathbb{R} \times \mathbb{R} \to \mathbb{R}$ defined by $f(x, y) = xy$ bilinear? Answer: YES! Make sure you understand why!
- Is the map $f: \mathbb{R} \times \mathbb{R} \to \mathbb{R}$ defined by $f(x, y) = x^2y^2$ bilinear? Answer: Nope! It's not linear in either component.
- Is the map $f: \mathbb{R}^2 \times \mathbb{R}^2 \to \mathbb{R}$ defined by $f((a, b), (c, d)) = ad - bc$ bilinear? Answer: YES! Hm, does this function look familiar to you? ooOOoo fOrEsHaDoWiNg
- Is the map $f(x, y) = 0$ bilinear? Answer: Well, I guess it is... lol.
- Come up with a bilinear map from $\mathbb{R}^2 \times \mathbb{R}^2$ to $\mathbb{R}$. Make it as simple/complex as you desire.
Not so bad eh? Let's move on to multilinear maps.
Definition: A function $f$ from $\underbrace{\mathbb{R}^n \times \cdots \times \mathbb{R}^n}_{k\text{ times}}$ to $\mathbb{R}$ is called multilinear if it is linear in each component.
Wait, isn't that basically the same exact definition?
Test Your Understanding
- Are all bilinear maps also multilinear? Answer: Yeah, duh.
- Is the map $f: \mathbb{R}^n \times \mathbb{R}^n \to \mathbb{R}$ defined by $f(\vec v, \vec w) = \vec v \cdot \vec w$ multilinear? (This is just the dot product of the input vectors) Answer: Yes!
- Let $f$ be multilinear, and let $T$ be a linear transformation. Does it follow that $(\vec v_1, \ldots, \vec v_k) \mapsto f(T\vec v_1, \ldots, T\vec v_k)$ is multilinear? Answer: Certainly yes!
Multilinear maps aren't that fantastic. They really only start to shine once you give them one extra property...
Part 3: Alternating Multilinear Maps
Definition: A multilinear map $f$ is alternating if its value is negated upon switching two components.
For example, consider the multilinear map $f((a, b), (c, d)) = ad - bc$. This is alternating because if I switch two components (er, the only two components in this case), then I'd get $f((c, d), (a, b)) = cb - da = -(ad - bc)$.
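Here's a throwaway check of both properties for the $2 \times 2$ determinant map $f((a,b),(c,d)) = ad - bc$ (the test vectors are arbitrary picks of mine):

```python
def f(v, w):
    """f((a, b), (c, d)) = ad - bc."""
    return v[0] * w[1] - v[1] * w[0]

v, w, u = (1, 2), (3, 5), (-1, 4)
c = 7

# Alternating: swapping the two components negates the value.
assert f(w, v) == -f(v, w)
# Linear in the first component (additivity and scaling):
assert f((v[0] + u[0], v[1] + u[1]), w) == f(v, w) + f(u, w)
assert f((c * v[0], c * v[1]), w) == c * f(v, w)
# Two equal components force the value to be zero:
assert f(v, v) == 0
```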
Test Your Understanding
- Suppose $f$ is multilinear and alternating. If $f(\vec v_1, \vec v_2, \vec v_3) = k$, what is $f(\vec v_2, \vec v_1, \vec v_3)$? Answer: It should be $-k$.
- In general, let $\sigma$ be a permutation. If $f(\vec v_1, \ldots, \vec v_n) = k$, what is $f(\vec v_{\sigma(1)}, \ldots, \vec v_{\sigma(n)})$? Answer: It is $\operatorname{sgn}(\sigma) \cdot k$. Here, $\operatorname{sgn}(\sigma)$ is the sign of the permutation $\sigma$, i.e. if $\sigma$ is an even permutation then the sign is $+1$, and otherwise the sign is $-1$.
- Prove that a multilinear map $f$ is alternating iff $f(\vec v_1, \ldots, \vec v_n) = 0$ whenever $\vec v_i = \vec v_j$ for some $i \neq j$. That is: if two of the components are equal, then $f$ comes out to zero. Solution: For simplicity I'll assume $n = 2$. ($\Rightarrow$) Suppose $f$ is alternating. Then e.g. $f(\vec v, \vec v) = -f(\vec v, \vec v)$ (by switching the first two components). It follows that $f(\vec v, \vec v) = 0$. ($\Leftarrow$) Suppose $f$ is zero whenever any two of its components match. Let's e.g. prove that $f(\vec v, \vec w) = -f(\vec w, \vec v)$: expand $0 = f(\vec v + \vec w, \vec v + \vec w) = f(\vec v, \vec v) + f(\vec v, \vec w) + f(\vec w, \vec v) + f(\vec w, \vec w) = f(\vec v, \vec w) + f(\vec w, \vec v)$.
- Let $f$ be multilinear and alternating. Prove that $f(\vec v_1 + \vec v_2, \vec v_2, \vec v_3, \ldots, \vec v_n) = f(\vec v_1, \vec v_2, \vec v_3, \ldots, \vec v_n)$. That is, I can add one component to another and nothing would change. Solution: This shouldn't be hard. If you're stuck, stare at the previous solution.
The above stuff is important, but this next result is even more important, so it gets the stage here. Let $\vec e_1 = (1, 0, \ldots, 0)$, $\vec e_2 = (0, 1, 0, \ldots, 0)$, ..., $\vec e_n = (0, \ldots, 0, 1)$. We call this the "standard basis".
THEOREM: If $f: \underbrace{\mathbb{R}^n \times \cdots \times \mathbb{R}^n}_{n\text{ times}} \to \mathbb{R}$ is multilinear and alternating, then $f$ is fully determined by the value of $f(\vec e_1, \ldots, \vec e_n)$. That is, if you tell me just the value of $f(\vec e_1, \ldots, \vec e_n)$, then I can give you the value of $f$ everywhere else!
Proof. I'm going to cheat and let $n = 3$. The key idea is that since $\vec e_1, \vec e_2, \vec e_3$ form a basis, I can write every vector in $\mathbb{R}^3$ in terms of them, i.e. every $\vec v \in \mathbb{R}^3$ takes the form $a\vec e_1 + b\vec e_2 + c\vec e_3$ for some $a, b, c \in \mathbb{R}$. I mean, that's pretty easy to see because if $\vec v = (a, b, c)$ then $\vec v = a(1,0,0) + b(0,1,0) + c(0,0,1) = a\vec e_1 + b\vec e_2 + c\vec e_3$.
So, what's the point? Basically, for any $\vec v_1, \vec v_2, \vec v_3 \in \mathbb{R}^3$, I can first write:
$$f(\vec v_1, \vec v_2, \vec v_3) = f(a_1\vec e_1 + b_1\vec e_2 + c_1\vec e_3,\ \vec v_2,\ \vec v_3)$$
Then I can abuse the fact that $f$ is multilinear to keep expanding this more and more! Expanding out the first component:
$$= a_1 f(\vec e_1, \vec v_2, \vec v_3) + b_1 f(\vec e_2, \vec v_2, \vec v_3) + c_1 f(\vec e_3, \vec v_2, \vec v_3)$$
Expanding out the second component:
$$= a_1a_2 f(\vec e_1, \vec e_1, \vec v_3) + a_1b_2 f(\vec e_1, \vec e_2, \vec v_3) + a_1c_2 f(\vec e_1, \vec e_3, \vec v_3)$$
$$+\ b_1a_2 f(\vec e_2, \vec e_1, \vec v_3) + \cdots + c_1c_2 f(\vec e_3, \vec e_3, \vec v_3)$$
Expanding out the thir- lol I'm kidding, it's 4 AM why would I do that? But you can see that in the end, we're left with $3^3 = 27$ terms of the form $f(\vec e_i, \vec e_j, \vec e_k)$.
- Some of these terms use the same $\vec e_i$ twice, like $f(a_1\vec e_1, a_2\vec e_1, c_3\vec e_3)$. By multilinearity, this is just $a_1a_2c_3\, f(\vec e_1, \vec e_1, \vec e_3)$. But hey, two of those components match! So, since $f$ is alternating, this is just zero. Terms that use the same basis element vanish!
- The rest of the terms use each $\vec e_i$ exactly once, like $f(b_1\vec e_2, a_2\vec e_1, c_3\vec e_3)$. By multilinearity, this is just $b_1a_2c_3\, f(\vec e_2, \vec e_1, \vec e_3)$. But hey, we know that $f(\vec e_2, \vec e_1, \vec e_3)$ only differs from $f(\vec e_1, \vec e_2, \vec e_3)$ by a factor of $-1$, so we can write these terms in terms of $f(\vec e_1, \vec e_2, \vec e_3)$.
In the end, every surviving term is a multiple of $f(\vec e_1, \vec e_2, \vec e_3)$, so $f(\vec v_1, \vec v_2, \vec v_3)$ can be written in terms of $f(\vec e_1, \vec e_2, \vec e_3)$. $\blacksquare$
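The expansion in the proof can be replayed numerically; here's a sketch in the $2 \times 2$ case for brevity (the `expand` helper and the sample map, $3(ad - bc)$, are my own choices):

```python
# Brute-force the proof's expansion: write v and w over the basis and
# expand f by multilinearity into the 2^2 = 4 terms v[i]*w[j]*f(ei, ej).
def expand(f, v, w):
    e = [(1, 0), (0, 1)]
    return sum(v[i] * w[j] * f(e[i], e[j]) for i in range(2) for j in range(2))

def f(v, w):
    """Some alternating bilinear map (3 times ad - bc, chosen arbitrarily)."""
    return 3 * (v[0] * w[1] - v[1] * w[0])

v, w = (2, 5), (-1, 4)

# The expansion recovers f, and only the f(e1, e2)-type terms survive:
assert expand(f, v, w) == f(v, w)
e1, e2 = (1, 0), (0, 1)
assert f(v, w) == (v[0] * w[1] - v[1] * w[0]) * f(e1, e2)
```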
Corollary: If $f$ is multilinear and alternating, then
$$f(\vec v_1, \ldots, \vec v_n) = C \cdot f(\vec e_1, \ldots, \vec e_n)$$
for a constant $C$ that does NOT depend on $f$. It only depends on $\vec v_1, \ldots, \vec v_n$.
You can see that alternating multilinear maps have some pretty neat properties!
...we're eventually going to get to determinants, I promise.
Part 4: Almost a Determinant...
Let $T: \mathbb{R}^n \to \mathbb{R}^n$ be a linear map. I'm now ready to give you a form for its determinant.
Let $f: \underbrace{\mathbb{R}^n \times \cdots \times \mathbb{R}^n}_{n\text{ times}} \to \mathbb{R}$ be some multilinear, alternating map. By the previous theorem, if I set $f(\vec e_1, \ldots, \vec e_n) = 1$, then the rest of $f$ must be completely determined. In fact, I can now tell you the value of $f(T\vec e_1, \ldots, T\vec e_n)$. This is the determinant of $T$.
Why should you believe me? I'll give you two reasons.
- If you remember that crazy determinant formula from before, or if you think hard about expansion by minors, you may realize that what a determinant does is multiply together one coordinate from each component in "every possible way", before adding them up. The sort of "expanding" that you do with multilinearity corresponds to this! (Don't worry if this doesn't catch on immediately, though this is a pretty important point! It might help to look at what's going on in the expanding I did earlier, and notice what terms stay alive... compare this to how you'd compute the determinant of a $3 \times 3$ matrix.)
- If you think about it, the "multilinearity rules" feel like they describe properties for area/volume scaling and increase. For example, doubling one dimension (corresponds to doubling the value of $T$ on some basis vector, say $\vec e_1$) should double the volume (corresponds to $f(2T\vec e_1, T\vec e_2, \ldots) = 2f(T\vec e_1, T\vec e_2, \ldots)$).
Still, this is not yet a fully satisfying definition. We need to explore more...
Part 5: The Universal Property
You're probably wondering what the word "universal" means.
Layman's Definition: Something is "universal" if it looks like everything else.
Abstractatition's Definition: A universal object in a category is an initial object. Or something. Specifically in this context we're considering the category of multilinear maps with morphisms that are linear maps, I think. idk anymore help
Let's start simple, before the notation gets a bit heavy: Let's say I have a linear map from $\mathbb{R}$ to $\mathbb{R}$. Call this map $f$. I'll say that $f$ "looks like" another linear map $g$ if $g = h \circ f$ for some linear map $h$. For example, if $f(x) = 2x$ and $g(x) = 6x$, then $f$ certainly looks like $g$. This is because I can just consider $h(x) = 3x$. Then $h(f(x)) = 3 \cdot 2x = 6x = g(x)$. In fact, I can find such an $h$ no matter what linear map $g$ you choose (note that e.g. $g(x) = x + 1$ is not linear. In fact, any linear map $g: \mathbb{R} \to \mathbb{R}$ takes the form $g(x) = cx$). Because of this, $f$ "looks like" everybody else, so we say that $f$ is universal.
Not all such linear maps are universal. For example, $f(x) = 0$ is not a universal linear map. (Why?)
Let's now ramp it up. Suppose $f$ and $g$ are both multilinear and alternating. We say that $f$ "looks like" $g$ if there exists a linear map $h: \mathbb{R} \to \mathbb{R}$ for which $g = h \circ f$.
You might be able to guess what's coming.
THEOREM: Let $f$ be multilinear alternating and non-zero. Then in fact, $f$ is universal.
By non-zero, I mean that the map doesn't just send everything to $0$. Note that we need to specify some specific "dimensions" for the domain and range of the maps involved for this to work. Particularly, the "domain dimension" is "$\mathbb{R}^n \times \cdots \times \mathbb{R}^n$ ($n$ times)" whereas the "range dimension" is $\mathbb{R}$.
Proof. Let $g$ be multilinear and alternating. It suffices to find a linear $h: \mathbb{R} \to \mathbb{R}$ for which $g = h \circ f$.
The strategy in this proof is to first figure out a formula for $h$, assuming it exists. In other words, you should ask yourself "Hm, what would $h$ have to look like?" Then, given that, we prove that this "formula" is actually valid!
Since $f$ is non-zero, we can find vectors $\vec w_1, \ldots, \vec w_n$ such that $f(\vec w_1, \ldots, \vec w_n) \neq 0$. Let $a = f(\vec w_1, \ldots, \vec w_n)$, and let $b = g(\vec w_1, \ldots, \vec w_n)$. Now think: Given $a$ and $b$, could we write a formula for $h$ that should work in theory, if it had to exist? Well, $h$ would need to send $a$ to $b$, so the only formula that could possibly work is $h(x) = \frac{b}{a}x$.
Hm, does it work? To show that it works, we need to prove that $g = h \circ f$. This means that we need to prove that:
$$g(\vec v_1, \ldots, \vec v_n) = h(f(\vec v_1, \ldots, \vec v_n)) \quad \text{for all } \vec v_1, \ldots, \vec v_n \in \mathbb{R}^n$$
Or:
$$g(\vec v_1, \ldots, \vec v_n) = \frac{b}{a}\, f(\vec v_1, \ldots, \vec v_n)$$
Hm, I guess that means we just need to show this equality:
$$a\, g(\vec v_1, \ldots, \vec v_n) = b\, f(\vec v_1, \ldots, \vec v_n)$$
Wait but, by that corollary we did, there exists a constant $C$ such that $f(\vec v_1, \ldots, \vec v_n) = C\, f(\vec e_1, \ldots, \vec e_n)$. And $C$ doesn't depend on which multilinear map you're using, so the same $C$ works for $g$. That is, $g(\vec v_1, \ldots, \vec v_n) = C\, g(\vec e_1, \ldots, \vec e_n)$. Similarly, there exists a constant $D$ such that $a = D\, f(\vec e_1, \ldots, \vec e_n)$ and $b = D\, g(\vec e_1, \ldots, \vec e_n)$. Plugging this in, we want to show that:
$$D\, f(\vec e_1, \ldots, \vec e_n) \cdot C\, g(\vec e_1, \ldots, \vec e_n) = D\, g(\vec e_1, \ldots, \vec e_n) \cdot C\, f(\vec e_1, \ldots, \vec e_n)$$
...which is true! $\blacksquare$
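A numerical sketch of this proof for $n = 2$: take two alternating bilinear maps (arbitrary multiples of $ad - bc$, my choice for illustration), build $h(x) = \frac{b}{a}x$ from the witnesses, and watch $g = h \circ f$ hold on random inputs.

```python
# Two non-zero alternating bilinear maps on R^2 (scalar multiples of
# ad - bc; the scalars 2 and -5 are arbitrary choices for illustration).
def f(v, w):
    return 2 * (v[0] * w[1] - v[1] * w[0])

def g(v, w):
    return -5 * (v[0] * w[1] - v[1] * w[0])

# Witnesses with f(w1, w2) != 0, exactly as in the proof:
w1, w2 = (1, 0), (0, 1)
a, b = f(w1, w2), g(w1, w2)

def h(x):
    """The only linear map that could work: send a to b."""
    return (b / a) * x

# g = h o f on arbitrary inputs, so f "looks like" g:
for v, w in [((1, 2), (3, 4)), ((0, -1), (7, 2))]:
    assert g(v, w) == h(f(v, w))
```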
Part 6: The Determinant Unmasked
DEFINITION: Let $T: \mathbb{R}^n \to \mathbb{R}^n$ be a linear map. Let $f: \underbrace{\mathbb{R}^n \times \cdots \times \mathbb{R}^n}_{n\text{ times}} \to \mathbb{R}$ be a non-zero multilinear alternating map. Then $f$ is universal. Note also that the map $g$ defined by
$$g(\vec v_1, \ldots, \vec v_n) = f(T\vec v_1, \ldots, T\vec v_n)$$
is also multilinear and alternating. Therefore, since $f$ is universal, there must exist a linear map $h: \mathbb{R} \to \mathbb{R}$ for which $g = h \circ f$. Since $h$ is linear, $h$ must take the form $h(x) = cx$. Hence, $g = h \circ f$ may be written as:
$$f(T\vec v_1, \ldots, T\vec v_n) = c \cdot f(\vec v_1, \ldots, \vec v_n)$$
And this holds for all $\vec v_1, \ldots, \vec v_n$. Notice that $c$ is a special constant that depends only on $T$. We define $c$ as the determinant of $T$. It is the "scaling factor" of $T$!!!
- Notice that this is much stronger than what we had in Part 4. I'm not forced to use the measly basis vectors $\vec e_1, \ldots, \vec e_n$. No... this identity holds for ALL $\vec v_1, \ldots, \vec v_n$!
- Note that our choice of $f$ didn't really matter, as long as it was non-zero, multilinear and alternating. This is because all such maps are universal.
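Here's the defining identity $f(T\vec v_1, T\vec v_2) = \det(T)\,f(\vec v_1, \vec v_2)$ checked numerically for $n = 2$, with $f$ taken to be the $2 \times 2$ determinant map itself (the helpers and sample matrix are mine):

```python
def det2(T):
    """ad - bc for T = [[a, b], [c, d]]."""
    return T[0][0] * T[1][1] - T[0][1] * T[1][0]

def apply(T, v):
    """Apply the 2x2 matrix T to the vector v."""
    return (T[0][0] * v[0] + T[0][1] * v[1],
            T[1][0] * v[0] + T[1][1] * v[1])

def f(v, w):
    """A non-zero alternating bilinear map (the 2x2 determinant itself)."""
    return v[0] * w[1] - v[1] * w[0]

T = [[2, 1], [5, 3]]
for v, w in [((1, 2), (3, 4)), ((-1, 0), (2, 7))]:
    # T scales the "signed area" f by the constant det(T), for ALL inputs:
    assert f(apply(T, v), apply(T, w)) == det2(T) * f(v, w)
```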
Part 7: All Hail Abstraction
THEOREM: $\det(ST) = \det(S)\det(T)$.
Proof. Take a non-zero alternating multilinear map $f$. Then:
$$f((ST)\vec v_1, \ldots, (ST)\vec v_n) = \det(ST)\, f(\vec v_1, \ldots, \vec v_n)$$
On the other hand:
$$f(S(T\vec v_1), \ldots, S(T\vec v_n)) = \det(S)\, f(T\vec v_1, \ldots, T\vec v_n) = \det(S)\det(T)\, f(\vec v_1, \ldots, \vec v_n)$$
Boom. $\blacksquare$
THEOREM: The determinant itself is a multilinear alternating map! In fact, it is the unique such map that sends the identity matrix/map to $1$.
Proof.
Take the standard basis: $\vec e_1, \ldots, \vec e_n$.
Now view $\det$ as a map from $\mathbb{R}^n \times \cdots \times \mathbb{R}^n$ to $\mathbb{R}$ by simply interpreting matrices as $n$-tuples of $\mathbb{R}^n$-vectors (so we view $A$ as a tuple of column vectors $(\vec a_1, \ldots, \vec a_n)$). Then for all $A$, we have:
$$\det(A)\, f(\vec e_1, \ldots, \vec e_n) = f(A\vec e_1, \ldots, A\vec e_n) = f(\vec a_1, \ldots, \vec a_n)$$
$f(\vec e_1, \ldots, \vec e_n)$ is just some dumb constant. The moving part is $f(\vec a_1, \ldots, \vec a_n)$, and this is multilinear and alternating in the columns $\vec a_1, \ldots, \vec a_n$. Hence so is $\det$.
By plugging in $A = I$ etc. we get that $\det(I) = 1$, clearly. But note that $(\vec e_1, \ldots, \vec e_n)$, when interpreted as a matrix, is actually the identity!
We claim that $\det$ is the only such multilinear alternating map that takes the identity matrix to $1$. To see this, let $g$ be another such multilinear alternating map. Since $\det$ is non-zero, it must be universal. Hence there exists a linear $h$ for which $g = h \circ \det$. But the values of $g$ and $\det$ agree on the matrix $I$ (both are $1$), so $h$ has a non-zero fixed point. It follows that $h$ is the identity linear map, so $g = \det$. Hence the uniqueness of $\det$ is proven. $\blacksquare$
THEOREM: Adding one column to another does not change the determinant of a matrix.
Proof. The determinant is multilinear and alternating.
THEOREM: Multiplying a column by $c$ will multiply the determinant by $c$.
Proof. The determinant is multilinear and alternating.
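Both column-operation theorems are easy to watch happen in the $2 \times 2$ case (a quick throwaway check with an arbitrary matrix of my choosing):

```python
def det2(M):
    """ad - bc for M = [[a, b], [c, d]]; columns are (a, c) and (b, d)."""
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

A = [[3, 1], [4, 2]]            # columns (3, 4) and (1, 2)

# Add column 2 to column 1: the determinant is unchanged.
B = [[3 + 1, 1], [4 + 2, 2]]
assert det2(B) == det2(A)

# Multiply column 1 by 5: the determinant is multiplied by 5.
C = [[5 * 3, 1], [5 * 4, 2]]
assert det2(C) == 5 * det2(A)
```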
If you want, you can even recover that original horrible formula without too much hassle.
THEOREM:
$$\det A = \sum_{\sigma \in S_n} \operatorname{sgn}(\sigma) \prod_{i=1}^n a_{i,\sigma(i)},$$
where $S_n$ denotes the set of all permutations of $\{1, 2, \ldots, n\}$ and $a_{i,j}$ is the entry of $A$ in row $i$, column $j$.
Proof. Just show that the right-hand side, viewed as a function of the columns of $A$, is multilinear, alternating, and sends the identity matrix to $1$; by the uniqueness theorem above, it must equal $\det$. I leave this to you to verify. $\blacksquare$
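If you want to play with the horrible formula directly, here is a straightforward transcription into Python via `itertools.permutations` (the inversion-counting `sign` helper is my own):

```python
from itertools import permutations

def sign(perm):
    """Sign of a permutation, computed by counting inversions."""
    n = len(perm)
    inversions = sum(1 for i in range(n) for j in range(i + 1, n)
                     if perm[i] > perm[j])
    return -1 if inversions % 2 else 1

def det(A):
    """The Leibniz formula: sum over permutations s of sgn(s) * prod A[i][s[i]]."""
    n = len(A)
    total = 0
    for s in permutations(range(n)):
        term = sign(s)
        for i in range(n):
            term *= A[i][s[i]]
        total += term
    return total

A = [[1, 2, 3], [4, 5, 6], [7, 8, 10]]
assert det(A) == -3   # matches cofactor expansion done by hand
```

This is $O(n \cdot n!)$, so it's for play, not for production; expansion by minors or row reduction is how you'd actually compute.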
The only other major property left would probably be $\det(A^T) = \det(A)$, and to my knowledge this is best proven via this horrible formula.
Welp, that's it! Hopefully this post gets you to see the determinant in a new light.
This post has been edited 2 times. Last edited by greenturtle3141, Nov 26, 2021, 6:30 PM