Wedge
by math_explorer, Sep 22, 2012, 7:31 AM
Some stuff that happened yesterday. This is even better than taking courses at a university. Whee.
Let $\vec{\textbf{v}}_1, \vec{\textbf{v}}_2, \ldots, \vec{\textbf{v}}_m$ be elements of $\mathbb{R}^n$, where $m \leq n$. Column vectors with $n$ real elements, that is. It's a confusing convention, not to mention a typographical annoyance, that elements of $\mathbb{R}^n$ somehow end up with the components written vertically. However, it's kind of convenient when we want to paste column vectors together. Let $V = [\vec{\textbf{v}}_1, \ldots, \vec{\textbf{v}}_m]$. This means pasting the column vectors together to make an $n \times m$ matrix ($n$ rows $\times$ $m$ columns). This is confusing because it's sometimes not clear that the $\vec{\textbf{v}}_i$ are column vectors, so just to be safe I went and bolded all my vectors as well as put tiny arrows on them, which is more than either of the linear algebra textbooks I'm studying does (don't ask why there are two). Yay regexes. Normal letters are usually reals, uppercase letters are usually bigger matrices.
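For concreteness, here's a tiny made-up example of the pasting convention (the numbers are mine, just for illustration): with $n = 3$ and $m = 2$,
\[ \vec{\textbf{v}}_1 = \begin{bmatrix} 1 \\ 2 \\ 2 \end{bmatrix}, \quad \vec{\textbf{v}}_2 = \begin{bmatrix} 2 \\ 1 \\ 0 \end{bmatrix}, \quad V = [\vec{\textbf{v}}_1, \vec{\textbf{v}}_2] = \begin{bmatrix} 1 & 2 \\ 2 & 1 \\ 2 & 0 \end{bmatrix}. \]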
Seriously, linear algebra notation is screwed up. One of my textbooks has a subscript $j$ denote the $j$th column of a matrix and the other has the same subscript denote the matrix with its $j$th column deleted. They put the transpose sign in superscripts on opposite sides of the matrices. Rawr.
Anyway.
Consider the thing $\vec{\textbf{v}}_1 \wedge \vec{\textbf{v}}_2 \wedge \cdots \wedge \vec{\textbf{v}}_m$, which can be visualized (...well, not in more than three dimensions (?)) as the hyperparallelogram $\left\{\sum_{i=1}^{m} t_i\vec{\textbf{v}}_i \,\middle|\, t_i \in [0,1],\ i = 1, 2, \ldots, m \right\}$ but is apparently a totally different thing in a totally weird vector space of things.
Anyway, we wonder how to compute the hyperarea of this hyperparallelogram.
To get a simple idea of what we're trying to do, here's the $m = 2$ case without any advanced stuff. We have two $n$-dimensional vectors $\vec{\textbf{v}}_1$ and $\vec{\textbf{v}}_2$, and we want to consider the area of the parallelogram they bound. Well, geometrically this can be computed as
\[ |\vec{\textbf{v}}_1| |\vec{\textbf{v}}_2|\sin \theta \]
where $\theta$ is the angle between the two vectors. And we can get at that angle because we know the vectors' dot product:
\[ \vec{\textbf{v}}_1 \cdot \vec{\textbf{v}}_2 = |\vec{\textbf{v}}_1| |\vec{\textbf{v}}_2|\cos \theta. \]
So. Let the vectors' coordinates be $(x_1, x_2, \ldots, x_n)$ and $(y_1, y_2, \ldots, y_n)$. Since $\sin^2\theta + \cos^2\theta = 1$, the area $|\vec{\textbf{v}}_1| |\vec{\textbf{v}}_2|\sin \theta$ is
\[ \sqrt{|\vec{\textbf{v}}_1|^2 |\vec{\textbf{v}}_2|^2 - (\vec{\textbf{v}}_1 \cdot \vec{\textbf{v}}_2)^2} = \sqrt{\left(\sum_i x_i^2\right)\left(\sum_i y_i^2\right) - \left(\sum_i x_i y_i\right)^2}. \]
Does this look familiar? It should if you've ever done a nontrivial inequality. Anyway, it's equivalent to
![\[ \sqrt{\sum_{i<j} (x_i y_j - x_j y_i)^2} \]](//latex.artofproblemsolving.com/8/f/1/8f1b4ff3ebeeb6c788fff2081951b442cc517ed0.png)
which is, interestingly,
![\[ \sqrt{\sum_{i<j} \begin{vmatrix} x_i & y_i \\ x_j & y_j \end{vmatrix}^2 } \]](//latex.artofproblemsolving.com/b/a/8/ba8bb5105808252301f322467c3d7bbc9ff2ae4b.png)
Those vertical bars denote the determinant, which is a really important scalar function of square matrices, if you didn't know. I don't know what sort of background I expect people who read this to have. Whatever.
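As a quick sanity check with numbers I made up (this example isn't part of the original derivation, it's only to see both expressions agree): take $\vec{\textbf{v}}_1 = (1, 2, 2)$ and $\vec{\textbf{v}}_2 = (2, 1, 0)$ in $\mathbb{R}^3$, the same vectors as in the earlier illustration. Then
\[ \sqrt{(1^2+2^2+2^2)(2^2+1^2+0^2) - (1\cdot 2 + 2\cdot 1 + 2\cdot 0)^2} = \sqrt{45 - 16} = \sqrt{29}, \]
\[ \sqrt{\begin{vmatrix} 1 & 2 \\ 2 & 1 \end{vmatrix}^2 + \begin{vmatrix} 1 & 2 \\ 2 & 0 \end{vmatrix}^2 + \begin{vmatrix} 2 & 2 \\ 1 & 0 \end{vmatrix}^2} = \sqrt{(-3)^2 + (-4)^2 + (-2)^2} = \sqrt{29}. \]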
Hmm. This is interesting because it suggests a general formula that even works in the other cases. Maybe we just take all combinations of $m$ rows and compute the squares of these determinants, sum them up, and take the square root of everything. Compare with the $m = 1$ case,
![\[ \sqrt{\sum_{i} |x_i|^2} \]](//latex.artofproblemsolving.com/5/5/a/55a34d3ad099d8b753a418ce978a7d80834df414.png)
and the $m = n = 3$ case,
![\[ \sqrt{\begin{vmatrix} x_i & y_i & z_i \\ x_j & y_j & z_j \\ x_k & y_k & z_k \end{vmatrix}^2 } \]](//latex.artofproblemsolving.com/1/4/6/146ce891fbdcc87f3494cd61142d29eba4c8e0d4.png)
where the square root of the square of course just works out to be taking the absolute value.
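Spelled out (this display is my paraphrase of the guess, with submatrix notation invented just for it): the conjecture is that the hyperarea of the hyperparallelogram spanned by the columns of the $n \times m$ matrix $V$ is
\[ \sqrt{\sum_{1 \le i_1 < i_2 < \cdots < i_m \le n} \det\!\left(V_{i_1 i_2 \cdots i_m}\right)^2} \]
where $V_{i_1 i_2 \cdots i_m}$ is the $m \times m$ submatrix of $V$ formed by rows $i_1, \ldots, i_m$.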
Anyway, the only way we know how to compute the hyperarea of a hyperparallelogram at this stage turns out to be the determinant, except we can only take determinants when the hyperparallelogram sits in a Euclidean space of the same dimension as itself, so that its matrix is square. Our vectors form an $n \times m$ matrix, unfortunately.
So instead let's take the $m$-dimensional subspace of $\mathbb{R}^n$ that our vectors span (assuming they don't have any linear dependencies, because if they did then the hyperparallelogram is degenerate and has hyperarea trivially $0$) and try to pretend it's the normal $\mathbb{R}^m$ we're used to. We can do this by taking a random orthonormal basis $\vec{\textbf{w}}_1, \vec{\textbf{w}}_2, \ldots, \vec{\textbf{w}}_m$ of it, and expressing each of our vectors $\vec{\textbf{v}}_i$ as a linear combination of the basis vectors.
In other words, $\vec{\textbf{v}}_i = [\vec{\textbf{w}}_1, \vec{\textbf{w}}_2, \ldots, \vec{\textbf{w}}_m]\vec{\textbf{c}}_i$ for some coordinate vector $\vec{\textbf{c}}_i \in \mathbb{R}^m$, for each $i$. And since the basis is orthonormal, the hyperarea we want can be computed as the (absolute value of the) determinant of the matrix with the coordinate vectors $\vec{\textbf{c}}_i$ as its columns.
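(A side note of mine, in case it's unclear how the $\vec{\textbf{c}}_i$ would actually be found: because the $\vec{\textbf{w}}_j$ are orthonormal, each coordinate is a dot product, and dot products are unchanged by passing to coordinates,
\[ (\vec{\textbf{c}}_i)_j = \vec{\textbf{w}}_j \cdot \vec{\textbf{v}}_i, \qquad \vec{\textbf{c}}_i \cdot \vec{\textbf{c}}_k = \vec{\textbf{v}}_i \cdot \vec{\textbf{v}}_k, \]
which is the sense in which this subspace, with this basis, behaves exactly like the usual $\mathbb{R}^m$.)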
Our task is now to compute $\det(C := [\vec{\textbf{c}}_1, \ldots, \vec{\textbf{c}}_m])$. This is hard, but it can be LaTeXed, which means now we have something to work on. Our only lead on these $\vec{\textbf{c}}_i$ is the way we defined them: we have $V = [\vec{\textbf{w}}_1, \ldots, \vec{\textbf{w}}_m]C$, but it's not useful yet because we have to somehow get rid of the $[\vec{\textbf{w}}_1, \ldots, \vec{\textbf{w}}_m]$ next to it, and that's not even a square matrix.
We like working in $\mathbb{R}^n$ more. Luckily, there's an obvious way to create an $n \times n$ determinant that's equal to the one we want: augment an $(n-m) \times (n-m)$ identity matrix diagonally onto it. So let
\[ D = \begin{bmatrix} C & \textbf{0} \\ \textbf{0} & I_{n-m} \end{bmatrix} \]
($I_{n-m}$ being the $(n-m) \times (n-m)$ identity matrix). Since $D$ is block-diagonal, $\det(D) = \det(C)\det(I_{n-m}) = \det(C)$, which is exactly the determinant we want.
We can "obviously" find an orthonormal basis $\vec{\textbf{w}}_1, \vec{\textbf{w}}_2, \ldots, \vec{\textbf{w}}_n$ of the whole $\mathbb{R}^n$ that includes our orthonormal basis of the subspace. So let $W = [\vec{\textbf{w}}_1, \vec{\textbf{w}}_2, \ldots, \vec{\textbf{w}}_n]$.
How does the property of orthonormal-basis-ness translate into matrix language? We have $\vec{\textbf{w}}_i \cdot \vec{\textbf{w}}_i = 1$ and $\vec{\textbf{w}}_i \cdot \vec{\textbf{w}}_j = 0$ for $i \neq j$. The matrix multiplication that will involve these dot products is $W^T W$. This is not the notation either of my textbooks uses, but $W^T$ is $W$'s transpose.
Entry-by-entry, $(W^T W)_{ij} = \vec{\textbf{w}}_i \cdot \vec{\textbf{w}}_j$, which is $1$ when $i = j$ and $0$ otherwise. So $W^T W = I_n$.
Hmm. Since determinants are multiplicative, we immediately have $\det(W)^2 = \det(W^T)\det(W) = \det(W^T W) = \det(I_n) = 1$, so $|\det(W)| = 1$. Also, we know $V = [\vec{\textbf{w}}_1, \ldots, \vec{\textbf{w}}_m]C$ and we can compute $WD$ from that: $WD = [\vec{\textbf{v}}_1, \ldots, \vec{\textbf{v}}_m, \vec{\textbf{w}}_{m+1}, \ldots, \vec{\textbf{w}}_n]$. Let this matrix be $M$. Is $M$ a good name for a matrix? I have no idea. The point is that the area we want is $|\det(C)| = |\det(D)| = |\det(W)||\det(D)| = |\det(WD)| = |\det(M)|$.
Okay so we've gotten rid of the mysterious coordinate vectors $\vec{\textbf{c}}_i$; unfortunately we really don't know much more about the remnants of our orthonormal basis. However, since we really only want the determinant of $M$, we can make the matrix smash with itself this way.
Consider $M^T M$. The elements of this are the dot products of $M$'s column vectors with each other. Since the $\vec{\textbf{w}}_i$ with $i > m$ have length $1$ and are orthogonal to each other and to each of the $\vec{\textbf{v}}_j$...
\[ M^T M = \begin{bmatrix} V^T V & \textbf{0} \\ \textbf{0} & I_{n-m} \end{bmatrix} \]
Taking the determinant one final time, remembering that transposing keeps it invariant, we see $\det(M)^2 = \det(M^T)\det(M) = \det(M^T M) = \det(V^T V)\det(I_{n-m}) = \det(V^T V)$, so a way to compute our final solution is $\sqrt{\det(V^T V)}$.
And now it seems too simple to require so much work. What the heck. College math is weird.
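Here's a quick numerical sanity check of the final formula (a throwaway numpy sketch of my own, not part of the original post), reusing the made-up vectors from earlier and also testing the sum-of-squared-minors guess against it:

```python
import numpy as np
from itertools import combinations

# Columns of V are the vectors spanning the hyperparallelogram.
# Same made-up example as above: two vectors in R^3.
V = np.array([[1.0, 2.0],
              [2.0, 1.0],
              [2.0, 0.0]])
n, m = V.shape

# Hyperarea via the formula derived in the post: sqrt(det(V^T V)).
gram_area = np.sqrt(np.linalg.det(V.T @ V))

# Hyperarea via the earlier guess: for every combination of m rows,
# square the determinant of that m x m submatrix, sum, take the root.
minor_area = np.sqrt(sum(np.linalg.det(V[list(rows), :]) ** 2
                         for rows in combinations(range(n), m)))

print(gram_area, minor_area)  # both ~ 5.385, i.e. sqrt(29)
```

Both come out to $\sqrt{29}$, as they should; the agreement is no accident, since the Cauchy–Binet formula says $\det(V^T V)$ is exactly the sum of the squared $m \times m$ minors from the guess.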