User:Temperal/The Problem Solver's Resource4

Introduction | Other Tips and Tricks | Methods of Proof | You are currently viewing page 4.

Algebra

This is a collection of algebra laws and definitions. There is far too much to cover here, but we'll try to give a good overview.

Elementary Algebra

Definitions

  • A polynomial is a function of the form

\[f(x)=a_nx^n+a_{n-1}x^{n-1}+\cdots+a_0,\] where $a_n\ne 0$ and the $a_i$ are real numbers, called the coefficients.

  • A polynomial has degree $c$ if the highest exponent of a variable is $c$. The degree of polynomial $P$ is expressed as $\deg(P)$.
  • A quadratic is a polynomial of degree $2$, a cubic of degree $3$, a quartic of degree $4$, and a quintic of degree $5$. (A brief code sketch illustrating these definitions follows.)
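
As a quick illustration (not part of the original wiki text), a polynomial can be represented in code by its list of coefficients. The minimal Python sketch below evaluates such a polynomial with Horner's method and reads off its degree; the function names are illustrative.

def eval_poly(coeffs, x):
    # Evaluate a_0 + a_1*x + ... + a_n*x^n by Horner's method;
    # coeffs is [a_0, a_1, ..., a_n], lowest degree first.
    result = 0
    for a in reversed(coeffs):
        result = result * x + a
    return result

def degree(coeffs):
    # The degree is the index of the highest nonzero coefficient.
    for i in range(len(coeffs) - 1, -1, -1):
        if coeffs[i] != 0:
            return i
    raise ValueError("the zero polynomial has no degree")

# f(x) = 2x^2 + 3x + 1, so f(2) = 15 and deg(f) = 2
print(eval_poly([1, 3, 2], 2), degree([1, 3, 2]))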

Factor Theorem

A polynomial $P(x)$ has roots $a,b,c,\ldots,z$ if and only if $(x-a),(x-b),\ldots,(x-z)$ are all factors of $P(x)$. Equivalently, $r$ is a root of $P(x)$ if and only if $(x-r)$ divides $P(x)$.

Quadratic Formula

For a quadratic equation of the form $ax^2+bx+c=0$, where $a,b,c$ are constants and $a\ne 0$, the roots are $x=\frac{-b\pm\sqrt{b^2-4ac}}{2a}$.
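
As a quick numeric check of the formula (illustrative only), here is a short Python sketch; cmath keeps the result valid even when the discriminant is negative.

import cmath

def quadratic_roots(a, b, c):
    # Roots of ax^2 + bx + c = 0 with a != 0; complex if the discriminant is negative.
    disc = cmath.sqrt(b * b - 4 * a * c)
    return (-b + disc) / (2 * a), (-b - disc) / (2 * a)

# x^2 - 5x + 6 = 0 has roots 3 and 2
print(quadratic_roots(1, -5, 6))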

Fundamental Theorems of Algebra

  • Every nonconstant polynomial (i.e., not of the form $f(x)=c$) has at least one root, real or complex.
  • A polynomial of degree $n$ has exactly $n$ roots, real or complex, counted with multiplicity (illustrated numerically below).
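
The sketch below is a numerical illustration of the second statement, assuming NumPy is available; numpy.roots takes the coefficients of a degree-$n$ polynomial (highest degree first) and returns its $n$ roots, with multiplicity.

import numpy as np

# x^3 - 1 = 0 is a cubic, so it has exactly three roots:
# one real root (1) and a pair of complex conjugate roots.
print(np.roots([1, 0, 0, -1]))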

Rational Root Theorem

Given a polynomial $f(x)$ with integer coefficients $a_i$ and $a_n\ne 0$, every rational root is of the form $\frac{p}{q}$, where $p$ and $q$ are coprime integers, $p\mid a_0$, and $q\mid a_n$.
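
A minimal Python sketch of how the theorem is typically used in practice (function names are illustrative): enumerate every candidate $\frac{p}{q}$ with $p\mid a_0$ and $q\mid a_n$, then keep the candidates that are actually roots.

from fractions import Fraction

def divisors(n):
    # Positive divisors of |n|.
    n = abs(n)
    return [d for d in range(1, n + 1) if n % d == 0]

def rational_roots(coeffs):
    # coeffs = [a_0, a_1, ..., a_n] with integer entries, a_0 != 0 and a_n != 0.
    a0, an = coeffs[0], coeffs[-1]
    candidates = {Fraction(sign * p, q)
                  for p in divisors(a0) for q in divisors(an) for sign in (1, -1)}
    def value(x):
        return sum(a * x ** i for i, a in enumerate(coeffs))
    return sorted(r for r in candidates if value(r) == 0)

# 2x^3 - 3x^2 - 3x + 2 has rational roots -1, 1/2, and 2
print(rational_roots([2, -3, -3, 2]))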


Determinants

The determinant of a $2$ by $2$ (said to have order $2$) matrix $\left |\begin{matrix}a&b \\ c&d\end {matrix}\right|$ is $ad-bc$.

General Formula for the Determinant

Let $A$ be a square matrix of order $n$ with entries $a_{ij}$, where $a_{ij}$ is the entry in row $i$ and column $j$, for $i=1,\cdots,n$ and $j=1,\cdots,n$. For any $i$ and $j$, define the cofactor $A_{ij}$ to be $(-1)^{i+j}$ times the determinant of the square matrix of order $n-1$ obtained from $A$ by removing row $i$ and column $j$. Then, expanding along any fixed row $i$:

$\det(A) = \sum_{j=1}^{n} a_{ij} A_{ij}$
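
The formula translates directly into a short recursive Python sketch (expansion along the first row; illustrative only, and far from efficient for large matrices):

def det(A):
    # Determinant by cofactor expansion along the first row.
    n = len(A)
    if n == 1:
        return A[0][0]
    total = 0
    for j in range(n):
        # Minor: delete row 0 and column j; the sign is (-1)^(0+j).
        minor = [row[:j] + row[j + 1:] for row in A[1:]]
        total += (-1) ** j * A[0][j] * det(minor)
    return total

# Order 2: ad - bc
print(det([[1, 2], [3, 4]]))                      # -2
print(det([[2, 0, 1], [1, 3, -1], [0, 5, 2]]))    # 27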

Cramer's Rule

Consider a set of three linear equations (i.e. polynomials of degree one)

  • $ax+by+cz=d$
  • $ex+fy+gz=h$
  • $ix+jy+kz=l$

Let $D=\left|\begin{matrix}a&e&i\\b&f&j\\c&g&k\end{matrix}\right|$, $D_x=\left|\begin{matrix}d&h&l\\b&f&j\\c&g&k\end{matrix}\right|$, $D_y=\left|\begin{matrix}a&e&i\\d&h&l\\c&g&k\end{matrix}\right|$, and $D_z=\left|\begin{matrix}a&e&i\\b&f&j\\d&h&l\end{matrix}\right|$. Then, provided $D\ne 0$, $x = \frac{D_x}{D}$, $y = \frac{D_y}{D}$, and $z = \frac{D_z}{D}$. This can be generalized to any number of linear equations.
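
Here is a small Python sketch of Cramer's Rule for three equations (illustrative only; it writes each equation as a row of the coefficient matrix, replaces one column at a time with the constants, and assumes $D\ne 0$):

def det3(M):
    # Determinant of a 3x3 matrix.
    (a, b, c), (d, e, f), (g, h, i) = M
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

def cramer3(A, rhs):
    # Solve the system A * (x, y, z) = rhs, assuming det(A) != 0.
    D = det3(A)
    solutions = []
    for col in range(3):
        M = [row[:] for row in A]
        for r in range(3):
            M[r][col] = rhs[r]        # replace one column with the constants
        solutions.append(det3(M) / D)
    return solutions

# x + y + z = 6,  2y + 5z = -4,  2x + 5y - z = 27  ->  x = 5, y = 3, z = -2
print(cramer3([[1, 1, 1], [0, 2, 5], [2, 5, -1]], [6, -4, 27]))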


Newton's Sums

Consider a polynomial $P(x)=a_nx^n+a_{n-1}x^{n-1}+\cdots+a_0$ of degree $n$, and let $P(x)=0$ have roots $x_1,x_2,\ldots,x_n$. Define the following sums:

  • $S_1 = x_1 + x_2 + \cdots + x_n$
  • $S_2 = x_1^2 + x_2^2 + \cdots + x_n^2$
  • $\vdots$
  • $S_k = x_1^k + x_2^k + \cdots + x_n^k$
  • $\vdots$

Then the following relations hold (a numeric check is sketched after this list):

  • $a_nS_1 + a_{n-1} = 0$
  • $a_nS_2 + a_{n-1}S_1 + 2a_{n-2}=0$
  • $a_nS_3 + a_{n-1}S_2 + a_{n-2}S_1 + 3a_{n-3}=0$
  • $\vdots$
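
As a quick numeric check (illustrative only), the Python sketch below verifies the first three relations for $P(x)=x^3-6x^2+11x-6$, whose roots are $1, 2, 3$:

roots = [1, 2, 3]
a3, a2, a1, a0 = 1, -6, 11, -6            # coefficients a_n, ..., a_0 of x^3 - 6x^2 + 11x - 6

S1 = sum(r for r in roots)
S2 = sum(r ** 2 for r in roots)
S3 = sum(r ** 3 for r in roots)

print(a3 * S1 + a2)                           # 0
print(a3 * S2 + a2 * S1 + 2 * a1)             # 0
print(a3 * S3 + a2 * S2 + a1 * S1 + 3 * a0)   # 0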

Vieta's Formulas

Let $P(x)$ be a polynomial of degree $n$ with roots $r_1,r_2,\ldots,r_n$, so $P(x)={a_n}x^n+{a_{n-1}}x^{n-1}+\cdots+{a_1}x+a_0$, where the coefficient of $x^{i}$ is ${a}_i$ and $a_n \neq 0$.

We have:

\[a_n = a_n\]
\[a_{n-1} = -a_n(r_1+r_2+\cdots+r_n)\]
\[a_{n-2} = a_n(r_1r_2+r_1r_3+\cdots+r_{n-1}r_n)\]
\[\vdots\]
\[a_0 = (-1)^n a_n r_1r_2\cdots r_n\]
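
As a quick numeric check (illustrative only), the Python sketch below recovers the coefficients of $(x-1)(x-2)(x-3)=x^3-6x^2+11x-6$ from its roots using the relations above:

r1, r2, r3 = 1, 2, 3
a3 = 1
a2 = -a3 * (r1 + r2 + r3)                   # -6
a1 = a3 * (r1 * r2 + r1 * r3 + r2 * r3)     # 11
a0 = (-1) ** 3 * a3 * (r1 * r2 * r3)        # -6
print(a3, a2, a1, a0)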


Back to page 3 | Continue to page 5