Recap: Linear Algebra

Chapter 4 Determinants

 


1. Uses of Determinants

  • Invertibility

If |A| = 0, then A is singular. If |A| ≠ 0, then A is invertible.

  • Volume

The absolute value of the determinant of A equals the volume of a box in n-dimensional space. The edges of the box come from the rows of A. The columns of A would give an entirely different box with the same volume.

  • Pivots

The determinant gives a formula for each pivot. From the formula determinant = ±(product of the pivots), it follows that, regardless of the order of elimination, the product of the pivots remains the same apart from the sign (see the numerical sketch after this list).

  • Dependence

The determinant measures the dependence of A^{-1}b on each element of b.
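A minimal numpy/scipy sketch of the invertibility and pivot facts above (the 3×3 matrix is just an arbitrary example; assumes scipy is available):

```python
import numpy as np
from scipy.linalg import lu

# An arbitrary example matrix.
A = np.array([[ 2.0,  1.0, 1.0],
              [ 4.0, -6.0, 0.0],
              [-2.0,  7.0, 2.0]])

# Invertibility: |A| != 0 means A is invertible.
print(np.linalg.det(A))                    # -16.0 (nonzero, so invertible)

# Pivots: scipy factors A = P L U, so |A| = |P| * (product of the pivots),
# where |P| = +/-1 accounts for row exchanges.
P, L, U = lu(A)
pivots = np.diag(U)
print(np.prod(pivots) * np.linalg.det(P))  # matches det(A) up to rounding
```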

2. Properties of Determinants

  • The determinant of the identity matrix is 1.
  • The determinant changes sign when two rows are exchanged.

The determinant of every permutation matrix is |P|=±1.

  • The determinant depends linearly on the first row.
$$\begin{vmatrix} a+a' & b+b' \\ c & d \end{vmatrix} = \begin{vmatrix} a & b \\ c & d \end{vmatrix} + \begin{vmatrix} a' & b' \\ c & d \end{vmatrix}$$
$$\begin{vmatrix} ta & tb \\ c & d \end{vmatrix} = t\begin{vmatrix} a & b \\ c & d \end{vmatrix}$$

 

  • If two rows of A are equal, then |A|=0.
  • Subtracting a multiple of one row from another row leaves the same determinant.
$$\begin{vmatrix} a-lc & b-ld \\ c & d \end{vmatrix} = \begin{vmatrix} a & b \\ c & d \end{vmatrix}$$

 

The usual elimination steps do not affect the determinant.

  • If A has a row of zeros, then |A| = 0.
  • *If A is triangular, then |A| is the product of the diagonal entries.
  • The determinant of AB is the product of det A and det B (checked numerically in the sketch after this list).
|A||B|=|AB|
|A^{-1}| = 1/|A|
  • The transpose of A has the same determinant as A itself.
|A| = |A^T|
  • LDU factorization.
PA=LDU
|P||A| = |L||D||U|, so |A| = ±|L||D||U| = ±|D|, since |P| = ±1 and |L| = |U| = 1 (triangular with unit diagonal).
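These properties are easy to check numerically; a minimal sketch with randomly chosen matrices:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

# Exchanging two rows flips the sign of the determinant.
A_swapped = A[[1, 0, 2], :]
print(np.isclose(np.linalg.det(A_swapped), -np.linalg.det(A)))                # True

# |AB| = |A||B|, |A^{-1}| = 1/|A|, |A^T| = |A|.
print(np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B)))  # True
print(np.isclose(np.linalg.det(np.linalg.inv(A)), 1 / np.linalg.det(A)))      # True
print(np.isclose(np.linalg.det(A.T), np.linalg.det(A)))                       # True
```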

3. Applications of Determinants

  • Computation of A^{-1}.
$$\begin{bmatrix} a & b \\ c & d \end{bmatrix}^{-1} = \frac{1}{|A|}\begin{bmatrix} C_{11} & C_{21} \\ C_{12} & C_{22} \end{bmatrix} = \frac{1}{ad-bc}\begin{bmatrix} d & -b \\ -c & a \end{bmatrix}$$
  • Cramer's Rule: $x_j = |B_j| / |A|$, where $B_j$ is A with its j-th column replaced by b (see the sketch below).
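Both applications in a short sketch for a 2×2 system (the numbers are arbitrary):

```python
import numpy as np

A = np.array([[3.0, 2.0],
              [1.0, 4.0]])          # an arbitrary invertible 2x2 matrix
b = np.array([7.0, 5.0])
detA = np.linalg.det(A)             # ad - bc = 10

# Inverse via the cofactor formula above: A^{-1} = (1/|A|) [[d, -b], [-c, a]].
a_, b_, c_, d_ = A[0, 0], A[0, 1], A[1, 0], A[1, 1]
A_inv = (1 / detA) * np.array([[ d_, -b_],
                               [-c_,  a_]])
print(np.allclose(A_inv, np.linalg.inv(A)))     # True

# Cramer's rule: x_j = |B_j| / |A|, with B_j = A with column j replaced by b.
x = np.empty(2)
for j in range(2):
    Bj = A.copy()
    Bj[:, j] = b
    x[j] = np.linalg.det(Bj) / detA
print(np.allclose(x, np.linalg.solve(A, b)))    # True
```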

Chapter 5 Eigenvalues and Eigenvectors

1. Introduction

The eigenvalues are the most important feature of practically any dynamical system. Until now we have focused on the problem Ax = b; now we consider the new problem Ax = λx. This will still be solved by simplifying a matrix, but the basic step is no longer to subtract a multiple of one row from another: elimination changes the eigenvalues.

(1) Solution of Ax=λx

This is a nonlinear equation, since both x and λ are unknown. We first discover λ:

(A − λI)x = 0.

The vector x is in the nullspace of A − λI.

The number λ is chosen so that A − λI has a nullspace.

Of course, every matrix has a nullspace, but we want a nonzero eigenvector x. The vector x = 0 satisfies Ax = λx, but it is useless in solving differential equations. We are interested only in those particular values λ for which there is a nonzero eigenvector x. That is, the nullspace of A − λI must contain vectors other than 0. In short, A − λI must be singular.

|A − λI| = 0

Each λ is associated with eigenvectors x:

(A − λI)x = 0 or Ax = λx
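A minimal sketch of this procedure (the 2×2 matrix is an arbitrary example with real eigenvalues; for a 2×2 matrix the characteristic polynomial is λ² − trace(A)·λ + |A|):

```python
import numpy as np

A = np.array([[4.0, -5.0],
              [2.0, -3.0]])        # an arbitrary 2x2 example

# Solve |A - lambda*I| = 0: for 2x2, lambda^2 - trace(A)*lambda + det(A) = 0.
lam = np.roots([1.0, -np.trace(A), np.linalg.det(A)])
print(lam)                         # eigenvalues: 2 and -1

# Each eigenvector spans the nullspace of A - lambda*I; the last right
# singular vector spans that nullspace when A - lambda*I is singular.
for l in lam:
    _, _, Vt = np.linalg.svd(A - l * np.eye(2))
    x = Vt[-1]
    print(np.allclose(A @ x, l * x))   # True: Ax = lambda*x with x != 0
```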

(2) Checks on Eigenvalues

The sum of eigenvalues equals the sum of the diagonal entries:

$$\lambda_1 + \cdots + \lambda_n = a_{11} + \cdots + a_{nn} = \mathrm{trace}(A)$$

Furthermore, the product of eigenvalues equals the determinant of A.

As example ** below shows, the diagonal entries and the eigenvalues are the same only for triangular matrices. Normally, they are completely different.
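Both checks take two lines of numpy (reusing the example matrix from the sketch above):

```python
import numpy as np

A = np.array([[4.0, -5.0], [2.0, -3.0]])
lam = np.linalg.eigvals(A)

print(np.isclose(lam.sum(),  np.trace(A)))        # True: sum of eigenvalues = trace
print(np.isclose(lam.prod(), np.linalg.det(A)))   # True: product = determinant
```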

(3) Examples

  • Everything is clear when A is a diagonal matrix:
$$A = \begin{bmatrix} 3 & 0 \\ 0 & 2 \end{bmatrix} \text{ has } \lambda_1 = 3 \text{ with } x_1 = \begin{bmatrix} 1 \\ 0 \end{bmatrix}, \quad \lambda_2 = 2 \text{ with } x_2 = \begin{bmatrix} 0 \\ 1 \end{bmatrix}.$$

A acts like a multiple of the identity on each eigenvector:

Ax1=3x1 and Ax2=2x2.

The action of A is determined by its eigenvectors and eigenvalues.

  • The eigenvalues of a projection matrix are 1 or 0.
$$P = \begin{bmatrix} \tfrac{1}{2} & \tfrac{1}{2} \\ \tfrac{1}{2} & \tfrac{1}{2} \end{bmatrix} \text{ has } \lambda_1 = 1 \text{ with } x_1 = \begin{bmatrix} 1 \\ 1 \end{bmatrix}, \quad \lambda_2 = 0 \text{ with } x_2 = \begin{bmatrix} 1 \\ -1 \end{bmatrix}.$$

A zero eigenvalue signals that A is singular (not invertible); its determinant is zero. Invertible matrices have all λ ≠ 0.

  • **The eigenvalues are on the main diagonal when A is triangular:
$$A = \begin{bmatrix} 1 & 4 & 5 \\ 0 & \tfrac{3}{4} & 6 \\ 0 & 0 & \tfrac{1}{2} \end{bmatrix}$$
$$|A - \lambda I| = \begin{vmatrix} 1-\lambda & 4 & 5 \\ 0 & \tfrac{3}{4}-\lambda & 6 \\ 0 & 0 & \tfrac{1}{2}-\lambda \end{vmatrix} = (1-\lambda)\left(\tfrac{3}{4}-\lambda\right)\left(\tfrac{1}{2}-\lambda\right)$$

This follows from property* of determinants.

The eigenvalues are λ = 1, λ = 3/4, and λ = 1/2, which are the diagonal entries of A.
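All three examples can be confirmed with numpy (the order in which eigvals returns the eigenvalues may vary):

```python
import numpy as np

examples = [np.array([[3.0, 0.0], [0.0, 2.0]]),       # diagonal
            np.array([[0.5, 0.5], [0.5, 0.5]]),       # projection
            np.array([[1.0, 4.0,  5.0],
                      [0.0, 0.75, 6.0],
                      [0.0, 0.0,  0.5]])]             # triangular

for M in examples:
    print(np.linalg.eigvals(M))
# -> [3. 2.], [1. 0.], [1. 0.75 0.5] (up to ordering)
```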

2. Diagonalization of a Matrix

(1) Eigenvectors Diagonalize a Matrix

Suppose the n×n matrix A has n linearly independent eigenvectors. If these vectors are the columns of a matrix S, then S^{-1}AS is a diagonal matrix Λ. The eigenvalues of A are on the diagonal of Λ:

$$S^{-1}AS = \Lambda = \begin{bmatrix} \lambda_1 & & & \\ & \lambda_2 & & \\ & & \ddots & \\ & & & \lambda_n \end{bmatrix}$$

We call S the "eigenvector matrix" and Λ the "eigenvalue matrix".

Proof.

$$AS = A\begin{bmatrix} | & | & & | \\ x_1 & x_2 & \cdots & x_n \\ | & | & & | \end{bmatrix} = \begin{bmatrix} | & | & & | \\ \lambda_1 x_1 & \lambda_2 x_2 & \cdots & \lambda_n x_n \\ | & | & & | \end{bmatrix}$$

We split the matrix AS into the product SΛ:

$$\begin{bmatrix} | & | & & | \\ \lambda_1 x_1 & \lambda_2 x_2 & \cdots & \lambda_n x_n \\ | & | & & | \end{bmatrix} = \begin{bmatrix} | & | & & | \\ x_1 & x_2 & \cdots & x_n \\ | & | & & | \end{bmatrix}\begin{bmatrix} \lambda_1 & & & \\ & \lambda_2 & & \\ & & \ddots & \\ & & & \lambda_n \end{bmatrix}$$

Therefore, AS = SΛ. S is invertible, because its columns are assumed to be independent, so multiplying on the left by S^{-1} gives S^{-1}AS = Λ.
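The proof is easy to replay numerically; a minimal sketch with an arbitrary diagonalizable 2×2 matrix:

```python
import numpy as np

A = np.array([[4.0, -5.0],
              [2.0, -3.0]])             # arbitrary example with distinct eigenvalues
lam, S = np.linalg.eig(A)               # columns of S are the eigenvectors

Lambda = np.linalg.inv(S) @ A @ S       # S^{-1} A S
print(np.allclose(Lambda, np.diag(lam)))   # True: the result is diag(lambda_i)
```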

(2) Remarks

  • If a matrix has no repeated eigenvalues, then its eigenvectors are automatically independent.

Any matrix with distinct eigenvalues can be diagonalized.

  • The diagonalizing matrix S is not unique.
  • The order of the eigenvectors in S and the eigenvalues in Λ is automatically the same.
  • Not all matrices possess n linearly independent eigenvectors.

Not all matrices are diagonalizable.

Diagonalizability of A depends on having enough eigenvectors (n independent eigenvectors).

Invertibility of A depends on having nonzero eigenvalues (no zero eigenvalues).

  • There is no connection between diagonalizability and invertibility.
  • Diagonalization can fail only if there are repeated eigenvalues.

But it does not always fail. A=I has repeated eigenvalues 1,1,...,1 but it is already diagonal.

3. Powers and Products

  • The matrix A^k has the same eigenvectors as A, and eigenvalues λ_1^k, λ_2^k, ..., λ_n^k (checked in the sketch after this list):
$$A^2 x = A(\lambda x) = \lambda(Ax) = \lambda^2 x.$$ (careful: λ is a number, so it moves past A)
  • If A is invertible, the eigenvalues of A^{-1} are 1/λ_i.
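A quick check of both facts (reusing the 2×2 example from the earlier sketches):

```python
import numpy as np

A = np.array([[4.0, -5.0], [2.0, -3.0]])
lam, S = np.linalg.eig(A)               # A S = S diag(lam)

# A^2 keeps the eigenvectors and squares the eigenvalues: A^2 S = S diag(lam^2).
print(np.allclose(A @ A @ S, S @ np.diag(lam**2)))                # True

# A^{-1} keeps the eigenvectors and inverts the eigenvalues (all lam != 0 here).
print(np.allclose(np.linalg.inv(A) @ S, S @ np.diag(1.0 / lam)))  # True
```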

Chapter 6 Positive Definite Matrices

1. Minima, Maxima and Saddle Points

 
