# MATH141

## 1.1 Systems of Linear Equations

• A system of linear equations can be represented by an augmented matrix.
• An elementary row operation is one of the following:
  • Interchanging two rows.
  • Multiplying a row by a nonzero constant.
  • Adding a multiple of one row to another row.
• A system of linear equations is consistent if it has at least one solution.
• A system of linear equations is inconsistent if it has no solution.
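As a minimal sketch (assuming NumPy is available), a small system can be stored as an augmented matrix and, when the coefficient matrix is invertible, solved directly; the specific system shown is illustrative:

```python
import numpy as np

# Augmented matrix [A | b] for the system:
#    x + 2y = 5
#   3x + 4y = 6
aug = np.array([[1.0, 2.0, 5.0],
                [3.0, 4.0, 6.0]])

A, b = aug[:, :2], aug[:, 2]
x = np.linalg.solve(A, b)  # unique solution, so the system is consistent
```

If `A` were singular and `b` not in its column space, the system would be inconsistent and `solve` would fail instead.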

## 1.2 Row Reduction and Echelon Forms

• A matrix is in row echelon form if:
  • All rows that contain only zeros are at the bottom of the matrix.
  • The first nonzero entry in each nonzero row is called a pivot.
  • The pivot in each row is to the right of the pivot in the row above it.
• A matrix is in reduced row echelon form if:
  • It is in row echelon form.
  • Each pivot is 1.
  • Each pivot is the only nonzero entry in its column.
• Every augmented matrix is row equivalent to a unique matrix in reduced row echelon form, which represents an equivalent system of linear equations.
• A pivot column in an augmented matrix corresponds to a leading variable in the system of linear equations.
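A quick way to inspect the reduced row echelon form and pivot columns is SymPy's `Matrix.rref()` (a sketch, assuming SymPy is installed; the matrix is a made-up example with one free variable):

```python
from sympy import Matrix

# Augmented matrix for the system:
#   x + 2y + 3z = 4
#       2y + 2z = 2
M = Matrix([[1, 2, 3, 4],
            [0, 2, 2, 2]])

rref_M, pivot_cols = M.rref()
# pivot_cols lists the indices of the pivot columns; variable
# columns not listed correspond to free variables.
```

Here columns 0 and 1 are pivot columns (x and y are leading variables) and z is free.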

## 1.3 Vector Equations

• A vector in n-dimensional space is an ordered list of n real numbers.
• The zero vector is the vector consisting of all zeros.
• Scalar multiplication of a vector by a scalar k multiplies each component of the vector by k.
• The sum of two vectors is a vector obtained by adding the corresponding components of the vectors.
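These componentwise operations can be sketched with NumPy arrays (the particular vectors are arbitrary):

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])

s = u + v           # componentwise sum of two vectors
k = 2.0 * u         # scalar multiplication scales each component
zero = np.zeros(3)  # the zero vector in R^3
```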

## 1.4 The Matrix Equation Ax = b

• A matrix is a rectangular array of numbers.
• The product of a matrix A and a vector x is a linear combination of the columns of A.
• The system of linear equations Ax = b is consistent if and only if b is a linear combination of the columns of A.
• The inverse of a square matrix A is a matrix A^-1 such that AA^-1 = A^-1A = I, where I is the identity matrix.
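The second bullet can be checked numerically: Ax really is the linear combination of the columns of A with the entries of x as weights (a sketch with an arbitrary 3×2 matrix):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [2.0, 1.0],
              [0.0, 3.0]])
x = np.array([2.0, -1.0])

# A @ x equals x[0] * (column 0 of A) + x[1] * (column 1 of A)
combo = x[0] * A[:, 0] + x[1] * A[:, 1]
```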

## 2.1 Matrix Operations

• The sum of two matrices of the same size is a matrix obtained by adding the corresponding entries of the matrices.
• Scalar multiplication of a matrix by a scalar k multiplies each entry of the matrix by k.
• The product of two matrices A and B is a matrix obtained by multiplying the rows of A by the columns of B.
• The associative property holds for matrix multiplication: (AB)C = A(BC).
• The distributive property holds for matrix multiplication: (A+B)C = AC+BC.
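The associative and distributive properties can be spot-checked numerically; note that commutativity does *not* hold in general (a sketch with arbitrary 2×2 matrices):

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[0.0, 1.0], [1.0, 0.0]])
C = np.array([[2.0, 0.0], [0.0, 2.0]])

left_assoc = (A @ B) @ C   # associativity: equals A @ (B @ C)
right_assoc = A @ (B @ C)
dist = (A + B) @ C         # distributivity: equals A @ C + B @ C
```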

## 2.2 Matrix Algebra

• The transpose of a matrix A is the matrix A^T obtained by interchanging the rows and columns of A.
• A symmetric matrix is a matrix A such that A^T = A.
• A skew-symmetric matrix is a matrix A such that A^T = -A.
• The determinant of a 2×2 matrix A with first row (a, b) and second row (c, d) is det(A) = ad – bc.
• The determinant of an n x n matrix A can be computed by cofactor expansion along any row or column.
• A matrix A is invertible if and only if det(A) ≠ 0.
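Several of these facts can be illustrated at once (a sketch using NumPy; A + A^T is always symmetric and A – A^T always skew-symmetric, which gives easy test cases):

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])

At = A.T
d = np.linalg.det(A)     # ad - bc = 1*4 - 2*3 = -2
Ainv = np.linalg.inv(A)  # exists because det(A) != 0

S = A + A.T              # symmetric: S^T = S
K = A - A.T              # skew-symmetric: K^T = -K
```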

## 2.3 Elementary Matrices and a Method for Finding A^-1

• An elementary matrix is a square matrix obtained from the identity matrix by performing a single elementary row operation.
• If E is an elementary matrix, then EA performs the same row operation on A as E performs on I.
• If A can be transformed into I by a sequence of elementary row operations with corresponding elementary matrices E1, E2, …, Ek, then A^-1 = Ek···E2E1, the product of those matrices in the order the operations are applied.
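A sketch of this method on a specific 2×2 matrix (the row operations below are the ones that reduce this particular A to I; they are worked out by hand for the example):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# Elementary matrices encoding the row operations reducing A to I:
E1 = np.array([[1.0, 0.0], [-3.0, 1.0]])  # R2 <- R2 - 3*R1
E2 = np.array([[1.0, 0.0], [0.0, -0.5]])  # R2 <- (-1/2)*R2
E3 = np.array([[1.0, -2.0], [0.0, 1.0]])  # R1 <- R1 - 2*R2

Ainv = E3 @ E2 @ E1  # product in the order the operations were applied
```

Left-multiplying A by each E in turn reproduces the row reduction, so the accumulated product is A^-1.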

## 3.2 Diagonalization

• A matrix A is diagonalizable if there exists an invertible matrix P and a diagonal matrix D such that A = PDP^-1.
• If A is diagonalizable, then D has the eigenvalues of A on its diagonal and P has the corresponding eigenvectors as its columns.
• An n × n matrix A is diagonalizable if and only if it has n linearly independent eigenvectors.
• Equivalently, A is diagonalizable if and only if there exists a basis of R^n consisting of eigenvectors of A.
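A sketch of the factorization A = PDP^-1 using `np.linalg.eig` (the matrix chosen has distinct eigenvalues 2 and 5, so its eigenvectors are independent and P is invertible):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigvals, P = np.linalg.eig(A)  # columns of P are eigenvectors of A
D = np.diag(eigvals)           # eigenvalues on the diagonal of D

A_reconstructed = P @ D @ np.linalg.inv(P)
```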

## 4.1 Vector Spaces and Subspaces

• A vector space is a set of vectors together with addition and scalar multiplication operations satisfying the vector space axioms; in particular, it is closed under both operations.
• A subspace of a vector space is a subset that is itself a vector space under the same operations.
• The span of a set of vectors is the set of all linear combinations of the vectors.
• A set of vectors is linearly independent if no vector in the set can be expressed as a linear combination of the others.
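Linear independence can be tested numerically: a set of vectors is independent exactly when the matrix with those vectors as columns has rank equal to the number of vectors (a sketch; the third vector below is deliberately the sum of the first two):

```python
import numpy as np

# Columns of V are the vectors being tested
V = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 0.0]])  # column 3 = column 1 + column 2

rank = np.linalg.matrix_rank(V)
independent = (rank == V.shape[1])  # independent iff rank = number of vectors
```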

## 4.2 Null Spaces, Column Spaces, and Linear Transformations

• The null space of a matrix A is the set of all solutions x to the homogeneous equation Ax = 0.
• The column space of a matrix A is the span of its columns.
• The rank of a matrix A is the dimension of its column space.
• A linear transformation T from R^n to R^m is a function that preserves vector addition and scalar multiplication.
• The matrix of a linear transformation T with respect to the standard bases of R^n and R^m is the matrix whose columns are the images of the basis vectors.
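A null space basis can be extracted from the SVD: the right singular vectors whose singular values are zero span the null space (a sketch; the example matrix has rank 1 because its second row is twice the first):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])  # rank 1: row 2 = 2 * row 1

rank = np.linalg.matrix_rank(A)

# Rows of Vt beyond the rank span the null space of A.
_, s, Vt = np.linalg.svd(A)
null_basis = Vt[rank:].T  # columns span the null space of A
```

The rank-nullity theorem is visible here: rank (1) plus the dimension of the null space (2) equals the number of columns (3).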

## 5.1 Inner Product, Length, and Orthogonality

• An inner product on a vector space V is a function that takes two vectors in V and returns a scalar, and that is symmetric, linear in each argument, and positive definite.
• The dot product of two vectors in R^n is an example of an inner product.
• The length of a vector x is ||x|| = sqrt(x1^2 + x2^2 + … + xn^2).
• Two vectors are orthogonal if their dot product is zero.
• An orthogonal set of vectors is a set of vectors in which every pair is orthogonal.
• An orthonormal set of vectors is an orthogonal set of unit vectors.
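A sketch of length and orthogonality with the dot product (the vectors are chosen so the length and orthogonality are easy to see):

```python
import numpy as np

x = np.array([3.0, 4.0])
y = np.array([-4.0, 3.0])

length = np.linalg.norm(x)  # sqrt(3^2 + 4^2) = 5
dot = np.dot(x, y)          # 0, so x and y are orthogonal
```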

## 5.2 Orthogonal Sets

• Gram-Schmidt Process: Given a basis {v1, v2, …, vn} for a subspace of R^n, we can construct an orthonormal basis {u1, u2, …, un} using the following steps:
  • Set w1 = v1.
  • For i = 2 to n, let wi = vi – proj_{w1}(vi) – proj_{w2}(vi) – … – proj_{wi-1}(vi), where proj_{w}(v) = ((v · w)/(w · w))w.
  • Normalize: set ui = wi/||wi|| for each i to obtain an orthonormal set.
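The steps above can be sketched as a short function (this variant normalizes each vector as it goes, so the projection onto a unit vector u simplifies to (w · u)u; the input vectors are an arbitrary example):

```python
import numpy as np

def gram_schmidt(vectors):
    """Return an orthonormal basis (as rows) for the span of `vectors`."""
    basis = []
    for v in vectors:
        w = v.astype(float).copy()
        for u in basis:            # subtract projections onto earlier u's
            w -= np.dot(w, u) * u  # u is unit length, so proj_u(w) = (w . u) u
        basis.append(w / np.linalg.norm(w))
    return np.array(basis)

Q = gram_schmidt([np.array([1.0, 1.0, 0.0]),
                  np.array([1.0, 0.0, 1.0])])
```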

## 6.1 The Gram-Schmidt Process and Orthogonal Bases

• An orthogonal basis for a subspace is a basis that is also an orthogonal set.
• Any linearly independent set of vectors can be converted into an orthogonal basis for its span by applying the Gram-Schmidt process.
• The orthogonal complement of a subspace V of R^n is the set V⊥ = {x ∈ R^n : x · v = 0 for all v ∈ V}.
• The orthogonal complement of a subspace is itself a subspace.
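If V is spanned by the columns of a matrix A, then V⊥ is the null space of A^T, which can be computed from the SVD (a sketch; the example plane is spanned by e1 and e2, so V⊥ should be the e3-axis):

```python
import numpy as np

# V is spanned by the columns of A; V-perp equals the null space of A^T.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])

rank = np.linalg.matrix_rank(A.T)
_, s, Vt = np.linalg.svd(A.T)
perp_basis = Vt[rank:].T  # columns span the orthogonal complement of V
```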

## 6.2 Projections

• The projection of a vector x onto a nonzero vector u is proj_{u}(x) = ((x · u)/(u · u))u.
• The projection of a vector x onto a subspace V with orthonormal basis {u1, u2, …, um} is proj_{V}(x) = (x · u1)u1 + (x · u2)u2 + … + (x · um)um.
• The projection matrix onto V is P = QQ^T, where Q = [u1 | u2 | … | um] is the matrix with the orthonormal basis vectors as its columns.
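A sketch of the projection matrix for the xy-plane in R^3 (the orthonormal basis is e1, e2, so the projection should simply zero out the third coordinate):

```python
import numpy as np

# Columns of Q are an orthonormal basis for a plane V in R^3
Q = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])

P = Q @ Q.T             # projection matrix onto V
x = np.array([3.0, 4.0, 5.0])
proj = P @ x            # projection of x onto V
```

Projection matrices are symmetric and idempotent (P^2 = P), which the assertions below check.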

## 7.1 Orthogonal Diagonalization

• A matrix A is orthogonally diagonalizable if there exists an orthogonal matrix Q (so Q^-1 = Q^T) and a diagonal matrix D such that A = QDQ^T.
• If A is orthogonally diagonalizable, then the columns of Q are an orthonormal basis of R^n consisting of eigenvectors of A.
• If A is a real symmetric matrix, then A is orthogonally diagonalizable.
• If A is an n x n matrix with n distinct eigenvalues, then A is diagonalizable, but not necessarily *orthogonally* diagonalizable: for a real matrix, orthogonal diagonalizability requires A to be symmetric.
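For a real symmetric matrix, `np.linalg.eigh` returns real eigenvalues and an orthogonal eigenvector matrix, giving the factorization A = QDQ^T directly (a sketch on a small symmetric example with eigenvalues 1 and 3):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])  # real symmetric

eigvals, Q = np.linalg.eigh(A)  # Q is orthogonal for symmetric input
D = np.diag(eigvals)

A_reconstructed = Q @ D @ Q.T   # Q^T plays the role of Q^-1
```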