
linear equation

An equation that can be written as a1x1 + a2x2 + ... + anxn = b, where b and the coefficients a1, a2, ..., an are real or complex numbers known in advance

consistent system

Has one or infinitely many solutions

inconsistent system

Has no solution

leading entry

Leftmost non-zero entry in a non-zero row

Echelon form

1. All nonzero rows are above any rows of all zeros; 2. Each leading entry of a row is in a column to the right of the leading entry of the row above it; 3. All entries in a column below a leading entry are zeros

Reduced Echelon Form

Same as echelon form, plus: all leading entries are 1, and each leading 1 is the only nonzero entry in its column; every matrix has exactly one reduced echelon form
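
A quick way to see a reduced echelon form in practice is SymPy's rref(), which returns both the reduced matrix and the pivot columns (the matrix below is just an illustrative example):

```python
import sympy as sp

# A 3x4 matrix; rref() returns the reduced echelon form and the
# indices of the pivot columns.
A = sp.Matrix([
    [1, 2, 3, 4],
    [2, 4, 6, 8],   # a multiple of row 1, so it reduces to a zero row
    [1, 1, 1, 1],
])
R, pivot_cols = A.rref()
print(R)           # reduced echelon form: leading 1s alone in their columns
print(pivot_cols)  # (0, 1)
```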

Span

the collection of all vectors in R^n that can be written as c1v1 + c2v2 + ... + cpvp, where c1, c2, ..., cp are scalars (weights)

Ax = b

For an m x n matrix A, the following are logically equivalent: 1. For each b in R^m, Ax = b has a solution; 2. Each b in R^m is a linear combination of the columns of A; 3. The columns of A span R^m; 4. A has a pivot position in every row
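
Statement 4 can be checked numerically: A has a pivot in every row exactly when its rank equals its number of rows (illustrative matrix below):

```python
import numpy as np

# rank(A) == number of rows  <=>  a pivot in every row
# <=>  Ax = b is consistent for every b (columns span R^3)
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 0.0]])
spans = np.linalg.matrix_rank(A) == A.shape[0]
print(spans)
```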

pivot position

A position in the original matrix that corresponds to a leading 1 in a reduced echelon matrix

pivot column

A column that contains a pivot position

homogeneous

A system that can be written as Ax = 0; the x = 0 solution is a TRIVIAL solution

independent

A set of vectors {v1, ..., vp} is linearly independent if the vector equation c1v1 + ... + cpvp = 0 has only the trivial solution; the columns of A are independent if Ax = 0 has only the trivial solution

dependent

A set is linearly dependent if weights not all zero exist satisfying c1v1 + ... + cpvp = 0; a set is automatically dependent if it contains more vectors than there are entries in each vector

transformation

assigns each vector x in R^n a vector T(x) in R^m

Matrix multiplication warnings

1. In general, AB != BA; 2. If AB = AC, B does not necessarily equal C; 3. If AB = 0, it cannot be concluded that either A or B equals 0
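
All three warnings are easy to demonstrate with small matrices (the examples below are illustrative):

```python
import numpy as np

A = np.array([[1, 1],
              [0, 1]])
B = np.array([[1, 0],
              [1, 1]])
print(A @ B)   # [[2, 1], [1, 1]]
print(B @ A)   # [[1, 1], [1, 2]]  -- AB != BA

# AB = 0 with neither factor zero:
C = np.array([[0, 1],
              [0, 0]])
print(C @ C)   # the zero matrix, yet C != 0
```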

Transposition

flips rows and columns

Properties of transposition

1. (A^T)^T = A; 2. (A + B)^T = A^T + B^T; 3. (rA)^T = r(A^T); 4. (AB)^T = B^T A^T
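
Property 4 (the order of the factors reverses) can be verified numerically with any conformable matrices, e.g.:

```python
import numpy as np

A = np.array([[1, 2, 0],
              [3, 1, 4]])      # 2x3
B = np.array([[2, 1],
              [0, 1],
              [1, 3]])         # 3x2

# (AB)^T = B^T A^T: transpose a product by reversing the order
lhs = (A @ B).T
rhs = B.T @ A.T
print(np.array_equal(lhs, rhs))
```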

Invertibility rules

1. If A is invertible, (A^-1)^-1 = A; 2. (AB)^-1 = B^-1 * A^-1; 3. (A^T)^-1 = (A^-1)^T
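
Rule 2 mirrors the transpose rule: the inverse of a product reverses the order. A numerical check with illustrative invertible matrices:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
B = np.array([[1.0, 2.0],
              [0.0, 1.0]])

# (AB)^-1 = B^-1 A^-1 (order reverses)
lhs = np.linalg.inv(A @ B)
rhs = np.linalg.inv(B) @ np.linalg.inv(A)
print(np.allclose(lhs, rhs))
```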

Invertible Matrix Theorem (for a square n x n matrix A, the statements are either all true or all false)

A is invertible; A is row equivalent to I; A has n pivot positions; Ax = 0 has only the trivial solution; the columns of A form a linearly independent set; the transformation x --> Ax is one-to-one; Ax = b has at least one solution for each b in R^n; the columns of A span R^n; x --> Ax maps R^n onto R^n; there is an n x n matrix C such that CA = I; there is an n x n matrix D such that AD = I; A^T is invertible; the columns of A form a basis of R^n; Col A = R^n; dim Col A = n; rank A = n; Nul A = {0}; dim Nul A = 0

Column Row Expansion of AB

AB = col1(A)row1(B) + col2(A)row2(B) + ... + coln(A)rown(B), a sum of outer products
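
The expansion says AB is the sum of the outer products of matching columns of A and rows of B; this is easy to confirm numerically (illustrative matrices):

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4],
              [5, 6]])        # 3x2
B = np.array([[1, 0, 1],
              [2, 1, 0]])     # 2x3

# Sum of outer products col_k(A) * row_k(B) over k
expansion = sum(np.outer(A[:, k], B[k, :]) for k in range(A.shape[1]))
print(np.array_equal(expansion, A @ B))  # True
```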

LU Factorization

1. Factor A = LU, where L is unit lower triangular and U is an echelon form of A; 2. Reduce A to echelon form U by row replacements, placing in L the multipliers that, by the same steps, would reduce L to I; 3. To solve Ax = b, solve Ly = b (forward substitution), then Ux = y (back substitution)
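
SciPy can produce the factorization; the two triangular solves then follow the steps above (SciPy's lu() also returns a permutation matrix P with A = PLU to account for row interchanges; the system here is illustrative):

```python
import numpy as np
from scipy.linalg import lu, solve_triangular

A = np.array([[4.0, 3.0],
              [6.0, 3.0]])
b = np.array([10.0, 12.0])

P, L, U = lu(A)                                # A = P L U
y = solve_triangular(L, P.T @ b, lower=True)   # forward-solve  Ly = P^T b
x = solve_triangular(U, y)                     # back-solve     Ux = y
print(np.allclose(A @ x, b))
```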

Leontief input-output model

x = Cx + d, where x is total production, C is the consumption matrix, and d is final demand; equivalently, solve (I - C)x = d
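
Solving the model is one linear solve of (I - C)x = d (the consumption matrix and demand below are made-up numbers for illustration):

```python
import numpy as np

# Illustrative consumption matrix C and final demand d;
# production x satisfies x = Cx + d, i.e. (I - C)x = d.
C = np.array([[0.5, 0.4],
              [0.2, 0.3]])
d = np.array([50.0, 30.0])
x = np.linalg.solve(np.eye(2) - C, d)
print(np.allclose(C @ x + d, x))
```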

Subspaces

1. The zero vector is in H; 2. For u and v in H, u + v is also in H; 3. For u in H and any scalar c, cu is also in H

Column space

Set of all the linear combinations of the columns of A

Null space

Set of all solutions to Ax = 0
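
SymPy's nullspace() returns a basis for Nul A directly (illustrative rank-1 matrix below, so the null space is 2-dimensional):

```python
import sympy as sp

A = sp.Matrix([[1, 2, 3],
               [2, 4, 6]])   # rank 1, so dim Nul A = 3 - 1 = 2
basis = A.nullspace()
print(len(basis))            # 2 basis vectors
for v in basis:
    print(A * v)             # each maps to the zero vector
```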

Basis

A linearly independent set in H that spans H; the pivot columns of A form a basis for A's column space

Dimension

The number of vectors in any basis of H; the zero subspace's dimension is 0

rank

The dimension of the column space
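
Since rank A = dim Col A and rank A + dim Nul A = n (the number of columns), both dimensions fall out of a single rank computation (illustrative matrix):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],   # a multiple of row 1
              [0.0, 1.0, 1.0]])
rank = np.linalg.matrix_rank(A)
n = A.shape[1]
print(rank, n - rank)   # rank A = dim Col A, and n - rank = dim Nul A
```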

one-to-one

A transformation T where each b in R^m is the image of at most one x in R^n; equivalently, T(x) = 0 has only the trivial solution; the standard matrix has a pivot in every column

onto

A transformation T where each b in R^m is the image of at least one x in R^n; Ax = b is consistent for every b; the standard matrix has a pivot in every row

inner product

a matrix product u^T v, also written u . v, where u and v are vectors in R^n; if u . v = 0, u and v are orthogonal

orthogonal complement

1. x is in W^⊥ (the orthogonal complement of W) if x is orthogonal to every vector in W, equivalently to every vector in any set that spans W; 2. W^⊥ is a subspace of R^n

orthogonal set

A set of nonzero vectors where ui . uj = 0 whenever i != j; such a set S is linearly independent and hence a basis for the subspace it spans
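
Checking orthogonality means checking all pairwise dot products (the three vectors below are an illustrative orthogonal set in R^3):

```python
import numpy as np

u1 = np.array([3.0, 1.0, 1.0])
u2 = np.array([-1.0, 2.0, 1.0])
u3 = np.array([-0.5, -2.0, 3.5])

# All pairwise dot products are zero, so {u1, u2, u3} is an
# orthogonal set (hence independent and a basis for its span).
print(u1 @ u2, u1 @ u3, u2 @ u3)
```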

orthonormal

An orthogonal set of unit vectors