Matrices
Last updated
Linear algebra is the area of mathematics concerned primarily with vector spaces and the linear transformations between them.
For more details see https://brilliant.org/wiki/matrices/.
Augmented matrix = you tag something onto the matrix. For Ax = b, when you add the vector b, A becomes the augmented matrix [A | b].
You want to change A into U, where U is upper triangular (0s in the bottom triangle), because it is easier to solve.
(Multiplying the diagonal entries of U gives you the determinant.)
Back substitution: after applying to the vector b the same row operations you did to the matrix A, you solve the resulting triangular system from the bottom row up.
Once you've done the elimination, A becomes U and b becomes c: Ax = b -> Ux = c.
Your final system will therefore be:
x + 2y + z = 2,
2y - 2z = 6,
5z = -10
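The triangular system above can be solved from the bottom row up; a minimal back-substitution sketch in NumPy, with U and c taken from the system above:

```python
import numpy as np

# Upper-triangular system from the example: Ux = c
U = np.array([[1.0, 2.0,  1.0],
              [0.0, 2.0, -2.0],
              [0.0, 0.0,  5.0]])
c = np.array([2.0, 6.0, -10.0])

def back_substitution(U, c):
    """Solve Ux = c for upper-triangular U, starting from the bottom row."""
    n = len(c)
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        # Subtract the already-known terms, then divide by the pivot.
        x[i] = (c[i] - U[i, i + 1:] @ x[i + 1:]) / U[i, i]
    return x

x = back_substitution(U, c)
print(x)  # x = 2, y = 1, z = -2
```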
The coefficients form the coefficient matrix; (x, y, z) is the column vector of unknowns.
Ax = v. We are looking for a vector x which, after the transformation A, equals v.
How can you solve this? You can multiply each side by the inverse matrix to get x = A⁻¹v.
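A quick numerical sketch of this; the matrix A and vector v below are hypothetical examples:

```python
import numpy as np

# Hypothetical 2x2 transformation A and target vector v.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
v = np.array([5.0, 10.0])

# Multiply by the inverse: x = A^-1 v.
x = np.linalg.inv(A) @ v
print(x)  # [1. 3.] -- check: A @ [1, 3] = [5, 10]

# In practice, np.linalg.solve is preferred (no explicit inverse needed).
x2 = np.linalg.solve(A, v)
```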
Let's finish the topic of elimination.
Ex: 2 x 2 elimination. Let A be a matrix as below where I can do elimination with no row exchanges. I want to get from A to U to solve Ax = b. But then I want to know how A is related to U: there is a matrix L with A = LU. How do you get there?
First, to reduce A to U (which is then easy to solve), I multiply by the elementary matrix E(2,1), because that is how I get a 0 in position (2,1). Therefore we get:
Then, to get A = LU, you multiply by the inverse of E(2,1), which becomes L (the -4 becomes 4).
Where L is the Lower triangle, and U is the Upper triangle.
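The E(2,1) step and the A = LU factorization can be checked numerically; the 2x2 matrix A below is a hypothetical example whose row-2 multiplier is 4, matching the "-4 becomes 4" remark above:

```python
import numpy as np

# Hypothetical 2x2 matrix; the multiplier for row 2 is 4.
A = np.array([[1.0, 2.0],
              [4.0, 9.0]])

# E(2,1) subtracts 4 * row 1 from row 2, putting a 0 in position (2,1).
E21 = np.array([[ 1.0, 0.0],
                [-4.0, 1.0]])
U = E21 @ A            # upper triangular

# L is the inverse of E(2,1): the -4 becomes +4.
L = np.linalg.inv(E21)
print(L)               # [[1, 0], [4, 1]]
print(np.allclose(L @ U, A))  # True: A = LU
```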
The factor by which a linear transformation scales any area is called the determinant of that transformation. A determinant can also be negative, which means the transformation inverts the orientation of space.
Formally, the determinant is a function det from the set of square matrices to the set of real numbers that satisfies 3 important properties: det(I) = 1; the determinant changes sign when two rows are exchanged; and the determinant is linear in each row separately.
Unfortunately, these calculations can get quite tedious; already for 3x3 matrices, the formula is too long to memorize in practice.
Example:
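A quick numerical example in NumPy; the 3x3 matrix M below is a hypothetical example:

```python
import numpy as np

# Hypothetical 3x3 matrix to illustrate the determinant.
M = np.array([[1.0, 2.0, 0.0],
              [3.0, 1.0, 2.0],
              [0.0, 1.0, 1.0]])

d = np.linalg.det(M)
print(d)  # ~ -7.0: M scales volumes by 7 and flips the orientation of space
```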
The following is true for square matrices.
A square matrix A won't have an inverse if we can find a nonzero vector x with Ax = 0 (equivalently, if det A = 0). Why? If A⁻¹ existed, we could multiply both sides of Ax = 0 by A⁻¹, leaving x = 0, which contradicts x being nonzero. Therefore, some matrices don't have inverses.
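This can be checked numerically; the singular matrix S below is a hypothetical example (row 2 is twice row 1):

```python
import numpy as np

# Hypothetical singular matrix: row 2 is twice row 1.
S = np.array([[1.0, 2.0],
              [2.0, 4.0]])

print(np.linalg.det(S))  # 0.0, so S has no inverse

# The nonzero vector x = (2, -1) satisfies Sx = 0.
x = np.array([2.0, -1.0])
print(S @ x)             # [0. 0.]
```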
Why Do We Need an Inverse?
Because with matrices we don't divide! Seriously, there is no concept of dividing by a matrix. But we can multiply by an inverse, which achieves the same thing.
Say we want to find matrix X, and we know matrix A and B:
XA = B
It would be nice to divide both sides by A (to get X=B/A), but remember we can't divide.
But what if we multiply both sides by A⁻¹?
XAA⁻¹ = BA⁻¹
And we know that AA⁻¹ = I, so:
XI = BA⁻¹
We can remove I (for the same reason we can remove "1" from 1x = ab for numbers):
X = BA⁻¹
And we have our answer (assuming we can calculate A⁻¹).
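A quick numerical check of the steps above; the matrices A and B below are hypothetical examples:

```python
import numpy as np

# Hypothetical matrices: A (invertible) and B.
A = np.array([[2.0, 0.0],
              [1.0, 1.0]])
B = np.array([[4.0, 2.0],
              [3.0, 1.0]])

# X = B A^-1 -- note the order: the inverse goes on the right, since XA = B.
X = B @ np.linalg.inv(A)
print(np.allclose(X @ A, B))  # True
```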
Inverse of bigger matrices: see the Gauss-Jordan method. https://www.mathsisfun.com/algebra/matrix-inverse-row-operations-gauss-jordan.html
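A minimal sketch of the Gauss-Jordan idea: row-reduce the augmented matrix [A | I] until the left half becomes I, at which point the right half is A⁻¹. The matrix A below is a hypothetical example.

```python
import numpy as np

def gauss_jordan_inverse(A):
    """Invert A by row-reducing [A | I] to [I | A^-1]."""
    n = len(A)
    M = np.hstack([A.astype(float), np.eye(n)])
    for i in range(n):
        # Swap in the row with the largest pivot (partial pivoting).
        p = i + np.argmax(np.abs(M[i:, i]))
        M[[i, p]] = M[[p, i]]
        M[i] /= M[i, i]                 # scale the pivot row so the pivot is 1
        for j in range(n):
            if j != i:
                M[j] -= M[j, i] * M[i]  # clear the rest of the column
    return M[:, n:]

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
print(gauss_jordan_inverse(A))  # matches np.linalg.inv(A)
```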