Matrices


Linear algebra is an area of study in mathematics that concerns itself primarily with the study of vector spaces and the linear transformations between them.

For more details see https://brilliant.org/wiki/matrices/.

Augmented matrix = you tack something onto the matrix. For Ax = b: when you add the vector b as an extra column, A becomes the augmented matrix [A | b].

Solving a system of equations by elimination

You want to change A into U, where U is upper triangular (0s in the bottom triangle), because such a system is easier to solve.

(Multiplying the diagonal entries of U, the pivots, gives you the determinant.)

Back substitution: once you've applied to the vector b the same row operations you applied to the matrix A, you solve the triangular system from the bottom row upward.

Once you've done the elimination, A becomes U and b becomes c. Ax = b -> Ux = c.
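(The worked matrices here were images in the original. The numbers below are an assumption, chosen so that elimination produces exactly the final system that follows; they match the classic example from Strang's lectures.)

A = [[1, 2, 1], [3, 8, 1], [0, 4, 1]], b = (2, 12, 2)

Step 1: row 2 - 3 x row 1 gives (0, 2, -2 | 6).
Step 2: row 3 - 2 x row 2 gives (0, 0, 5 | -10).

U = [[1, 2, 1], [0, 2, -2], [0, 0, 5]], c = (2, 6, -10)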

Your final system will therefore be:

x + 2y + z = 2

2y - 2z = 6

5z = -10
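Back substitution then works from the bottom equation up:

5z = -10 -> z = -2

2y - 2(-2) = 6 -> 2y = 2 -> y = 1

x + 2(1) + (-2) = 2 -> x = 2

So (x, y, z) = (2, 1, -2).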

Coefficient matrix

The coefficients of the system form the coefficient matrix A; the unknowns (x, y, z) form a column vector.

Ax = v. We are looking for a vector x which, after the transformation A, equals v.

How can you solve this? You can multiply each side by the inverse matrix to get x = A⁻¹v.
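This isn't in the original notes, but a minimal NumPy sketch makes the recipe concrete (reusing the assumed example system from above):

```python
import numpy as np

# Assumed example system from the elimination section above.
A = np.array([[1.0, 2.0, 1.0],
              [3.0, 8.0, 1.0],
              [0.0, 4.0, 1.0]])
v = np.array([2.0, 12.0, 2.0])

x = np.linalg.inv(A) @ v   # x = A^-1 v, literally as in the text
print(x)                   # [ 2.  1. -2.]

x = np.linalg.solve(A, v)  # preferred in practice: no explicit inverse
print(x)                   # same answer
```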

Identity Matrix

Trace

Transpose
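The original illustrated these three ideas with images. As a stand-in, a minimal NumPy sketch:

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])

I = np.eye(2)                 # identity matrix: 1s on the diagonal, IA = AI = A
print(np.allclose(I @ A, A))  # True

print(np.trace(A))            # trace: sum of the diagonal entries -> 1 + 4 = 5

print(A.T)                    # transpose: flip rows and columns -> [[1, 3], [2, 4]]
```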


Let's finish the topic of elimination.

Ex: 2 x 2 elimination. Let A be a matrix as below where I can do elimination with no row exchanges. I want to get from A to U to solve A. But then I want to know how A is related to U: there is a matrix L with A = LU. How do you get there?

First, to get from A to U (which is then easy to solve), I multiply by my elementary matrix for position (2, 1), written E(2,1), because that is how I get a 0 in position (2, 1). Therefore we get:

Then to get A = LU: you multiply by the inverse of E(2,1), which becomes L (the -4 becomes 4).

Where L is the Lower triangular matrix, and U is the Upper triangular matrix.
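The matrices were images in the original. Assuming the standard 2 x 2 example (A = [[2, 1], [8, 7]], consistent with the "-4 becomes 4" remark), a NumPy sketch:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [8.0, 7.0]])

# E(2,1) subtracts 4 x row 1 from row 2, putting a 0 in position (2, 1).
E21 = np.array([[ 1.0, 0.0],
                [-4.0, 1.0]])

U = E21 @ A             # upper triangular: [[2, 1], [0, 3]]
L = np.linalg.inv(E21)  # the -4 becomes 4: [[1, 0], [4, 1]]

print(np.allclose(L @ U, A))  # True: A = LU
```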

Determinant

The factor by which a linear transformation changes any area is called the determinant of that transformation. You can also have a negative determinant, which means the transformation inverts the orientation of space.

Formally, the determinant is a function det from the set of square matrices to the set of real numbers that satisfies 3 important properties (the list was lost from the original; these are the standard defining properties, as in Strang's treatment):

1. det I = 1.
2. Exchanging two rows reverses the sign of the determinant.
3. The determinant is a linear function of each row separately (holding the other rows fixed).

Unfortunately, these calculations can get quite tedious; already for 3x3 matrices, the formula is too long to memorize in practice.

Example: for a 2 x 2 matrix, det [[a, b], [c, d]] = ad - bc.
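Not in the original notes: a quick NumPy check that the determinant is the area-scaling factor (matrix chosen arbitrarily for illustration):

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [1.0, 2.0]])

print(np.linalg.det(A))  # 6.0: every area is scaled by a factor of 6

# Swapping the two rows flips the orientation of space: the sign changes.
print(np.linalg.det(A[::-1]))  # -6.0
```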

Inverting Matrices

The following is true for square matrices.

A square matrix won't have an inverse if you can find a non-zero vector x with Ax = 0 (equivalently, if det A = 0). Why? If A had an inverse, you could multiply both sides of Ax = 0 by A⁻¹, which leaves x = 0. That contradicts x being non-zero, so no such inverse can exist. Therefore, some matrices don't have inverses.

Why Do We Need an Inverse?

Because with matrices we don't divide! Seriously, there is no concept of dividing by a matrix. But we can multiply by an inverse, which achieves the same thing.

Say we want to find the matrix X, and we know matrices A and B:

XA = B

It would be nice to divide both sides by A (to get X = B/A), but remember we can't divide.

But what if we multiply both sides by A⁻¹?

XAA⁻¹ = BA⁻¹

And we know that AA⁻¹ = I, so:

XI = BA⁻¹

We can remove I (for the same reason we can remove "1" from 1x = ab for numbers):

X = BA⁻¹

And we have our answer (assuming we can calculate A⁻¹). Note that we multiplied by A⁻¹ on the right of both sides; order matters in matrix multiplication.
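A minimal NumPy sketch of the X = BA⁻¹ recipe (matrices chosen arbitrarily for illustration):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [8.0, 7.0]])
B = np.array([[1.0, 2.0],
              [3.0, 4.0]])

X = B @ np.linalg.inv(A)  # X = B A^-1, as derived above

# Numerically preferable: solve XA = B without forming the inverse.
# Transposing gives A^T X^T = B^T, which np.linalg.solve handles directly.
X = np.linalg.solve(A.T, B.T).T

print(np.allclose(X @ A, B))  # True
```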

Inverse of bigger matrices: see the Gauss-Jordan method:
https://www.mathsisfun.com/algebra/matrix-inverse-row-operations-gauss-jordan.html
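A sketch of the idea in plain NumPy (the mathsisfun page walks through the same row operations by hand): row-reduce the augmented matrix [A | I] until the left half becomes I; the right half is then A⁻¹.

```python
import numpy as np

def gauss_jordan_inverse(A):
    """Invert A by row-reducing the augmented matrix [A | I] to [I | A^-1]."""
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])  # augmented matrix [A | I]
    for col in range(n):
        # Partial pivoting: swap up the row with the largest entry in this column.
        pivot_row = col + np.argmax(np.abs(M[col:, col]))
        if np.isclose(M[pivot_row, col], 0.0):
            raise ValueError("Matrix is singular (det A = 0), no inverse.")
        M[[col, pivot_row]] = M[[pivot_row, col]]
        M[col] /= M[col, col]          # scale the pivot row so the pivot is 1
        for row in range(n):           # eliminate this column everywhere else
            if row != col:
                M[row] -= M[row, col] * M[col]
    return M[:, n:]                    # right half is now A^-1

A = np.array([[2.0, 1.0],
              [8.0, 7.0]])
print(gauss_jordan_inverse(A))
print(np.linalg.inv(A))  # should match
```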