Matrix Multiplication

The product of two matrices is defined only when the number of columns of the first matrix equals the number of rows of the second; in other words, you can only multiply an m × n matrix by an n × p matrix. The reason for this becomes clear upon defining the product:
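The original equation image is missing; the standard entry-wise definition, which the text above describes, is:

$$(AB)_{ij} = \sum_{k=1}^{n} a_{ik}\,b_{kj}, \qquad A \in \mathbb{R}^{m \times n},\; B \in \mathbb{R}^{n \times p},\; AB \in \mathbb{R}^{m \times p}.$$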

There are four equivalent ways to see the product:

• Entry by entry: entry (i, j) of AB is the dot product of row i of A with column j of B.

• By columns: each column of AB is A times the corresponding column of B, so every column of AB is a combination of the columns of A.

• By rows: each row of AB is the corresponding row of A times B, so every row of AB is a combination of the rows of B.

• As a sum: AB is the sum over k of (column k of A) times (row k of B), a sum of rank-one pieces.
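As a sketch of the column and outer-product views in symbols (standard identities; the original figures are missing):

$$AB = A\,[\,b_1 \;\cdots\; b_p\,] = [\,Ab_1 \;\cdots\; Ab_p\,], \qquad AB = \sum_{k=1}^{n} (\mathrm{col}_k A)(\mathrm{row}_k B).$$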

Also, matrix multiplication is not commutative: in general AB ≠ BA.

When AB = BA, A and B are said to commute. The identity matrix commutes with every matrix: AI = IA = A.
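A minimal 2 × 2 counterexample (my own, not from the original notes):

$$\begin{pmatrix}1&1\\0&1\end{pmatrix}\begin{pmatrix}1&0\\1&1\end{pmatrix}=\begin{pmatrix}2&1\\1&1\end{pmatrix}, \qquad \begin{pmatrix}1&0\\1&1\end{pmatrix}\begin{pmatrix}1&1\\0&1\end{pmatrix}=\begin{pmatrix}1&1\\1&2\end{pmatrix}.$$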

Elimination

Say you have a system Ax = b that you want to solve; a worked sketch follows the steps below.

Find the matrix you need to multiply it by to solve it, i.e. to get U, the upper triangular form that lets you solve the system easily by back substitution.

Step 1: You want a 0 in positions (2,1) and (3,1), but in this example there is already a 0 in (3,1).

Step 2: You want a 0 in position (3,2). So in this example you subtract 2 × row 2 from row 3 to get the 0.

So now z = 5, and you can easily back-substitute to solve for the rest.
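The original augmented matrices are missing; here is a hypothetical system consistent with the steps above (a 0 already in position (3,1), then row 3 minus 2 × row 2, ending with z = 5):

$$\left[\begin{array}{ccc|c}1&2&1&1\\2&6&1&3\\0&4&1&17\end{array}\right]\;\xrightarrow{R_2-2R_1}\;\left[\begin{array}{ccc|c}1&2&1&1\\0&2&-1&1\\0&4&1&17\end{array}\right]\;\xrightarrow{R_3-2R_2}\;\left[\begin{array}{ccc|c}1&2&1&1\\0&2&-1&1\\0&0&3&15\end{array}\right]$$

The last row reads 3z = 15, so z = 5; back substitution then gives 2y - z = 1, so y = 3, and x + 2y + z = 1, so x = -10.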

With matrix multiplication you can change the grouping of the products (the associative law, A(BC) = (AB)C), though not the order of the factors.

Let's finish the topic of elimination.

Ex: 2 × 2 elimination. Let A be a matrix as below on which I can do elimination with no row exchanges (no zero pivots). I want to get from A to U in order to solve Ax = b. But then I want to know how A is related to U: there is a matrix L such that A = LU. How do you get there?

First, to get from A to U (which is then easy to solve), I multiply by my elementary matrix E(2,1), chosen to put a 0 in position (2,1). Therefore we get:
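The original figure is missing; a standard 2 × 2 example consistent with the "-4 becomes 4" note below (writing E(2,1) as $E_{21}$) is:

$$E_{21}A=\begin{pmatrix}1&0\\-4&1\end{pmatrix}\begin{pmatrix}2&1\\8&7\end{pmatrix}=\begin{pmatrix}2&1\\0&3\end{pmatrix}=U.$$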

Then, to get A = LU, multiply both sides by the inverse of E(2,1); that inverse is L (the -4 becomes 4).
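Continuing the same reconstructed example:

$$A=E_{21}^{-1}U=LU:\qquad \begin{pmatrix}2&1\\8&7\end{pmatrix}=\begin{pmatrix}1&0\\4&1\end{pmatrix}\begin{pmatrix}2&1\\0&3\end{pmatrix}.$$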

Where L is the lower triangular factor, and U is the upper triangular factor.

Now let's try to do this with a 3 × 3 matrix. What are the steps of elimination?
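In symbols, assuming no row exchanges (and with $E_{31} = I$ here, since position (3,1) already holds a 0, as in the example above), the steps compose as:

$$E_{32}E_{21}A=U.$$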

Now, suppose we want all the E's on the right-hand side of the equation, as inverses. We get:
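A standard form of the missing equation, multiplying both sides by the inverses in reverse order:

$$A=E_{21}^{-1}E_{32}^{-1}\,U=LU.$$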

So L is the product of the inverses. Why work with the inverses? Because when you multiply the E's themselves, the multipliers interact: an unwanted 10 shows up in position (3,1) of the product below, and it is not one of the multipliers. When you multiply the inverses in reverse order, nothing interacts: the multipliers drop directly into L, and position (3,1) keeps a clean 0. See:
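The original matrices are missing; a standard example with multipliers 2 and 5 (matching the 10 mentioned above) is:

$$E_{32}E_{21}=\begin{pmatrix}1&0&0\\0&1&0\\0&-5&1\end{pmatrix}\begin{pmatrix}1&0&0\\-2&1&0\\0&0&1\end{pmatrix}=\begin{pmatrix}1&0&0\\-2&1&0\\10&-5&1\end{pmatrix}$$

whereas

$$L=E_{21}^{-1}E_{32}^{-1}=\begin{pmatrix}1&0&0\\2&1&0\\0&0&1\end{pmatrix}\begin{pmatrix}1&0&0\\0&1&0\\0&5&1\end{pmatrix}=\begin{pmatrix}1&0&0\\2&1&0\\0&5&1\end{pmatrix},$$

with the multipliers 2 and 5 sitting directly in L and a 0 in position (3,1).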

Cost of the operations

Say you have an n × n matrix with n = 100. How many operations does elimination take? It turns out that for A and for b (in Ax = b) the counts are:
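The original figure is missing; the standard count (roughly one multiplication and one subtraction per affected entry, summed over the elimination steps) is:

$$\text{ops on }A\approx\sum_{k=1}^{n}k^{2}\approx\frac{n^{3}}{3},\qquad\text{ops on }b\approx n^{2}.$$

For n = 100 that is about 100³/3 ≈ 333,000 operations on A and about 100² = 10,000 on b.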