Linear Algebra - Matrix Operations with NumPy
The Kitchen Table Adventure 🍽️
Imagine you have a big kitchen table. On it, you place different foods in rows and columns. That’s what a matrix is - numbers arranged like food on a table!
Today, we’ll learn how NumPy helps us cook up amazing math recipes with these tables of numbers.
1. Dot Product - The Handshake Score
What is it?
Think of two kids meeting. Each kid has toys (numbers). They shake hands by:
- Matching their toys one-by-one
- Multiplying matched pairs
- Adding everything up
That final sum? That’s the dot product!
Simple Example
```python
import numpy as np

# Kid A has: 1 apple, 2 oranges, 3 bananas
kid_a = np.array([1, 2, 3])

# Kid B has: 4 apples, 5 oranges, 6 bananas
kid_b = np.array([4, 5, 6])

# Handshake score:
# (1×4) + (2×5) + (3×6) = 4 + 10 + 18 = 32
score = np.dot(kid_a, kid_b)
print(score)  # Output: 32
```
Real Life Use
- Shopping: Price × Quantity for each item, then add up = Total bill!
- Games: How similar are two players’ scores?
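Here's what that shopping-bill idea looks like in code. It's just a sketch with made-up prices and quantities:

```python
import numpy as np

# Hypothetical prices (in cents) and quantities for three items (illustrative values)
prices = np.array([250, 120, 80])   # price per apple, orange, banana in cents
quantities = np.array([4, 6, 10])   # how many of each we buy

# Total bill = price × quantity for each item, all added up
total = np.dot(prices, quantities)
print(total)  # 250*4 + 120*6 + 80*10 = 1000 + 720 + 800 = 2520 cents
```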
2. Matrix Multiplication - The Recipe Machine
What is it?
Imagine a recipe machine:
- Put ingredients in (first matrix)
- Machine has instructions (second matrix)
- Out comes the cooked dish (result matrix)
The rows of the first matrix “talk to” the columns of the second.
The Rule
For matrix multiplication to work:
The number of columns in the first matrix must equal the number of rows in the second matrix.
Like puzzle pieces fitting together!
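Here's a tiny sketch of that rule in action (the shapes are just examples): matching shapes fit together, and mismatched shapes make NumPy complain.

```python
import numpy as np

a = np.ones((2, 3))   # 2 rows, 3 columns
b = np.ones((3, 2))   # 3 rows, 2 columns

print((a @ b).shape)  # (2, 2) -- 3 columns match 3 rows, so it works

c = np.ones((2, 3))
try:
    a @ c             # 3 columns vs 2 rows: the puzzle pieces don't fit
except ValueError as e:
    print("Shapes don't match:", e)
```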
Simple Example
```python
import numpy as np

# 2 recipes, each needs 3 ingredients
recipes = np.array([
    [1, 2, 3],   # Recipe 1
    [4, 5, 6]    # Recipe 2
])

# 3 ingredients, 2 price options
prices = np.array([
    [7, 8],      # Ingredient 1 prices
    [9, 10],     # Ingredient 2 prices
    [11, 12]     # Ingredient 3 prices
])

# Cost of each recipe at each store
result = np.matmul(recipes, prices)
# Or: result = recipes @ prices
print(result)
# [[58, 64],
#  [139, 154]]
```
graph TD A["Matrix A<br/>2×3"] --> C["Result<br/>2×2"] B["Matrix B<br/>3×2"] --> C style C fill:#90EE90
3. Elementwise vs Matrix Multiply
The Big Difference
- Elementwise (`*`) = Each box talks to its twin
- Matrix Multiply (`@`) = Rows dance with columns
Think of it like…
| Operation | Analogy |
|---|---|
| `*` (Elementwise) | Two friends wear matching outfits |
| `@` (Matrix Multiply) | A chef combining ingredients |
Simple Example
```python
import numpy as np

a = np.array([[1, 2],
              [3, 4]])
b = np.array([[5, 6],
              [7, 8]])

# Elementwise: 1×5, 2×6, 3×7, 4×8
elementwise = a * b
print(elementwise)
# [[ 5, 12],
#  [21, 32]]

# Matrix multiply: rows × columns
matrix_mult = a @ b
print(matrix_mult)
# [[19, 22],
#  [43, 50]]
```
Quick Rule
| You want… | Use this |
|---|---|
| Scale each number separately | * |
| Transform/rotate data | @ |
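Here's a small sketch of that rule (the scale factor and the 90-degree rotation are just example choices):

```python
import numpy as np

points = np.array([[1.0, 0.0],
                   [0.0, 1.0]])    # two 2-D points, one per row

# Scale each number separately: elementwise multiply
doubled = points * 2
print(doubled)
# [[2. 0.]
#  [0. 2.]]

# Transform/rotate data: matrix multiply with a 90-degree rotation matrix
rotation = np.array([[0.0, -1.0],
                     [1.0,  0.0]])
rotated = points @ rotation.T      # rotate each row vector by 90 degrees
print(rotated)
# [[ 0.  1.]
#  [-1.  0.]]
```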
4. Inner and Outer Products
Inner Product - The Squeeze
Two vectors get squeezed into ONE number.
Same as dot product for regular vectors!
```python
import numpy as np

a = np.array([1, 2, 3])
b = np.array([4, 5, 6])

inner = np.inner(a, b)
print(inner)  # 32
# Same as: 1×4 + 2×5 + 3×6
```
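One small heads-up, worth a quick sketch: for 2-D arrays, np.inner is not the same as matrix multiplication; it sums over the last axis of both inputs, which works out to `a @ b.T`.

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])

print(np.inner(A, B))   # sums over the LAST axis of each input
# [[17 23]
#  [39 53]]
print(A @ B.T)          # same result
```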
Outer Product - The Explosion
Two vectors explode into a full matrix!
Every element of first meets every element of second.
```python
import numpy as np

a = np.array([1, 2, 3])
b = np.array([4, 5])

outer = np.outer(a, b)
print(outer)
# [[ 4,  5],   # 1×4, 1×5
#  [ 8, 10],   # 2×4, 2×5
#  [12, 15]]   # 3×4, 3×5
```
graph TD A["Inner Product"] --> B["1 Number"] C["Outer Product"] --> D["Full Matrix"] style B fill:#FFB6C1 style D fill:#87CEEB
Memory Trick
- Inner = Goes inside (shrinks to 1 number)
- Outer = Goes outside (grows to matrix)
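A fun way to see the "explosion": the outer product of two counting vectors is just a times table. A quick sketch (numbers chosen for illustration):

```python
import numpy as np

rows = np.arange(1, 4)   # [1 2 3]
cols = np.arange(1, 5)   # [1 2 3 4]

# Every row number meets every column number
times_table = np.outer(rows, cols)
print(times_table)
# [[ 1  2  3  4]
#  [ 2  4  6  8]
#  [ 3  6  9 12]]
```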
5. Covariance and Correlation
Covariance - Do They Move Together?
Imagine two friends on swings:
- Both go UP together = Positive covariance
- One UP, one DOWN = Negative covariance
- Random movement = Zero covariance
```python
import numpy as np

# Height and weight of 5 people
height = np.array([150, 160, 170, 180, 190])
weight = np.array([50, 60, 65, 75, 85])

# Covariance matrix
cov_matrix = np.cov(height, weight)
print(cov_matrix)
# [[250.  212.5]   # height-height, height-weight
#  [212.5 182.5]]  # weight-height, weight-weight
```
The 212.5 tells us: taller people tend to weigh more!
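If you want to see where that 212.5 comes from, here's the same number computed by hand (np.cov divides by n − 1 by default):

```python
import numpy as np

height = np.array([150, 160, 170, 180, 190])
weight = np.array([50, 60, 65, 75, 85])

# Covariance by hand: average product of deviations, dividing by n - 1
dev_h = height - height.mean()   # deviations from mean height (170)
dev_w = weight - weight.mean()   # deviations from mean weight (67)
cov_hw = np.sum(dev_h * dev_w) / (len(height) - 1)
print(cov_hw)  # 212.5 -- matches np.cov(height, weight)[0, 1]
```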
Correlation - How Strong is the Dance?
Correlation is covariance scaled to -1 to +1.
| Value | Meaning |
|---|---|
| +1 | Perfect match (both rise together) |
| 0 | No relationship |
| -1 | Perfect opposite (one rises, other falls) |
```python
import numpy as np

height = np.array([150, 160, 170, 180, 190])
weight = np.array([50, 60, 65, 75, 85])

corr_matrix = np.corrcoef(height, weight)
print(corr_matrix)
# (rounded)
# [[1.    0.99]   # height-height, height-weight
#  [0.99  1.  ]]  # weight-height, weight-weight
```
0.99 = Very strong positive relationship!
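That 0.99 is just the covariance divided by each variable's spread. A quick sketch to check it, using the same numbers:

```python
import numpy as np

height = np.array([150, 160, 170, 180, 190])
weight = np.array([50, 60, 65, 75, 85])

cov_hw = np.cov(height, weight)[0, 1]                          # 212.5
corr = cov_hw / (height.std(ddof=1) * weight.std(ddof=1))     # scale by the spreads
print(round(corr, 2))  # 0.99 -- same as np.corrcoef(height, weight)[0, 1]
```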
6. Einstein Summation - The Magic Spell
What is it?
Einstein summation (einsum) is like a magic spell that lets you describe ANY array operation in one line!
You write a pattern and NumPy figures out what to do.
The Secret Code
```python
np.einsum('pattern', arrays)
```
The letters tell NumPy which dimensions to match.
Basic Examples
Sum all elements:
```python
import numpy as np

a = np.array([[1, 2], [3, 4]])

# 'ij->' means: no indices on the right, so sum everything into one number
total = np.einsum('ij->', a)
print(total)  # 10
```
Matrix transpose:
```python
# 'ij->ji' means: swap rows and columns
transposed = np.einsum('ij->ji', a)
print(transposed)
# [[1, 3],
#  [2, 4]]
```
Matrix multiplication:
```python
a = np.array([[1, 2], [3, 4]])
b = np.array([[5, 6], [7, 8]])

# 'ik,kj->ij' means: match k, keep i and j
result = np.einsum('ik,kj->ij', a, b)
print(result)
# [[19, 22],
#  [43, 50]]
```
Dot product:
```python
x = np.array([1, 2, 3])
y = np.array([4, 5, 6])

# 'i,i->' means: match i, sum to a scalar
dot = np.einsum('i,i->', x, y)
print(dot)  # 32
```
Why Use It?
| Situation | Einstein is great because… |
|---|---|
| Complex operations | One line instead of many |
| Speed | Often faster than combining multiple functions |
| Clarity | Pattern shows exactly what happens |
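A few more patterns, just to show the range (a small sketch, not a complete list):

```python
import numpy as np

a = np.array([[1, 2], [3, 4]])
x = np.array([1, 2, 3])
y = np.array([4, 5])

print(np.einsum('ij->i', a))      # row sums: [3 7]
print(np.einsum('ii->', a))       # trace (sum of the diagonal): 5
print(np.einsum('i,j->ij', x, y))
# outer product:
# [[ 4  5]
#  [ 8 10]
#  [12 15]]
```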
graph TD A["Write Pattern"] --> B["NumPy Understands"] B --> C["Fast Result!"] style C fill:#98FB98
Quick Reference Table
| Operation | NumPy Function | Result Shape |
|---|---|---|
| Dot Product | `np.dot(a, b)` | Scalar |
| Matrix Multiply | `a @ b` or `np.matmul(a, b)` | Matrix |
| Elementwise | `a * b` | Same as inputs |
| Inner Product | `np.inner(a, b)` | Scalar (for 1D) |
| Outer Product | `np.outer(a, b)` | m × n matrix |
| Covariance | `np.cov(a, b)` | 2 × 2 matrix |
| Correlation | `np.corrcoef(a, b)` | 2 × 2 matrix |
| Einstein Sum | `np.einsum('...', a, b)` | Depends on pattern |
You Did It! 🎉
You’ve learned how NumPy handles matrix math:
- Dot Product - Handshake score between vectors
- Matrix Multiply - Recipe machine combining tables
- Elementwise vs Matrix - Twins vs Chef
- Inner/Outer Products - Squeeze vs Explosion
- Covariance/Correlation - Do they swing together?
- Einstein Summation - One magic spell for everything
Now go play with these tools! The more you practice, the more natural they become.
“Linear algebra is just organizing numbers and making them dance together!” 💃
