Basic Matrix Operations using loops

Matrix Multiplication & Inversion using Python Lists

This page teaches how to represent matrices as nested Python list objects and how to multiply them (2D, 3D batched, and the general rule). We then cover inversion (2×2 shortcut and Gauss–Jordan) and solve simultaneous equations. Everything is pure lists—no external libraries—so learners see the mechanics clearly.

What you will learn

  • Representing matrices and tensors (2D and 3D) as lists of lists (and stacks of pages).
  • Shape rules: when multiplication is valid and what the result shape will be.
  • Pure Python 2D and batched (3D) multiplication.
  • Inversion: 2×2 formula and general Gauss–Jordan method.
  • Solving A x = b with worked Indian-context examples.
💡

Tip: Think of a 3D “matrix” as a stack of 2D sheets (pages). Pair each page in the first stack with the page in the second stack and multiply page-wise.

Flowchart: Matrix Multiply (2D & Batched)

The node labels below are quoted so that the brackets and parentheses inside them do not break Mermaid's parser.

flowchart TD
  A([Start]) --> B["Read shapes of A and B"]
  B --> C{"2D or 3D?"}
  C -->|2D| D{"cols(A) == rows(B)?"}
  D -->|No| X(["Error: Incompatible"])
  D -->|Yes| E["Initialise C: rows(A) x cols(B)"]
  E --> F["Triple loop: C[i][j] += A[i][k] * B[k][j]"]
  F --> G(["Return C"])

  C -->|"3D (batched)"| H{"pages(A) == pages(B)?"}
  H -->|No| X
  H -->|Yes| I["For each page p: multiply A[p] x B[p]"]
  I --> G

Sequence Diagram: Row × Column

sequenceDiagram
    participant User
    participant Rows as Rows of A
    participant Cols as Columns of B
    participant C as Entry C[i][j]
    User->>Rows: Pick row i
    User->>Cols: Pick column j
    loop k = 0..n-1
        Rows->>C: A[i][k] * B[k][j]
        Cols->>C: Accumulate sum
    end
    C-->>User: Dot-product value

Shape Rules (at a glance)

Compatibility & Result Shapes

A shape         B shape         Multiply?            Result
(m × n)         (n × p)         Yes                  (m × p)
(b, m × n)      (b, n × q)      Yes (page-wise)      (b, m × q)
(m × n)         (p × q)         No (unless n = p)    n/a
(b, m × n)      (q, n × r)      No (unless b = q)    n/a
(n × n) square  (n × n) square  Yes                  (n × n)
💡

Rule of thumb: The inner dimensions must match. The result takes the “outer” dimensions.
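
A tiny sketch of this rule in code; the helper name can_multiply is purely illustrative and is not one of the drop-in helpers in the next section:

def can_multiply(shape_a, shape_b):
    """Return (ok, result_shape) for two 2D shapes given as (rows, cols)."""
    (ra, ca), (rb, cb) = shape_a, shape_b
    if ca != rb:                        # inner dimensions must match
        return (False, None)
    return (True, (ra, cb))             # result takes the outer dimensions

print(can_multiply((2, 3), (3, 2)))     # (True, (2, 2))
print(can_multiply((2, 3), (2, 3)))     # (False, None)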

Python Helpers (drop-in)

How to use: Drop the helpers below into your file and run the code examples as-is.

# --- Shapes & helpers ---
def mat_shape(M):
    """
    Returns (rows, cols) for a 2D list-of-lists matrix M.
    Assumes rectangular rows. For vectors, represent as n×1 or 1×n lists.
    """
    rows = len(M)
    cols = len(M[0]) if rows > 0 and isinstance(M[0], list) else 0
    return (rows, cols)

def zeros(r, c):
    """Create an r×c matrix filled with 0."""
    return [[0 for _ in range(c)] for _ in range(r)]

def eye(n):
    """Create the n×n identity matrix."""
    I = zeros(n, n)
    for i in range(n):
        I[i][i] = 1
    return I

# --- 2D matrix multiply (lists only) ---
def mat_mul(A, B):
    """
    Multiply A (ra×ca) by B (rb×cb) where ca == rb.
    Caches A[i][k] in the inner loop and skips zero entries for a small speed-up.
    """
    ra, ca = mat_shape(A)
    rb, cb = mat_shape(B)
    if ca != rb:
        raise ValueError(f"Incompatible shapes: {ra}x{ca} cannot multiply {rb}x{cb}")
    C = zeros(ra, cb)
    for i in range(ra):
        for k in range(ca):
            aik = A[i][k]
            if aik == 0:
                continue
            for j in range(cb):
                C[i][j] += aik * B[k][j]
    return C

# --- 3D batched multiply: A[b][m][n] × B[b][n][q] -> C[b][m][q] ---
def batch_mat_mul(A_stack, B_stack):
    """
    Page-wise multiply each A_stack[p] × B_stack[p].
    len(A_stack) must equal len(B_stack).
    """
    if len(A_stack) != len(B_stack):
        raise ValueError("Different number of pages in A and B")
    pages = len(A_stack)
    result = []
    for p in range(pages):
        result.append(mat_mul(A_stack[p], B_stack[p]))
    return result

# --- Gauss–Jordan inverse (n×n) ---
def inverse(A):
    """
    Compute inverse of a square matrix A via Gauss–Jordan elimination.
    Raises ValueError if A is singular or not square.
    """
    n, m = mat_shape(A)
    if n != m:
        raise ValueError("Inverse defined only for square matrices")
    # Build augmented [A | I]
    I = eye(n)
    aug = [A[i][:] + I[i][:] for i in range(n)]

    # Forward elimination & normalisation
    for col in range(n):
        # Find pivot row at or below 'col' with non-zero pivot
        pivot = None
        for r in range(col, n):
            if aug[r][col] != 0:
                pivot = r
                break
        if pivot is None:
            raise ValueError("Matrix is singular; no inverse")
        # Row swap if needed
        if pivot != col:
            aug[col], aug[pivot] = aug[pivot], aug[col]
        # Scale pivot row to make pivot 1
        factor = aug[col][col]
        for j in range(2*n):
            aug[col][j] = aug[col][j] / factor
        # Eliminate other rows in current column
        for r in range(n):
            if r == col:
                continue
            factor = aug[r][col]
            if factor != 0:
                for j in range(2*n):
                    aug[r][j] = aug[r][j] - factor * aug[col][j]
    # Extract right half => A^{-1}
    return [row[n:] for row in aug]

# --- 2×2 inverse (shortcut for teaching/demo) ---
def inverse_2x2(A):
    """
    Inverse of [[a, b], [c, d]] if det != 0.
    """
    (a, b), (c, d) = A
    det = a*d - b*c
    if det == 0:
        raise ValueError("Singular 2×2 matrix")
    return [[ d/det, -b/det],
            [-c/det,  a/det]]

# --- Solve A x = b with inverse (b as a column vector, n×1) ---
def solve_via_inverse(A, b):
    invA = inverse(A)
    return mat_mul(invA, b)

Helpers explained (clear & concise)

  • mat_shape(M): Returns (rows, cols). Assumes each row has the same length. For a column vector like [[5],[7]], result is (2,1).
  • zeros(r, c): Builds an r×c matrix of zeros. Handy for initialising results.
  • eye(n): Builds the identity matrix I so that AI = IA = A for square A.
  • mat_mul(A, B): Standard triple-loop multiplication. Optimisation: caches A[i][k] and skips if it’s zero.
  • batch_mat_mul(A_stack, B_stack): Treats each element as a “page” and calls mat_mul on corresponding pages. Requires same number of pages.
  • inverse(A): Gauss–Jordan elimination on the augmented matrix [A | I]. Makes a pivot 1, zeroes other entries in the pivot’s column, repeats across columns, and returns the right half.
  • inverse_2x2(A): Quick formula for 2×2 matrices; good for demonstrations.
  • solve_via_inverse(A, b): Computes x = A^{-1}b using the inverse. Use b as an n×1 column vector.
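
A quick sanity check of these helpers, assuming they have been pasted into your file as described above:

# Quick sanity check of the helpers
A = [[1, 2], [3, 4]]
print(mat_shape(A))            # (2, 2)
print(zeros(2, 3))             # [[0, 0, 0], [0, 0, 0]]
print(eye(2))                  # [[1, 0], [0, 1]]
print(mat_mul(A, eye(2)))      # [[1, 2], [3, 4]]  (A·I = A)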

Examples: 2D Matrix Multiplication

Example 1: 2×3 × 3×2

Data

# Drop the helpers above into your file, then run this:
A = [
  [1, 2, 3],
  [4, 5, 6]
]  # shape 2×3

B = [
  [7,  8],
  [9, 10],
  [11,12]
]  # shape 3×2

C = mat_mul(A, B)
print(C)  # [[58, 64], [139, 154]]
💡

Check: C[0][0] = 1×7 + 2×9 + 3×11 = 58. Inner dimensions 3 and 3 match; result is 2×2.

Examples: 3D (Batched) Multiplication

Example 2: Page-wise (2 pages)

Expanded explanation: Each page multiplies its own A[p] × B[p]. Imagine two stacks of paper. Page p in the first stack is a 2D matrix A[p]; page p in the second stack is a 2D matrix B[p]. We multiply these two pages using ordinary 2D rules to get C[p]. We do this for every page index p. The page count (batch size) must match, and for each page the inner dimensions must match. If A is shaped (pages, m×n) and B is (pages, n×q), then C will be (pages, m×q). This is the same idea deep-learning libraries use for batched matrix multiply: operate page-wise (pair-wise) without cross-talk between pages.

# Drop the helpers above into your file, then run this:
A_stack = [
  [[1, 0], [0, 1]],     # page 0 (the 2×2 identity, I₂)
  [[2, 1], [0, 3]]      # page 1
]  # shape (2 pages, 2×2)

B_stack = [
  [[5, 6], [7, 8]],     # page 0
  [[1, 2], [3, 4]]      # page 1
]  # shape (2 pages, 2×2)

C_stack = batch_mat_mul(A_stack, B_stack)
for p, C in enumerate(C_stack):
    print("Page", p, "->", C)
# Expected:
# Page 0 -> [[5, 6], [7, 8]]
# Page 1 -> [[5, 8], [9, 12]]

Matrix Inversion

2×2 Quick Formula

# Drop the helpers above into your file, then run this:
A2 = [[4, 7],
      [2, 6]]
print(inverse_2x2(A2))
# [[0.6, -0.7], [-0.2, 0.4]]
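
# Verify: multiplying back should give (approximately) the 2×2 identity
print(mat_mul(A2, inverse_2x2(A2)))
# ≈ [[1.0, 0.0], [0.0, 1.0]]  (tiny floating-point rounding error is possible)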

General n×n via Gauss–Jordan

# Drop the helpers above into your file, then run this:
A3 = [
  [1, 1, 1],
  [2, 3, 7],
  [3, 4, 1]
]
invA3 = inverse(A3)
print(invA3)
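# ≈ [[ 3.5714, -0.4286, -0.5714],
#    [-2.7143,  0.2857,  0.7143],
#    [ 0.1429,  0.1429, -0.1429]]   (each entry is a multiple of 1/7, since det(A3) = -7)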

# Verify A·A^{-1} ≈ I:
I3 = mat_mul(A3, invA3)
print("A·A^{-1} =", I3)

Solving Simultaneous Equations (A x = b)

Example 3: 2 Equations, 2 Unknowns (Applied)

Canteen pricing: Two idli and one dosa cost ₹130; one idli and two dosas cost ₹110. Find the prices.

# Drop the helpers above into your file, then run this:
# 2x + 1y = 130
# 1x + 2y = 110
A = [[2, 1],
     [1, 2]]
b = [[130],
     [110]]
x = solve_via_inverse(A, b)   # [[price_idli], [price_dosa]]
print("Prices (₹):", x)
# ≈ [[50.0], [30.0]]  (idli ₹50, dosa ₹30)
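# Check: 2*50 + 30 = 130 and 50 + 2*30 = 110, so both equations are satisfied.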

Example 4: 3 Equations, 3 Unknowns

# Drop the helpers above into your file, then run this:
#  x +  y +  z = 6
# 2x + 3y + 7z = 0
# 3x + 4y +  z = 4
A = [
  [1, 1, 1],
  [2, 3, 7],
  [3, 4, 1]
]
b = [
  [6],
  [0],
  [4]
]
print(solve_via_inverse(A, b))
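# Expected (up to floating-point rounding):
# ≈ [[19.1429], [-13.4286], [0.2857]]   i.e. x = 134/7, y = -94/7, z = 2/7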

Worked Walkthrough: From Lists to Answers

  1. Represent each matrix as lists-of-lists; vectors as n×1 lists.
  2. Confirm shapes using mat_shape and apply the inner-dimension rule.
  3. Multiply using mat_mul, or batch_mat_mul for stacks.
  4. Invert with inverse (or inverse_2x2 for 2×2).
  5. Solve A x = b using solve_via_inverse.
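
Here is one possible end-to-end run of those five steps, using the helpers above on a small made-up 2×2 system (the numbers are illustrative only):

# Step 1: represent the data as lists-of-lists
A = [[3, 1],
     [1, 2]]             # coefficient matrix (2×2)
b = [[9],
     [8]]                # right-hand side as a 2×1 column vector

# Step 2: confirm the shapes before multiplying or inverting
print(mat_shape(A), mat_shape(b))    # (2, 2) (2, 1) -> inner dimensions match

# Steps 3–5: invert, solve, then multiply back as a check
x = solve_via_inverse(A, b)
print("x =", x)                      # ≈ [[2.0], [3.0]]
print("A·x =", mat_mul(A, x))        # ≈ [[9.0], [8.0]], which matches b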

Flowchart: Solving A x = b

flowchart TD
  S([Start]) --> A1["Input A (n×n), b (n×1)"]
  A1 --> C1{"det(A) ≠ 0?"}
  C1 -->|No| E1(["No unique solution"])
  C1 -->|Yes| B1["Compute A^{-1} via Gauss–Jordan"]
  B1 --> B2["Compute x = A^{-1} · b"]
  B2 --> F1(["Return x"])

Practice Tasks

  • Multiply a 3×4 by a 4×3 and verify the result shape is 3×3.
  • Batched: with 3 pages, multiply each 2×3 page by a 3×1 page (a column vector) and print every page's result.
  • Invert a non-singular 3×3 and verify A · A^{-1} = I.
  • Model a simple supply-mix problem as A x = b and solve for x.
  • Refactor mat_mul to reuse a pre-allocated result matrix for speed.

Key Takeaways

  • Lists of lists make the mechanics transparent for learning.
  • Inner-dimension match decides if multiplication is valid.
  • Batched multiplication is just page-wise 2D multiplication.
  • Invert only square, non-singular matrices; handle errors clearly.
  • Use the inverse to solve small systems; prefer elimination for large systems in production (see the sketch below).
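
Following up on the last point, here is a minimal sketch of solving A x = b by Gaussian elimination with partial pivoting, i.e. without ever forming A^{-1}. The function name solve_gauss and its details are illustrative, not part of the drop-in helpers above; it reuses only mat_shape.

def solve_gauss(A, b):
    """
    Solve A x = b by Gaussian elimination with partial pivoting,
    without computing A^{-1}. A is n×n, b is n×1; returns x as n×1.
    """
    n, m = mat_shape(A)
    if n != m or mat_shape(b) != (n, 1):
        raise ValueError("Need square A (n×n) and column b (n×1)")
    # Work on an augmented copy [A | b] so the inputs are untouched
    aug = [A[i][:] + [b[i][0]] for i in range(n)]
    for col in range(n):
        # Partial pivoting: pick the largest |entry| at or below the diagonal
        pivot = max(range(col, n), key=lambda r: abs(aug[r][col]))
        if aug[pivot][col] == 0:
            raise ValueError("Matrix is singular; no unique solution")
        aug[col], aug[pivot] = aug[pivot], aug[col]
        # Eliminate entries below the pivot
        for r in range(col + 1, n):
            factor = aug[r][col] / aug[col][col]
            for j in range(col, n + 1):
                aug[r][j] -= factor * aug[col][j]
    # Back substitution on the upper-triangular system
    x = [[0.0] for _ in range(n)]
    for i in range(n - 1, -1, -1):
        s = sum(aug[i][j] * x[j][0] for j in range(i + 1, n))
        x[i][0] = (aug[i][n] - s) / aug[i][i]
    return x

# Example (same canteen system as Example 3):
# print(solve_gauss([[2, 1], [1, 2]], [[130], [110]]))   # ≈ [[50.0], [30.0]]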