A hands-on, math-first series of marimo notebooks implementing matrix decomposition functions. The notebooks are hosted on molab and Hugging Face Spaces.

A curated set of marimo notebooks on matrix decomposition, written in Python. Each notebook pairs a mathematical derivation with annotated code and an interactive visualization, all inside a single reactive environment.
The series is a progressive build, starting from orthogonalization fundamentals and working toward full matrix factorizations and applications:
Gram-Schmidt → QR → LU → Householder → SVD → PCA
These factorizations reduce computationally expensive operations (e.g. inversion, least squares, eigensolving) to sequences of simpler, numerically stable factors.
Applications such as noise reduction, signal processing, image compression and more will be covered as the series progresses.
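As one concrete illustration of that trade-off (a sketch of mine, not code from the notebooks): a least-squares problem can be solved through NumPy's QR factorization, replacing the poorly conditioned explicit inverse of `AᵀA` with a triangular solve.

```python
import numpy as np

# Overdetermined system A x ≈ b (more rows than columns)
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 3))
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true

# QR route: factor A = Q R, then solve the triangular system R x = Q.T b.
# This avoids forming and inverting the normal-equations matrix A.T @ A.
Q, R = np.linalg.qr(A)
x = np.linalg.solve(R, Q.T @ b)
```

Because `b` is constructed from `x_true` and `A` has full column rank, `x` recovers `x_true` up to floating-point error.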
| Notebook | Open in molab | Open in HF Spaces |
|---|---|---|
| Gram-Schmidt Orthogonalization | | |
| QR Decomposition | 🔜 | 🔜 |
| Householder Reflection & Bidiagonalization | 🔜 | 🔜 |
Requires Python >= 3.12 and uv.

1. Clone the repository and install dependencies:

   ```shell
   git clone https://github.com/prgyn8/Matrix-Decomposition-Implementations.git
   cd Matrix-Decomposition-Implementations
   uv sync
   ```

2. Run a marimo app (e.g. the Gram-Schmidt process):

   ```shell
   uvx marimo run apps/gs_process.py  # available notebooks are in the apps directory
   ```

3. Optionally, run a notebook in a sandboxed environment:

   ```shell
   # Run the app
   uvx marimo run --sandbox apps/gs_process.py

   # Or open for editing
   uvx marimo edit --sandbox apps/gs_process.py
   ```

## Gram-Schmidt Orthogonalization
### Snippet from the notebook: https://molab.marimo.io/notebooks/nb_TAVLehyiE58b5RDzjxFxSW
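In symbols (my notation, mirroring the column-wise loop in the code), each column $v_k$ has its projections onto the previously computed orthonormal vectors subtracted, and the residual is then normalized:

```math
q_k' = v_k - \sum_{j=1}^{k-1} \left( v_k \cdot q_j \right) q_j,
\qquad
q_k =
\begin{cases}
q_k' \,/\, \lVert q_k' \rVert_2 & \text{if } \lVert q_k' \rVert_2 > 0,\\[2pt]
\mathbf{0} & \text{if } v_k \text{ depends linearly on } q_1,\dots,q_{k-1}.
\end{cases}
```

Note that because the code updates each column in place, the dot products are actually taken against the partially reduced residual rather than the original $v_k$ (the modified Gram-Schmidt variant), which is equivalent in exact arithmetic and better behaved in floating point.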
```python
import numpy as np

def gram_schmidt(X: np.ndarray) -> np.ndarray:
    """Orthonormalize the columns of X: original -> orthogonal -> orthonormal.

    Args:
        X: Array whose columns are the vectors to orthonormalize.

    Returns:
        Matrix Q with the shape of X whose nonzero columns are orthonormal.
        Columns that are linearly dependent on earlier ones come out as zero.
    """
    Q = np.copy(X).astype("float64")
    n_vecs = Q.shape[1]
    # helper to compute the L2-norm
    length = lambda x: np.linalg.norm(x)
    # iterate over each vector (column) of X
    for nth_vec in range(n_vecs):
        # iteratively remove the projection onto each preceding vector
        for k_proj in range(nth_vec):
            # the dot product gives the scalar projection coefficient
            scalar = Q[:, nth_vec] @ Q[:, k_proj]
            projection = scalar * Q[:, k_proj]
            Q[:, nth_vec] -= projection  # remove the k-th projection
        norm = length(Q[:, nth_vec])
        # Handle linearly dependent vectors: they already lie in the span of
        # the preceding vectors, so their residual is (numerically) zero.
        if np.isclose(norm, 0, rtol=1e-15, atol=1e-14):
            Q[:, nth_vec] = 0
        else:
            # scale the orthogonal vector to unit length -> orthonormal
            Q[:, nth_vec] = Q[:, nth_vec] / norm
    return Q
```

```python
# Verification: Q.T @ Q ≈ I
A = np.array([[1, 0, 0], [2, 0, 3], [4, 5, 6]]).T
Q = gram_schmidt(A)
assert np.allclose(Q.T @ Q, np.eye(3))  # ✓
```

💬 Questions on the implementation or numerical stability? Start a thread in Discussions.
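The dependent-column behaviour is also easy to check. This is a small example of my own (with the function condensed inline so the snippet runs standalone): a column that is a linear combination of earlier ones collapses to the zero vector, while the independent columns stay orthonormal.

```python
import numpy as np

def gram_schmidt(X: np.ndarray) -> np.ndarray:
    # condensed version of the notebook function above
    Q = np.copy(X).astype("float64")
    for i in range(Q.shape[1]):
        for j in range(i):
            Q[:, i] -= (Q[:, i] @ Q[:, j]) * Q[:, j]
        norm = np.linalg.norm(Q[:, i])
        Q[:, i] = 0 if np.isclose(norm, 0, atol=1e-14) else Q[:, i] / norm
    return Q

# The third column is the sum of the first two -> linearly dependent
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 0.0]])
Q = gram_schmidt(A)
print(Q[:, 2])  # the dependent column comes out as the zero vector
```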
Contributions are welcome, whether it's a bug report, a new decomposition technique, or a clearer explanation of the math.
- Fork the repository
- Sync dependencies:

  ```shell
  uv sync
  ```

- Create a branch for your changes
- Open a Pull Request; maintainers will review it
For further reading and discussion of the mathematics:
- Wikipedia — Gram-Schmidt Process — foundational definitions and mathematical references
- DataCamp — Orthogonal Matrices — accessible article on orthogonality
- MIT OpenCourseWare — Lecture 17 — in-depth treatment by Prof. Gilbert Strang
- Steve Brunton — original spark for this project; exceptional intuition on engineering applications of linear algebra
- Graphical Linear Algebra — visual treatment of orthogonality and projections