Just some experiments with linear algebra.
So far:
- QR decomposition
- Projection via solving the normal equations
- Principal component analysis with SVD
- Moore-Penrose inverse calculation with SVD
- Multiple linear regression solving with MP inverse
- Random projection and Johnson-Lindenstrauss Lemma
- Linear regression for polynomials
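As a minimal sketch of one of the items above (not the repo's actual code), the Moore-Penrose pseudoinverse can be computed from the SVD by inverting only the nonzero singular values:

```python
import numpy as np

def pinv_svd(A, tol=1e-12):
    # Moore-Penrose pseudoinverse via SVD: A+ = V S+ U^T,
    # where S+ inverts only singular values above the tolerance.
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    s_inv = np.where(s > tol, 1.0 / s, 0.0)
    return Vt.T @ (s_inv[:, None] * U.T)

A = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
A_pinv = pinv_svd(A)
# Should agree with NumPy's built-in pseudoinverse
assert np.allclose(A_pinv, np.linalg.pinv(A))
```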
Multivariate Linear Regression
Plotting the estimated covariates against the samples.
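A minimal sketch of the idea (synthetic data, not the repo's code): the regression coefficients are recovered with the Moore-Penrose pseudoinverse, as in the list above.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 3
X = rng.normal(size=(n, d))
true_beta = np.array([2.0, -1.0, 0.5])
# Observations with a small amount of Gaussian noise
y = X @ true_beta + 0.01 * rng.normal(size=n)

# Least-squares solution via the pseudoinverse: beta_hat = X+ y
beta_hat = np.linalg.pinv(X) @ y
```

With low noise, `beta_hat` lands very close to `true_beta`.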
Polynomial Linear Regression
Plotting the estimated polynomial against the ground truth and the sample points.
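The trick, sketched below with assumed synthetic data, is that a polynomial fit is linear in its coefficients, so an ordinary least-squares solve on a Vandermonde design matrix suffices:

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(-1.0, 1.0, 50)
# Ground-truth polynomial 1 - 2x + 3x^2 plus small noise
y = 1.0 - 2.0 * x + 3.0 * x**2 + 0.01 * rng.normal(size=x.size)

# Design matrix with columns [1, x, x^2]; the fit is linear in the coefficients
V = np.vander(x, 3, increasing=True)
coeffs, *_ = np.linalg.lstsq(V, y, rcond=None)
```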
JL-Lemma and Random Projections
Projected 3000 data points from 10000 dimensions down to between 100 and 3000 dimensions with random projection matrices.
The y-axis is the average distortion factor; the dots are coloured green if no pairwise distortion exceeded 0.1.
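A smaller-scale sketch of the same experiment (assumed sizes, not the ones in the plot): project Gaussian data with a scaled random matrix and measure the pairwise distance distortion that the Johnson-Lindenstrauss lemma bounds.

```python
import numpy as np

rng = np.random.default_rng(2)
n, d, k = 50, 1000, 300
X = rng.normal(size=(n, d))

# Gaussian random projection, scaled by 1/sqrt(k) to preserve norms in expectation
R = rng.normal(size=(d, k)) / np.sqrt(k)
Y = X @ R

def pdists(Z):
    # All pairwise Euclidean distances (upper triangle only)
    diff = Z[:, None, :] - Z[None, :, :]
    D = np.sqrt((diff ** 2).sum(-1))
    iu = np.triu_indices(Z.shape[0], 1)
    return D[iu]

# Distortion factor per pair: how far the distance ratio deviates from 1
distortion = np.abs(pdists(Y) / pdists(X) - 1.0)
```

The average of `distortion` is what the y-axis of the plot tracks; it shrinks as the target dimension k grows.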

Principal Component Analysis
Plots the rank-k reconstruction error (Frobenius norm) of a randomly generated matrix using PCA.
We can infer that in this case, the randomly generated matrix must have had a rank of 6.
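A minimal sketch of how that inference works (with an assumed rank-6 matrix, not the repo's data): truncate the SVD at each k and watch the Frobenius reconstruction error drop to numerical zero once k reaches the true rank.

```python
import numpy as np

rng = np.random.default_rng(3)
# Product of thin factors gives a matrix of rank (at most) 6
A = rng.normal(size=(100, 6)) @ rng.normal(size=(6, 80))

U, s, Vt = np.linalg.svd(A, full_matrices=False)
errors = []
for k in range(1, 10):
    # Best rank-k approximation (Eckart-Young): keep the top k singular triples
    Ak = (U[:, :k] * s[:k]) @ Vt[:k, :]
    errors.append(np.linalg.norm(A - Ak))  # Frobenius norm by default
```

The error curve is large for k < 6 and essentially zero from k = 6 onward, which is exactly how the rank is read off the plot.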