Books
- Mathematics for Machine Learning (.pdf)
- Deep Learning, Part I: Applied Math and Machine Learning Basics
Informal articles
- Mathematics of Machine Learning blog
Linear algebra
- Computational linear algebra (fast.ai)
- Introduction to linear algebra (Strang, MIT)
- Linear algebra review and reference (Kolter/Do, Stanford)
Matrix manipulation
Calculus
Statistics
- Review of probability theory (Maleki/Do, Stanford)
- Probability course
- Statistics 110: probability
- Statistical Rethinking - Richard McElreath
Bayes
Probability
- Why probability probably doesn’t exist (but it is useful to act like it does)
Complex numbers
Formal proofs
Software tools
- pyro-ppl/numpyro: Probabilistic programming with NumPy powered by JAX for autograd and JIT compilation to GPU/TPU/CPU + NumPyro documentation
Syntax ref: NumPy einsum
$a_m = m^{\text{th}} \text{ element of } \mathbf{a}$
$\sum_m a_m = \text{sum over all of } \mathbf{a}\text{'s elements (index dropped from the output)}$
$a_{(m)} = \text{all of } \mathbf{a}\text{'s } m \text{ elements (index kept in the output)}$
One-dimensional array:
>>> a = np.array([1, 2, 3, 4])
>>> np.einsum("m->", a)
10
>>> np.einsum("m->m", a)
array([1, 2, 3, 4])
One-dimensional arrays:
>>> a = np.array([1, 2, 3])
>>> b = np.array([-4, 5, -6])
>>> # dot product
>>> a @ b
-12
>>> np.einsum("m,m->", a, b)
-12
>>> # Element-wise product
>>> a * b
array([-4, 10, -18])
>>> np.einsum("m,m->m", a, b)
array([-4, 10, -18])
Higher-dimension arrays:
- Matrix product in einsum form: $\sum_m A_{(i),m} B_{m,(j)}$ is written "im,mj->ij"
>>> A = np.array([[1, 2], [-3, 4]])
>>> B = np.array([[0, 1], [1, 0]])
>>> A @ B
array([[ 2,  1],
       [ 4, -3]])
>>> np.einsum("im,mj->ij", A, B)
array([[ 2,  1],
       [ 4, -3]])
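The same index rules cover several other common matrix operations; a sketch reusing A from above (variable names here are illustrative):

```python
import numpy as np

A = np.array([[1, 2], [-3, 4]])

# Repeated index, dropped from the output: trace (sum of the diagonal)
trace = np.einsum("ii->", A)   # 1 + 4 = 5

# Both indices kept but swapped in the output: transpose
At = np.einsum("ij->ji", A)    # same as A.T

# Repeated index kept in the output: the diagonal itself
diag = np.einsum("ii->i", A)   # array([1, 4])
```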