Probability Identities

1. Conditional Bayes’ Formula

Let $A, B, C$ be events. Then

$$ P(A \mid B, C) = \frac{P(C \mid A, B) \, P(A \mid B)}{P(C \mid B)}. $$

Application.
In estimating whether a coin is biased, suppose the first toss is $H$ and the second toss is also $H$:

  • $A =$ “the coin is biased”
  • $B =$ “first toss is $H$”
  • $C =$ “second toss is $H$”

Then Bayes’ rule lets you update the posterior after observing the second toss without recomputing from scratch.
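As a sketch, the sequential update can be written in a few lines of Python. The prior of 0.5 and the heads probabilities for the biased and fair coin are assumed values for illustration only:

```python
# Sequential Bayesian update for coin bias.
# Assumed for illustration: prior P(biased) = 0.5, a biased coin shows
# heads with probability 0.8, a fair coin with probability 0.5.
def update_on_heads(prior_biased, p_h_biased=0.8, p_h_fair=0.5):
    """Return P(biased | H) from P(biased) via Bayes' rule."""
    num = p_h_biased * prior_biased
    return num / (num + p_h_fair * (1 - prior_biased))

post_after_first = update_on_heads(0.5)                # posterior after first H
post_after_second = update_on_heads(post_after_first)  # reuse it; no recomputation
```

Feeding `post_after_first` back in gives the same answer as conditioning on both tosses at once, which is exactly what the identity above guarantees.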


Linear Algebra Identities

1. Sylvester’s Criterion

A symmetric matrix is positive semidefinite if and only if all of its principal minors (not only the leading ones) are nonnegative. For positive definiteness it suffices that all leading principal minors are strictly positive.
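A minimal numeric check of the definite case; the two matrices are arbitrary examples:

```python
import numpy as np

def leading_minors_positive(A):
    """Sylvester's criterion for positive definiteness: every leading
    principal minor of the symmetric matrix A must be strictly positive."""
    return all(np.linalg.det(A[:k, :k]) > 0 for k in range(1, A.shape[0] + 1))

A = np.array([[2.0, -1.0],
              [-1.0, 2.0]])   # eigenvalues 1 and 3: positive definite
B = np.array([[1.0, 2.0],
              [2.0, 1.0]])    # eigenvalues 3 and -1: indefinite
```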

2. Diagonal Dominance

A symmetric matrix that is strictly diagonally dominant with positive diagonal entries is positive definite. (Weak dominance only gives positive semidefiniteness: $\begin{pmatrix} 1 & -1 \\ -1 & 1 \end{pmatrix}$ is weakly dominant but singular.)
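One concrete instance, confirmed via the eigenvalues (the matrix is an arbitrary example):

```python
import numpy as np

# Symmetric, strictly diagonally dominant, positive diagonal => positive definite.
A = np.array([[4.0, 1.0, 1.0],
              [1.0, 3.0, 1.0],
              [1.0, 1.0, 5.0]])
row_dominant = all(A[i, i] > sum(abs(A[i, j]) for j in range(3) if j != i)
                   for i in range(3))
eigs = np.linalg.eigvalsh(A)  # all positive confirms positive definiteness
```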

3. Lévy–Desplanques Theorem

A strictly diagonally dominant matrix is nonsingular.

See:
https://planetmath.org/levydesplanquestheorem
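A quick numeric illustration; the matrix is an arbitrary strictly dominant, non-symmetric example:

```python
import numpy as np

# Lévy–Desplanques: a strictly diagonally dominant matrix is nonsingular.
A = np.array([[5.0, 1.0, 2.0],
              [0.0, 3.0, -1.0],
              [1.0, 1.0, 4.0]])
# strict dominance: |a_ii| > sum_{j != i} |a_ij| in every row
dominant = all(abs(A[i, i]) > sum(abs(A[i, j]) for j in range(3) if j != i)
               for i in range(3))
det = np.linalg.det(A)  # nonzero, as the theorem guarantees
```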

4. Covariance vs.\ Correlation Matrix

Let $C$ be the covariance matrix and $\tilde{C}$ the correlation matrix.
Let $D$ be diagonal with entries equal to marginal standard deviations. Then

$$ C = D \tilde{C} D. $$
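The identity is easy to verify with NumPy on random data (sample size, dimension, and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))            # 500 samples of 3 variables
C = np.cov(X, rowvar=False)              # covariance matrix C
C_tilde = np.corrcoef(X, rowvar=False)   # correlation matrix C-tilde
D = np.diag(np.sqrt(np.diag(C)))         # diagonal of marginal std devs
# identity: C = D @ C_tilde @ D
```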


Arithmetic Identities

1. Geometric Series

1.1 Infinite geometric series ($|p|<1$)

$$ \sum_{k=0}^{\infty} p^k = \frac{1}{1 - p}. $$

Proof. Let $S = \sum_{k=0}^{\infty} p^k$. Then

  • $S = 1 + p + p^2 + p^3 + \cdots$
  • $pS = p + p^2 + p^3 + \cdots$

Subtract:

$$ (1 - p) S = 1 \;\Rightarrow\; S = \frac{1}{1 - p}. $$

1.2 Finite geometric series ($p \neq 1$)

$$ \sum_{k=0}^{n - 1} p^k = \frac{1 - p^n}{1 - p}. $$
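Both forms can be spot-checked in Python for a sample value of $p$ (the choices $p = 0.5$ and $n = 20$ are arbitrary):

```python
p, n = 0.5, 20
finite = sum(p**k for k in range(n))      # sum_{k=0}^{n-1} p^k
partial = sum(p**k for k in range(200))   # long partial sum, close to 1/(1-p)
```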


2. Geometric Series II (Derivative Trick)

For $|p|<1$,

$$ \frac{1}{(1 - p)^2} = \sum_{k=1}^{\infty} k p^{k - 1}. $$
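A numeric check at one value ($p = 0.3$ is arbitrary; the partial sum converges quickly):

```python
p = 0.3
partial = sum(k * p**(k - 1) for k in range(1, 200))  # sum k p^(k-1)
closed_form = 1 / (1 - p) ** 2
```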


3. Summation I

$$ \sum_{k=1}^{n} k = \frac{n(n + 1)}{2}. $$


4. Summation II

$$ \sum_{k=1}^{n} k^2 = \frac{n(n+1)(2n+1)}{6}. $$

Proof (Telescoping)

Consider

$$ \sum_{k=1}^{n} \big[ (k+1)^3 - k^3 \big]. $$

The sum telescopes, so it equals $(n+1)^3 - 1$. Expanding each summand,

$$ (k+1)^3 - k^3 = 3k^2 + 3k + 1. $$

So

$$ (n+1)^3 - 1 = 3 \sum_{k=1}^n k^2 + 3 \sum_{k=1}^n k + n. $$

Substitute $\sum k = \frac{n(n+1)}{2}$ and solve for $\sum k^2$:

$$ \sum_{k=1}^n k^2 = \frac{1}{3} n^3 + \frac{1}{2} n^2 + \frac{1}{6} n = \frac{n(n+1)(2n+1)}{6}. $$
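Both summation formulas can be verified directly, e.g. for $n = 100$ (an arbitrary choice):

```python
n = 100
sum_k = sum(range(1, n + 1))                  # sum of k
sum_k2 = sum(k * k for k in range(1, n + 1))  # sum of k^2
```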


5. Cauchy–Schwarz Inequality

(a) Summation Form

$$ \left( \sum_{i=1}^n a_i b_i \right)^2 \le \left( \sum_{i=1}^n a_i^2 \right) \left( \sum_{i=1}^n b_i^2 \right). $$

(b) Integral Form

$$ \left( \int fg \right)^2 \le \left( \int f^2 \right) \left( \int g^2 \right). $$

(c) Expectation Form

$$ \mathbb{E}[XY]^2 \le \mathbb{E}[X^2] \, \mathbb{E}[Y^2]. $$

(d) Covariance Form

$$ \mathrm{Cov}(X,Y)^2 \le \mathrm{Var}(X) \, \mathrm{Var}(Y). $$
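The summation and covariance forms are easy to sanity-check on random vectors (sizes and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
a = rng.normal(size=10)
b = rng.normal(size=10)
lhs = np.dot(a, b) ** 2               # (sum a_i b_i)^2
rhs = np.dot(a, a) * np.dot(b, b)     # (sum a_i^2)(sum b_i^2)
cov = np.cov(a, b)                    # 2x2 sample covariance matrix
```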