**Union :**
$$p(A \lor B) = p(A)+p(B)-p(A \land B)$$
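As a quick sanity check, the union rule (inclusion–exclusion) can be verified numerically on a fair die; the two events below are illustrative choices:

```python
from fractions import Fraction

# Events on one fair die roll (illustrative):
# A = "roll is even", B = "roll > 3".
outcomes = range(1, 7)
A = {x for x in outcomes if x % 2 == 0}   # {2, 4, 6}
B = {x for x in outcomes if x > 3}        # {4, 5, 6}

def prob(event):
    return Fraction(len(event), 6)

lhs = prob(A | B)                       # p(A or B)
rhs = prob(A) + prob(B) - prob(A & B)   # inclusion-exclusion
assert lhs == rhs == Fraction(4, 6)
```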
**Joint :**
$$p(A, B) = p(A \land B) = p(A|B)p(B)$$ $$p(A) = \displaystyle\sum_b p(A, B = b) = \displaystyle\sum_b p(A|B = b)p(B = b)$$
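The product rule and the sum rule (marginalization) can be sketched on a small joint table; the numbers below are assumed for illustration:

```python
# Assumed toy joint distribution p(A, B) over weather outcomes.
joint = {
    ("rain", "cloudy"): 0.3,
    ("rain", "clear"):  0.1,
    ("dry",  "cloudy"): 0.2,
    ("dry",  "clear"):  0.4,
}

# Sum rule: p(A) = sum_b p(A, B = b)
def marginal_A(a):
    return sum(p for (a_, b), p in joint.items() if a_ == a)

p_rain = marginal_A("rain")   # 0.3 + 0.1 = 0.4

# Product rule: p(A, B) = p(A|B) p(B)
p_cloudy = sum(p for (a, b), p in joint.items() if b == "cloudy")
p_rain_given_cloudy = joint[("rain", "cloudy")] / p_cloudy
assert abs(p_rain_given_cloudy * p_cloudy - joint[("rain", "cloudy")]) < 1e-12
```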
**Conditional :**
$$p(A|B) = \dfrac{p(A, B)}{p(B)} \quad \text{if } p(B) > 0$$
**Bayes' Rule :**
$$p(X =x|Y =y) = \dfrac{p(X =x,Y =y)}{p(Y =y)} = \dfrac{p(X =x)p(Y =y|X =x)}{\sum_{x'}p(X ={x'})p(Y =y|X =x')}$$
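A worked Bayes'-rule example, with assumed numbers (1% prevalence, 90% sensitivity, 5% false-positive rate):

```python
# Assumed numbers for a diagnostic-test example.
p_disease = 0.01
p_pos_given_disease = 0.90
p_pos_given_healthy = 0.05

# Denominator: sum over x' of p(X=x') p(Y=pos | X=x')
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Bayes' rule: p(disease | positive test)
posterior = p_pos_given_disease * p_disease / p_pos
# ~0.154: even after a positive test, disease is still unlikely,
# because the prior p(disease) is small.
```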
**Unconditional Independence :**
$$p(X,Y)=p(X)p(Y)$$
**Conditional Independence :**
$$p(X, Y |Z) = p(X|Z)p(Y |Z)$$
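Conditional independence can be checked numerically. The toy distribution below is assumed, constructed so that Z influences both X and Y (so X and Y are conditionally independent given Z by construction), and the factorization is then verified from the full joint:

```python
import itertools

# Assumed toy model: p(x, y, z) = p(z) p(x|z) p(y|z).
p_z = {0: 0.5, 1: 0.5}
p_x_given_z = {0: {0: 0.8, 1: 0.2}, 1: {0: 0.3, 1: 0.7}}
p_y_given_z = {0: {0: 0.6, 1: 0.4}, 1: {0: 0.1, 1: 0.9}}

joint = {(x, y, z): p_z[z] * p_x_given_z[z][x] * p_y_given_z[z][y]
         for x, y, z in itertools.product([0, 1], repeat=3)}

# Recover the conditionals from the joint and check
# p(X, Y | Z) == p(X | Z) p(Y | Z) for every assignment.
for z in (0, 1):
    pz = sum(v for (x, y, z_), v in joint.items() if z_ == z)
    for x, y in itertools.product([0, 1], repeat=2):
        p_xy_z = joint[(x, y, z)] / pz
        p_x_z = sum(joint[(x, y2, z)] for y2 in (0, 1)) / pz
        p_y_z = sum(joint[(x2, y, z)] for x2 in (0, 1)) / pz
        assert abs(p_xy_z - p_x_z * p_y_z) < 1e-12
```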
For continuous data, examples include:
- Uniform Distribution
- Normal/Gaussian Distribution
- Exponential Distribution
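For reference, the pdfs of the distributions listed above can be evaluated with only the standard library; the default parameter values below are arbitrary choices:

```python
import math

def uniform_pdf(x, a=0.0, b=1.0):
    # Constant density 1/(b-a) on [a, b], zero outside.
    return 1.0 / (b - a) if a <= x <= b else 0.0

def normal_pdf(x, mu=0.0, sigma=1.0):
    # Gaussian density with mean mu and standard deviation sigma.
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2 * math.pi))

def exponential_pdf(x, lam=1.0):
    # Rate-lam exponential density, supported on x >= 0.
    return lam * math.exp(-lam * x) if x >= 0 else 0.0
```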
For discrete data, examples include:
- Binomial probability mass function
- Poisson probability mass function
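Both pmfs can be sketched in a few lines with the standard library:

```python
import math

def binomial_pmf(k, n, p):
    # Probability of k successes in n independent trials,
    # each succeeding with probability p.
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    # Probability of k events when the mean event count is lam.
    return lam**k * math.exp(-lam) / math.factorial(k)
```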
Discrete : $$ E[X] \triangleq \displaystyle\sum_{x \in X} x \space p(x) $$
Continuous : $$ E[X] \triangleq \displaystyle\int_{X} x \space p(x) \, dx $$
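For a concrete discrete case, the expectation of a fair six-sided die:

```python
from fractions import Fraction

# E[X] = sum_x x p(x) for a fair die: each face has probability 1/6.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}
expectation = sum(x * p for x, p in pmf.items())
assert expectation == Fraction(7, 2)   # E[X] = 3.5
```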
The $$x$$th percentile of a dataset is the value below which $$x\%$$ of the data points fall.
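A minimal nearest-rank sketch of the percentile idea (library routines such as `numpy.percentile` interpolate between points, so their results may differ slightly):

```python
import math

def percentile(data, pct):
    # Nearest-rank method: the smallest value such that at least
    # pct% of the data is <= that value.
    ordered = sorted(data)
    rank = math.ceil(pct / 100 * len(ordered))
    return ordered[max(rank - 1, 0)]
```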
Moments in mathematical statistics involve a basic calculation. These calculations can be used to find a probability distribution's mean, variance, and skewness.
For a discrete random variable with values $$x_1, x_2, \ldots, x_n$$, the $$s$$th (raw) moment of the data set is $$\dfrac{1}{n}\displaystyle\sum_{i=1}^{n} x_i^s$$
The first moment is simply the mean.
Moments about the mean (central moments)
- The first moment about the mean is zero
- The second moment about the mean is the variance
- The third moment about the mean measures skew (skewness is this moment standardized by $$\sigma^3$$)
- The fourth moment about the mean measures kurtosis (kurtosis is this moment standardized by $$\sigma^4$$)
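These moments can be computed directly from a sample; the data values below are arbitrary:

```python
def raw_moment(xs, s):
    # s-th raw moment: average of x^s.
    return sum(x**s for x in xs) / len(xs)

def central_moment(xs, s):
    # s-th moment about the mean: average of (x - mean)^s.
    m = raw_moment(xs, 1)   # first raw moment = mean
    return sum((x - m)**s for x in xs) / len(xs)

xs = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]   # mean = 5.0
assert central_moment(xs, 1) == 0.0   # first central moment is zero
assert central_moment(xs, 2) == 4.0   # second is the (population) variance
```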
The same definitions apply to continuous random variables, with sums replaced by integrals.
Covariance describes the degree to which two random variables (or sets of random variables) tend to deviate from their expected values in similar ways: $$\text{cov}[X, Y] \triangleq E\big[(X - E[X])(Y - E[Y])\big]$$
If $$\mathbf{x}$$ is a $$d$$-dimensional random vector, its covariance matrix is defined to be the following symmetric, positive semi-definite matrix: $$\text{cov}[\mathbf{x}] \triangleq E\big[(\mathbf{x} - E[\mathbf{x}])(\mathbf{x} - E[\mathbf{x}])^T\big]$$
Covariances can take any value in $$(-\infty, \infty)$$. Sometimes it is more convenient to work with a normalized measure that has a finite bound: the correlation coefficient $$\text{corr}[X, Y] \triangleq \dfrac{\text{cov}[X, Y]}{\sqrt{\text{var}[X]\,\text{var}[Y]}}$$ which always lies in $$[-1, 1]$$.
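A population-form sketch of covariance and correlation; the paired data below are assumed, chosen so the relationship is perfectly linear:

```python
import math

def mean(xs):
    return sum(xs) / len(xs)

def cov(xs, ys):
    # Population covariance: average product of deviations from the means.
    mx, my = mean(xs), mean(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)

def corr(xs, ys):
    # Normalize by the standard deviations (cov(x, x) is the variance).
    return cov(xs, ys) / math.sqrt(cov(xs, xs) * cov(ys, ys))

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]   # ys = 2 * xs: a perfect linear relationship
assert abs(corr(xs, ys) - 1.0) < 1e-12
```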
If there is a linear relationship between $$X$$ and $$Y$$, i.e. $$Y = aX + b$$ with $$a \neq 0$$, then $$\text{corr}[X, Y] = +1$$ if $$a > 0$$ and $$-1$$ if $$a < 0$$.
If X and Y are independent, meaning $$p(X, Y) = p(X)p(Y)$$, then $$\text{cov}[X, Y] = 0$$ and hence $$\text{corr}[X, Y] = 0$$. The converse does not hold: uncorrelated variables can still be dependent.
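A classic counterexample, with assumed values: independence implies zero covariance, but zero covariance does not imply independence, since $$Y = X^2$$ with $$X$$ symmetric about zero is fully determined by $$X$$ yet uncorrelated with it:

```python
# X uniform over these symmetric values (mean 0); Y = X^2 depends on X.
xs = [-2.0, -1.0, 0.0, 1.0, 2.0]
ys = [x**2 for x in xs]

mean_y = sum(ys) / len(ys)   # 2.0
# cov[X, Y] = E[(X - 0)(Y - E[Y])]
cov_xy = sum(x * (y - mean_y) for x, y in zip(xs, ys)) / len(xs)
assert cov_xy == 0.0   # zero covariance, yet Y is a function of X
```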