constituents \mu the mean, \sigma^{2} the variance requirements \begin{equation} X \sim \mathcal{N}(\mu, \sigma^{2}) \end{equation} Its PDF is: \begin{equation} f(x) = \frac{1}{\sigma} \phi\!\left(\frac{x - \mu}{\sigma}\right) = \frac{1}{\sigma \sqrt{2\pi}} e^{-\frac{(x - \mu)^{2}}{2\sigma^{2}}} \end{equation}
where \phi is the standard normal density function. Its CDF is: \begin{equation} F(x) = \Phi\!\left(\frac{x - \mu}{\sigma}\right) \end{equation}
We can’t integrate \Phi into elementary functions, so we leave it as a special function. And its expectations: E(X) = \mu, Var(X) = \sigma^{2} additional information multivariate Gaussian density \begin{equation} z \sim \mathcal{N}\left(\mu, \Sigma\right) \end{equation} then \begin{equation} p(z) = (2\pi)^{-\frac{d}{2}} \left|\Sigma\right|^{-\frac{1}{2}} \exp\!\left(-\frac{1}{2}(z - \mu)^{\top} \Sigma^{-1} (z - \mu)\right) \end{equation} where d is the dimension of z.
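As a quick sanity check on the expectations above, we can integrate the univariate PDF numerically and recover total mass 1, mean \mu, and variance \sigma^{2}. This is a plain-Python sketch; the helper name `normal_pdf` and the grid bounds are my own choices.

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2)."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

# Midpoint-rule integration over a wide grid (+/- 10 sigma covers
# essentially all of the mass).
mu, sigma = 2.0, 3.0
lo, hi, n = mu - 10 * sigma, mu + 10 * sigma, 200_000
dx = (hi - lo) / n
xs = [lo + (i + 0.5) * dx for i in range(n)]

total = sum(normal_pdf(x, mu, sigma) for x in xs) * dx   # should be ~1
mean = sum(x * normal_pdf(x, mu, sigma) for x in xs) * dx  # should be ~mu
var = sum((x - mean) ** 2 * normal_pdf(x, mu, sigma) for x in xs) * dx  # ~sigma^2
```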
linear transformations on Gaussian For some: \begin{equation} Y = aX + b \end{equation}
where X \sim \mathcal{N}(\mu, \sigma^{2}), we will end up with another normal Y \sim \mathcal{N}(a\mu + b, a^{2}\sigma^{2}) such that: mean: a\mu + b variance: a^{2}\sigma^{2} standard normal The standard normal is: \begin{equation} Z \sim \mathcal{N}(0, 1) \end{equation}
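The closure under linear transformation can be checked by simulation: sample X, apply Y = aX + b, and compare the sample moments to a\mu + b and a^{2}\sigma^{2}. A sketch using the standard library's `random.gauss` (the seed and parameter values are arbitrary):

```python
import random

random.seed(0)  # deterministic sampling

mu, sigma = 2.0, 3.0   # X ~ N(2, 9)
a, b = 2.0, 1.0        # Y = aX + b

samples = [a * random.gauss(mu, sigma) + b for _ in range(100_000)]
sample_mean = sum(samples) / len(samples)
sample_var = sum((y - sample_mean) ** 2 for y in samples) / len(samples)

# Theory: Y ~ N(a*mu + b, a^2 * sigma^2) = N(5, 36)
```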
mean 0, variance 1. You can transform any normal random variable into a standard normal via the following linear transform: transformation into standard normal \begin{equation} X \sim \mathcal{N}(\mu, \sigma^{2}) \end{equation} and we can shift it into a standard normal with: \begin{equation} Z = \frac{X - \mu}{\sigma} \end{equation}
therefore, we can derive the CDF of the normal distribution by shifting it back into the center: \begin{equation} F(x) = P(X \le x) = \Phi\!\left(\frac{x - \mu}{\sigma}\right) \end{equation}
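Standardizing is also how \Phi (and hence any normal CDF) is computed in practice, via the error function \Phi(z) = \tfrac{1}{2}(1 + \mathrm{erf}(z/\sqrt{2})). A small sketch (function names are my own):

```python
import math

def std_normal_cdf(z):
    """Phi(z) = (1 + erf(z / sqrt(2))) / 2."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def normal_cdf(x, mu, sigma):
    """CDF of N(mu, sigma^2) by standardizing: F(x) = Phi((x - mu) / sigma)."""
    return std_normal_cdf((x - mu) / sigma)

# At the mean, half the mass lies below; one sigma above covers ~84.13%.
```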
normal maximizes entropy Among all distributions with a given mean and variance, the normal has the maximum entropy; no other random variable uses as few parameters to convey as much information. approximation of binomial distribution with normal distribution You can use a normal distribution to approximate a binomial distribution. However, be aware of the continuity correction. adding Gaussian distributions For independent X \sim \mathcal{N}(\mu_{X}, \sigma_{X}^{2}) and Y \sim \mathcal{N}(\mu_{Y}, \sigma_{Y}^{2}): \begin{equation} X + Y \sim \mathcal{N}(\mu_{X} + \mu_{Y},\; \sigma_{X}^{2} + \sigma_{Y}^{2}) \end{equation}
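The continuity correction replaces P(X \le k) with \Phi\!\left(\frac{k + 0.5 - np}{\sqrt{np(1-p)}}\right), since the discrete bar for k spans [k - 0.5, k + 0.5]. A sketch comparing it against the exact binomial tail sum (the example parameters are arbitrary):

```python
import math

def std_normal_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

n, p, k = 50, 0.3, 15

# Exact: P(Binomial(n, p) <= k) by summing the PMF.
exact = sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

# Normal approximation with continuity correction:
# P(X <= k) ~ Phi((k + 0.5 - n*p) / sqrt(n*p*(1-p)))
mu = n * p                        # binomial mean
sd = math.sqrt(n * p * (1 - p))   # binomial standard deviation
approx = std_normal_cdf((k + 0.5 - mu) / sd)
```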
conditioning Gaussian distributions For jointly Gaussian random vectors a, b, we obtain: \begin{equation} \begin{pmatrix} a \\ b \end{pmatrix} \sim \mathcal{N}\!\left(\begin{pmatrix} \mu_{a} \\ \mu_{b} \end{pmatrix}, \begin{pmatrix} \Sigma_{aa} & \Sigma_{ab} \\ \Sigma_{ba} & \Sigma_{bb} \end{pmatrix}\right) \end{equation}
meaning, each one can be marginalized as: \begin{equation} a \sim \mathcal{N}(\mu_{a}, \Sigma_{aa}), \quad b \sim \mathcal{N}(\mu_{b}, \Sigma_{bb}) \end{equation}
Conditioning works too with those terms; for a \mid b: \begin{equation} a \mid b \sim \mathcal{N}\!\left(\mu_{a} + \Sigma_{ab} \Sigma_{bb}^{-1} (b - \mu_{b}),\; \Sigma_{aa} - \Sigma_{ab} \Sigma_{bb}^{-1} \Sigma_{ba}\right) \end{equation}
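In the simplest case where a and b are scalars, \Sigma_{ab} = \Sigma_{ba} and the conditioning formula reduces to plain arithmetic. A sketch (the function name and example numbers are my own; with Var(a) = 4, Var(b) = 1, Cov(a, b) = 1, observing b = 2 pulls the mean of a up and shrinks its variance):

```python
def condition_gaussian(mu_a, mu_b, s_aa, s_ab, s_bb, b):
    """Conditional distribution a | b for jointly Gaussian scalars a, b.

    mean:     mu_a + s_ab / s_bb * (b - mu_b)
    variance: s_aa - s_ab^2 / s_bb
    (Sigma_ab = Sigma_ba in the scalar case.)
    """
    cond_mean = mu_a + (s_ab / s_bb) * (b - mu_b)
    cond_var = s_aa - (s_ab / s_bb) * s_ab
    return cond_mean, cond_var

# Var(a)=4, Var(b)=1, Cov(a,b)=1 (correlation 0.5), observe b=2.
m, v = condition_gaussian(0.0, 0.0, 4.0, 1.0, 1.0, 2.0)
```

Note that when Cov(a, b) = 0 the formula returns the marginal unchanged, matching independence.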
standard normal density function This is the density of the standard normal, in terms of which any Gaussian density can be written: \begin{equation} \phi(x) = \frac{1}{\sqrt{2\pi}} e^{-\frac{x^{2}}{2}} \end{equation}
\begin{equation} \Phi(x) = \int_{-\infty}^{x} \phi(t)\, dt \end{equation} This function is the CDF of the standard normal. The standard normal density function is also symmetric: \begin{equation} \phi(-x) = \phi(x) \end{equation} and hence \Phi(-x) = 1 - \Phi(x).
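Both symmetry identities are easy to confirm numerically (a small sketch; `phi` and `Phi` mirror the notation above, with \Phi computed via the error function):

```python
import math

def phi(x):
    """Standard normal density: exp(-x^2/2) / sqrt(2*pi)."""
    return math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)

def Phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Symmetry: phi(-x) == phi(x) and Phi(-x) == 1 - Phi(x) for any x.
checks = [(phi(-x) - phi(x), Phi(-x) - (1.0 - Phi(x))) for x in (0.5, 1.3, 2.7)]
```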