
Normal distribution

Concept

The normal distribution is also known as the Gaussian distribution. Its density curve is the familiar bell shape.

The curve is high in the middle and low on both sides. A random variable X follows the normal distribution N(μ, σ²) when its probability density function is

$$f(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{(x-\mu)^{2}}{2\sigma^{2}}}$$

The curve is symmetric about the expected value μ, so the probability density values at equal distances to the left and right of the center are equal.
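As a quick sanity check of the density formula, here is a minimal sketch that evaluates it directly and compares the result with SciPy's reference implementation (NumPy and SciPy are assumed to be available; the values of μ and σ are arbitrary and chosen only for illustration):

```python
import numpy as np
from scipy.stats import norm

mu, sigma = 0.0, 1.5          # arbitrary parameters, for illustration only
x = np.linspace(mu - 4 * sigma, mu + 4 * sigma, 9)

# Density written out exactly as in the formula above
pdf_manual = np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

# SciPy's implementation of the same density
pdf_scipy = norm.pdf(x, loc=mu, scale=sigma)

print(np.allclose(pdf_manual, pdf_scipy))  # True: both give the same bell curve
```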

What are the basic properties of a normal distribution?

A normal distribution is not skewed: its expectation equals its median. From the shape of the curve it is clear that most samples fall near the expected value, which leads to the three-sigma (3σ) rule. How does the three-sigma rule relate to the normal distribution? It states that 68.27%, 95.45%, and 99.73% of samples fall within one sigma, two sigma, and three sigma of the mean, respectively. The probability of a sample falling outside three sigma is only 0.27%, so such a value is usually treated not as random error but as a gross error that should be eliminated.
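The 68.27% / 95.45% / 99.73% figures can be reproduced from the normal cumulative distribution function. A minimal sketch, assuming SciPy is available:

```python
from scipy.stats import norm

# P(|X - mu| <= k*sigma) for a normal variable, from the standard normal CDF
for k in (1, 2, 3):
    p_inside = norm.cdf(k) - norm.cdf(-k)
    print(f"within {k} sigma: {p_inside:.4%}, outside: {1 - p_inside:.4%}")

# within 1 sigma: 68.2689%, outside: 31.7311%
# within 2 sigma: 95.4500%, outside:  4.5500%
# within 3 sigma: 99.7300%, outside:  0.2700%
```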

How to explain the normal distribution in plain language

Take grades (or height) as an example. Everyone's grades are different: some people are smart and diligent and score better, some are lazy and score worse. If you tally the final exam scores of every student in the grade and plot them, you will find that most people fall in the middle of the range, while only a few score extremely well and a few score extremely badly. The resulting picture is high in the middle and low on both sides, and that is a normal distribution.

The law of large numbers

The law of large numbers says that if the random experiment corresponding to a random variable X is repeated many times, the sample mean of X tends to E(X) as the number of repetitions increases. There are three commonly used laws of large numbers, compared in the table below and illustrated by the simulation sketch that follows it.

  1. Khinchin's law of large numbers: for independent and identically distributed random variables with E(X) = μ, the arithmetic mean converges in probability to μ.

  2. Bernoulli's law of large numbers: let μₙ be the number of times event A occurs in n independent trials, and let p be the probability of A in each trial. Then the frequency μₙ/n converges in probability to p, so for large n the frequency of event A can be used as an estimate of its probability.

  3. Chebyshev's law of large numbers: the sample mean converges in probability to the average of the expectations. It is more general than Khinchin's law because it does not require identical distributions, only independence (or uncorrelatedness) and bounded variances. The table below briefly compares the common laws of large numbers and the differences between them.

| Law | Distribution requirement | Expectation | Variance | Conclusion |
| --- | --- | --- | --- | --- |
| Khinchin's law of large numbers | Independent and identically distributed | Same | Same | Estimates the expectation |
| Bernoulli's law of large numbers | Binomial distribution | Same | Same | Frequency approximates probability (estimates the probability) |
| Chebyshev's law of large numbers | Independent or uncorrelated | Exists | Exists (bounded) | Estimates the expectation |
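To see the law of large numbers in action, the sketch below simulates both the Bernoulli case (frequency of an event approaching its probability) and the Khinchin case (sample mean approaching the expectation). NumPy is assumed to be available; the seed, the probability p = 0.3, and the exponential distribution with mean 2 are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)   # fixed seed, chosen arbitrarily
p = 0.3                          # hypothetical probability of event A

# Bernoulli's case: the frequency of A over n independent trials approaches p
for n in (10, 100, 10_000, 1_000_000):
    trials = rng.random(n) < p          # True when event A occurs
    frequency = trials.mean()           # observed frequency mu_n / n
    print(f"n = {n:>9}: frequency = {frequency:.4f} (p = {p})")

# Khinchin's case: the sample mean of i.i.d. exponential variables with
# E(X) = 2 approaches 2 as n grows
for n in (10, 100, 10_000, 1_000_000):
    sample_mean = rng.exponential(scale=2.0, size=n).mean()
    print(f"n = {n:>9}: sample mean = {sample_mean:.4f} (E(X) = 2.0)")
```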

Central limit theorem

Let X₁, X₂, …, Xₙ be independent and identically distributed random variables with E(X) = μ and D(X) = σ². When n is large enough, the distribution of the sample mean is close to the normal distribution N(μ, σ²/n). Standardizing the mean gives a distribution close to the standard normal distribution N(0, 1). As the sample size increases, the distribution of the mean tends more and more toward a normal distribution, and its variance σ²/n shrinks as n grows. In short, as the number of trials increases, the average of a set of independent and identically distributed variables can be treated as approximately normally distributed.
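A minimal sketch of the theorem: draw many samples of size n from a clearly non-normal distribution (an exponential distribution here, an arbitrary choice), standardize each sample mean, and check that the results look like N(0, 1). NumPy is assumed to be available, and the seed and sample sizes are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)      # arbitrary seed
mu, sigma = 2.0, 2.0                # mean and std of an exponential with scale=2
n, n_samples = 50, 100_000          # sample size and number of repetitions

# Means of n_samples independent samples, each of size n
samples = rng.exponential(scale=2.0, size=(n_samples, n))
means = samples.mean(axis=1)

# Standardize: (mean - mu) / (sigma / sqrt(n)) should be close to N(0, 1)
z = (means - mu) / (sigma / np.sqrt(n))
print(f"mean of z = {z.mean():.3f} (expect 0), std of z = {z.std():.3f} (expect 1)")

# The variance of the raw means shrinks like sigma^2 / n
print(f"variance of the means = {means.var():.4f}, sigma^2 / n = {sigma**2 / n:.4f}")
```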