
I found the Lyapunov condition for applying the central limit theorem, which is useful in settings where one has to deal with non-identically distributed random variables:

Lyapunov CLT. Let $X_1, X_2, \ldots$ be independent random variables with $\mathbb{E}[X_i]=\mu_i$ and $\operatorname{Var}(X_i)=\sigma_i^2<\infty$, and let $s_n^2=\sum_{i=1}^n\sigma_i^2$. If there exists $\delta>0$ s.t.
$$\lim_{n\to\infty}\frac{1}{s_n^{2+\delta}}\sum_{i=1}^n\mathbb{E}\big[|X_i-\mu_i|^{2+\delta}\big]=0,$$
then $\dfrac{1}{s_n}\sum_{i=1}^n(X_i-\mu_i)$ converges in distribution to the standard normal distribution.

While I have no problem showing that this condition holds for certain exercises, I’m wondering what the intuition is behind this condition.
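(For context, a typical exercise-style check looks like this; the Bernoulli setting is my own illustrative example, not part of the theorem. Take independent $X_i\sim\mathrm{Bernoulli}(p_i)$ with $s_n^2=\sum_{i=1}^n p_i(1-p_i)\to\infty$. With $\delta=1$,
$$\mathbb{E}\big[|X_i-p_i|^3\big]=p_i(1-p_i)\big[(1-p_i)^2+p_i^2\big]\le p_i(1-p_i)=\sigma_i^2,$$
so
$$\frac{1}{s_n^{3}}\sum_{i=1}^n\mathbb{E}\big[|X_i-p_i|^3\big]\le\frac{s_n^2}{s_n^{3}}=\frac{1}{s_n}\longrightarrow 0,$$
and the Lyapunov condition holds.)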


Every CLT is basically a rendering of the fact that "many", "not-too-large", and "not-too-correlated" random increments average out in a bell shape. All three conditions ("many", "not-too-large", "not-too-correlated") are important. The Lyapunov condition is one way of quantifying the "not-too-large increments" condition.

Naturally, each random variable may be unbounded, hence the condition bears on their moments. Lyapunov showed that some averaged control on some $(2+\delta)$-moment was enough to guarantee the conclusion. "Averaged" is good here since it allows for some exceptionally large individual $(2+\delta)$-moments. Uniform $(2+\delta)$-integrability is more restrictive, and some variants of the CLT weaken this condition.

The fact that a "not-too-large increments" condition of some sort is required should not come as a surprise: if the size of some increment is not negligible, then its realization might noticeably influence the result.
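To make this concrete, here is a standard illustrative counterexample (my addition, not part of the original answer): take independent $X_i$ with $\mathbb{P}(X_i=2^i)=\mathbb{P}(X_i=-2^i)=\tfrac12$, so $\sigma_i^2=4^i$ and $s_n^2=\frac{4^{n+1}-4}{3}$. Then
$$\frac{\sigma_n^2}{s_n^2}=\frac{3\cdot 4^n}{4^{n+1}-4}\longrightarrow\frac34,$$
so the last increment alone always carries a fixed fraction of the total variance and is never "not too large"; one can check that $\frac{1}{s_n}\sum_{i=1}^n X_i$ converges in distribution to a uniform law on $[-\sqrt3,\sqrt3]$ rather than to a normal one.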

This is linked to the Lindeberg condition, namely, for each positive $\varepsilon$,
$$\lim_{n\to\infty}\frac{1}{s_n^2}\sum_{i=1}^n\mathbb{E}\big[(X_i-\mu_i)^2\,\mathbf{1}_{\{|X_i-\mu_i|>\varepsilon s_n\}}\big]=0.$$

We can prove the central limit theorem by checking that $\mathbb{E}\,f\big(\tfrac{1}{s_n}\sum_{i=1}^n(X_i-\mu_i)\big)\to\mathbb{E}\,f(Z)$, where $Z$ is a standard normal random variable, only for a class of smooth bounded functions. In particular, we don't need to do the test for all continuous bounded functions. Then, with this idea in mind, we can use Taylor's formula and the Lindeberg condition to control the remainder.
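Here is a hedged sketch of how that Taylor step goes (the auxiliary variables $Y_i$ and $W_i$ are mine, introduced only for this sketch): replace the summands of $\frac{1}{s_n}\sum_{i=1}^n(X_i-\mu_i)$ one at a time by independent Gaussians $Y_i\sim N(0,\sigma_i^2)$. Telescoping,
$$\mathbb{E}\,f\Big(\tfrac{1}{s_n}\textstyle\sum_i(X_i-\mu_i)\Big)-\mathbb{E}\,f\Big(\tfrac{1}{s_n}\textstyle\sum_i Y_i\Big)=\sum_{i=1}^n\Big(\mathbb{E}\,f\big(W_i+\tfrac{X_i-\mu_i}{s_n}\big)-\mathbb{E}\,f\big(W_i+\tfrac{Y_i}{s_n}\big)\Big),$$
where $W_i$ collects the remaining (partly replaced) terms and is independent of $X_i$ and $Y_i$, and $\frac{1}{s_n}\sum_i Y_i$ is exactly standard normal. Expanding $f$ around $W_i$ to second order, the constant, first-order, and second-order terms cancel because $X_i-\mu_i$ and $Y_i$ share the same mean and variance; only remainder terms survive, and it is the Lindeberg condition (after truncating at level $\varepsilon s_n$) that makes their sum vanish as $n\to\infty$.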

Now, if we are in the more favorable case in which we have moments of order $2+\delta$ for some positive $\delta$, then Lyapunov's condition
$$\lim_{n\to\infty}\frac{1}{s_n^{2+\delta}}\sum_{i=1}^n\mathbb{E}\big[|X_i-\mu_i|^{2+\delta}\big]=0$$
implies Lindeberg's one, and is easier to check.
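For completeness, the implication is a one-line truncation estimate (a standard argument, filled in here): on the event $\{|X_i-\mu_i|>\varepsilon s_n\}$ we have $(X_i-\mu_i)^2\le\dfrac{|X_i-\mu_i|^{2+\delta}}{(\varepsilon s_n)^{\delta}}$, hence
$$\frac{1}{s_n^2}\sum_{i=1}^n\mathbb{E}\big[(X_i-\mu_i)^2\,\mathbf{1}_{\{|X_i-\mu_i|>\varepsilon s_n\}}\big]\le\frac{1}{\varepsilon^{\delta}}\cdot\frac{1}{s_n^{2+\delta}}\sum_{i=1}^n\mathbb{E}\big[|X_i-\mu_i|^{2+\delta}\big]\xrightarrow[n\to\infty]{}0.$$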
