
Let $(X_n)_{n\in\mathbb N}$ be a sequence of independent random variables with the same distribution. The common distribution $\mu$ is such that it is symmetric, that is, $\mu((-\infty,x])=\mu([-x,\infty))$ for every $x\in\mathbb R$, and it has no first moment, that is, $$\int_{-\infty}^{\infty}|x|\,\mathrm d\mu(x)=\infty.$$

Are there any well-known results as to whether $$\frac{X_1+\cdots+X_n}{n}$$ converges in law as $n\to\infty$? My preliminary experimentation (and the fact that the sample mean of standard Cauchy random variables is again standard Cauchy) suggests that the answer may be positive and that the limiting distribution is Cauchy, but I'm not sure. Any thoughts would be appreciated.
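
For reference, here is a minimal R sketch of the kind of experiment I have in mind for the Cauchy case: the empirical quantiles of the sample mean of standard Cauchy variables should match those of a single standard Cauchy variable for every $n$.

set.seed(1)
n <- 100; cases <- 10^4
# sample mean of n standard Cauchy draws, replicated 'cases' times
means <- rowMeans(matrix(rcauchy(n * cases), nrow = cases))
probs <- c(0.1, 0.25, 0.5, 0.75, 0.9)
rbind(empirical = quantile(means, probs),  # quantiles of the sample means
      cauchy    = qcauchy(probs))          # exact standard Cauchy quantiles

The two rows should agree up to simulation noise, for any choice of $n$.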


UPDATE: Let me refine my conjecture—my experimentation suggests the refined version holds. Suppose, in addition, that the support of $\mu$ consists of countably infinitely many points on the real line: $$\cdots\qquad-x_3\qquad-x_2\qquad-x_1\qquad0\qquad x_1\qquad x_2\qquad x_3\qquad\cdots$$ where $0<x_1<x_2<x_3<\cdots$ are distinct positive numbers. The probability of the point $\{0\}$ is $p_0\geq0$ and the probability of the points $\{x_k\}$ and $\{-x_k\}$ is $p_k/2\geq 0$ each for every $k\in\mathbb N$. Of course, $p_0+\sum_{k=1}^{\infty}p_k=1$ and, since the first moment does not exist, $$\sum_{k=1}^{\infty}p_kx_k=\infty.\tag{$\star$}$$ Now, the characteristic function of the distribution is $$t\mapsto p_0+\sum_{k=1}^{\infty}p_k\cos(tx_k),$$ so that the characteristic function of the sample mean for any $n\in\mathbb N$ is $$t\mapsto \left[p_0+\sum_{k=1}^{\infty}p_k\cos\left(\frac{t}{n}x_k\right)\right]^n.$$

Under ($\star$), does there exist some $\gamma>0$ such that this expression converges to $$\exp(-\gamma|t|)$$ for every $t\in\mathbb R$ as $n\to\infty$?
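
For concreteness, here is a numerical check in R for one particular choice of $\mu$ satisfying ($\star$) (my choice, for illustration only): $p_0=0$, $x_k=k$ and $p_k=(6/\pi^2)/k^2$. For this distribution, the Fourier identity $\sum_{k\geq1}\cos(ks)/k^2=\pi^2/6-\pi s/2+s^2/4$ (valid for $0\leq s\leq2\pi$) gives $p_0+\sum_k p_k\cos(sx_k)=1-\frac{3}{\pi}|s|+O(s^2)$, so the expected limit is $\exp(-\gamma|t|)$ with $\gamma=3/\pi$.

# characteristic function of mu, truncated at K terms
# (the neglected tail mass is ~ 6/(pi^2 * K), negligible here)
K <- 10^6
k <- 1:K
p <- (6 / pi^2) / k^2
phi <- function(s) sum(p * cos(s * k))

n <- 10^4
t <- seq(0.5, 4, by = 0.5)
cbind(t,
      cf_mean = sapply(t, function(tt) phi(tt / n)^n),  # cf of the sample mean
      limit   = exp(-(3 / pi) * t))                     # conjectured Cauchy limit

The two numerical columns should agree closely, consistent with $\gamma=3/\pi$ for this example.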

5 Comments

  • If you want the denominator to be $n$, then this is the law of large numbers, not the CLT. There are many posts on MSE about the infinite-mean case. Commented Nov 22 at 4:27
  • @KaviRamaMurthy Fair point, although I decided to use the CLT analogy because I expect convergence in law, not almost-sure convergence as with the LLN. Commented Nov 22 at 5:03
  • If such convergence in law holds, then the limit must be a stable random variable. If, for example, $t^\alpha \mu((t,\infty)) \to c \in (0,\infty)$ for some $\alpha \in (0,1)$ (note that $\mu$ then has no first moment), we would get $n^{-1/\alpha}(X_1+\cdots+X_n) \to Z$, where $Z$ is some non-degenerate stable random variable. In particular, $n^{-1}(X_1+\cdots+X_n) = n^{\frac{1}{\alpha}-1} \cdot n^{-1/\alpha}(X_1+\cdots+X_n)$ cannot converge in law to any non-degenerate random variable. What you are looking for is a generalized version of the CLT for stable distributions; see the sketch after these comments. Commented Nov 22 at 11:36
  • But with stable distributions that have heavier tails than the Cauchy distribution, the distribution of $\frac{X_1+\cdots+X_n}{n}$ does not stabilise as $n$ increases, and you have to divide by something growing faster than $n$. Commented Nov 22 at 16:03
  • Thank you all for your comments! What if the support of $\mu$ consists entirely of integers (in particular, the support is countable and discrete)? My experimentation suggests that the characteristic function of the mean does converge pointwise to a function of the form $t\mapsto\exp(-\gamma|t|)$ as $n\to\infty$ for some $\gamma>0$, which would mean the limiting law is Cauchy. Commented Nov 22 at 19:31
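
To illustrate the scaling described in the comments, here is a small R sketch (my own illustration, not from the thread) using symmetric Pareto-type tails with $\alpha=1/2$, i.e. $X=\pm U^{-2}$ with $U$ uniform on $(0,1)$, so that $P(|X|>t)=t^{-1/2}$ for $t\geq1$. The interquartile range of $n^{-1/\alpha}S_n=n^{-2}S_n$ stays of the same order as $n$ grows, while that of $n^{-1}S_n$ keeps growing, roughly like $n$.

set.seed(1)
alpha <- 1/2; cases <- 2000
# S_n for 'cases' independent replications; X = sign * U^(-1/alpha)
sim <- function(n) replicate(cases,
  sum(sign(runif(n) - 0.5) * runif(n)^(-1/alpha)))
for (n in c(10, 100, 1000)) {
  s <- sim(n)
  cat("n =", n,
      "IQR(S_n/n) =", signif(IQR(s / n), 3),
      "IQR(S_n/n^2) =", signif(IQR(s / n^(1/alpha)), 3), "\n")
}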

1 Answer


As I noted in a comment, with stable distributions that have heavier tails than the Cauchy distribution, the distribution of $\frac{X_1+\cdots+X_n}{n}$ does not stabilise as $n$ increases; you have to divide by something growing faster than $n$.

An example of heavier tails could be taking standard Cauchy distributed random variables and raising their magnitudes to some power greater than $1$ (while preserving their signs to maintain the symmetrical nature of the distribution). You wondered in a comment whether restricting the random variables to support on the integers $\mathbb Z$ might help; it does not, and you would get much the same result if you rounded the resulting random variables to the nearest integer.

One way of seeing whether the averages approach a stable distribution is to look at how their quartiles change as $n$ increases: if the averages converge in law, the quartiles should stabilise (the first, second and third quartiles of a standard Cauchy distribution are $-1,0,+1$).

Here is a simulated example using R, with $10^5$ cases and $n$ going up to $1000$ with a power of $1$ (i.e. rounded Cauchy random variables), where the stability is clear after allowing for moderate simulation noise.

rx <- function(cases, pow){
  # draw standard Cauchy variables, raise the magnitudes to the power 'pow'
  # keeping the signs, and round to the nearest integer (integer support)
  x <- rcauchy(cases)
  round(abs(x)^pow * sign(x))
  }

quartiles <- function(maxn, cases, pow){
  # track the running mean of n draws for each of 'cases' replications;
  # row n of the result holds the three empirical quartiles at sample size n
  aver <- numeric(cases)
  quartilematrix <- matrix(numeric(3*maxn), ncol=3)
  for (n in 1:maxn){
    aver <- aver + (rx(cases, pow) - aver) / n  # incremental mean update
    quartilematrix[n,] <- quantile(aver, c(1/4,1/2,3/4))
    }
  quartilematrix
  }

set.seed(2025)
matplot(quartiles(maxn=1000, cases=10^5, pow=1), 
  pch=".", main="Quartiles of averages", xlab="n", ylab="quartile")

[Plot: Quartiles of averages (pow = 1)]

Looking instead at a power of $5$ produces a very different picture of the quartiles, illustrating that heavier-than-Cauchy tails make the distribution of the averages keep widening as $n$ increases: here $P(|X|>t)$ decays like $t^{-1/5}$, so the natural normalisation is $n^{5}$ rather than $n$, and dividing by $n$ leaves the quartiles growing roughly like $n^{4}$.

matplot(quartiles(maxn=1000, cases=10^5, pow=5), 
  pch=".", main="Quartiles of averages", xlab="n", ylab="quartile")

[Plot: Quartiles of averages (pow = 5)]

