
Questions tagged [law-of-large-numbers]

Several theorems stating that the sample mean converges to the expected value as $n\to\infty$. There is a weak law and a strong law of large numbers.

-4 votes
1 answer
60 views

The statistical null hypothesis significance test (NHST) shows that a 12-member jury's unanimous vote passes the test at a 95.45% confidence level, but fails to pass the test at a 99.99% confidence level ...
jamil kazoun's user avatar
4 votes
1 answer
125 views

In Newey & McFadden (1994), Large Sample Estimation and Hypothesis Testing (Handbook of Econometrics, Ch. 36), they extend ULLN results from i.i.d. data to stationary ergodic sequences, e.g.: “...
spie227's user avatar
  • 242
0 votes
1 answer
78 views

My Set-Up: In Wooldridge (1994), Theorem 4.2 ("UWLLN for the heterogeneous case") gives conditions for the validity of the uniform weak law of large numbers (UWLLN) for the loss function for ...
Red's user avatar
  • 367
2 votes
2 answers
195 views

I'm trying to clarify my understanding of how convergence is defined in two different settings in probability theory: Law of Large Numbers (LLN): We define the sample average as $$ x_n(\omega) = \...
spie227's user avatar
  • 242
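The excerpt above is cut short, but the convergence it refers to is easy to visualize numerically. Below is a minimal simulation sketch (assuming, purely for illustration, i.i.d. Exponential(1) draws, which the question does not specify) showing the running sample average $x_n(\omega)$ for one fixed $\omega$ settling near the expected value:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=100_000)      # i.i.d. draws with E[X] = 1

# Running sample averages x_n = (1/n) * sum_{i<=n} x_i along one sample path
running_mean = np.cumsum(x) / np.arange(1, x.size + 1)

for n in (10, 100, 1_000, 10_000, 100_000):
    print(n, running_mean[n - 1])                 # drifts toward the true mean 1.0
```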
0 votes
0 answers
38 views

I have a sequence of observations from random variables, which can be: IID (Independent and Identically Distributed), or Non-IID (Independent but not Identically Distributed). If I apply the same ...
spie227's user avatar
  • 242
2 votes
1 answer
157 views

I have the following situation. Let $f:\mathbb{R}^p \times \Theta \to \mathbb{R}$ be a measurable function. Moreover, let $X_n$ be a sequence of real-valued random vectors. I know that the function ...
Treebeard's user avatar
0 votes
0 answers
58 views

Hello guys, in my book there is an inequality which I do not understand (line 2 in the image). They want to prove the weak law of large numbers; it says that I can take any k that is greater than... but ...
Roei Nahmany's user avatar
2 votes
1 answer
102 views

Let's say I have some random variables, $X_1, X_2, ...$ which are identically distributed with finite expected value, $E[X_1]$ and say they satisfy the requirements of the law of large numbers. Let $$\...
roundsquare's user avatar
1 vote
1 answer
95 views

Let's say I have a probability space with random variables $X_1, X_2, \ldots$. These random variables have a parameter $\Theta$. Given $\Theta$, $X_1, X_2, \ldots$ are iid. This implies that, conditional on $\Theta$, the sample mean ...
SJ _88's user avatar
  • 41
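A quick way to see what "conditional on $\Theta$" buys you is to simulate the two-stage draw. The sketch below uses a hypothetical setup ($\Theta$ uniform on $(0,1)$ and $X_i \mid \Theta \sim \mathrm{Bernoulli}(\Theta)$; the excerpt does not specify the distributions) and shows the sample mean converging to the realized $\Theta$, not to $E[\Theta]$:

```python
import numpy as np

rng = np.random.default_rng(1)

theta = rng.uniform()                        # one realization of the parameter
x = rng.binomial(1, theta, size=200_000)     # conditionally i.i.d. Bernoulli(theta) draws

print("realized theta:", theta)
print("sample mean   :", x.mean())           # close to theta, not to E[Theta] = 0.5
```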
1 vote
0 answers
63 views

In the body of literature on Association Rule Mining (the apriori algorithm is one example) there's a lot of information about the usage of many metrics, among them 'support'. Support is defined as the ...
Oscar Flores's user avatar
2 votes
1 answer
217 views

Let $\theta_N=\frac{1}{N}\sum_{i=1}^N \pi_i\cdot g_i$ where $0<\pi_i<1$ and $0<g_i<1/\pi_i$ such that $\theta_N\overset{N\rightarrow \infty}{\rightarrow}\theta$. If $X_i\sim Ber(\pi_i)$, I ...
Pierfrancesco Alaimo Di Loro's user avatar
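The excerpt cuts off before the actual question, but a natural guess is whether $\frac{1}{N}\sum_i X_i g_i$ tracks $\theta_N$, since $E[X_i g_i] = \pi_i g_i$. A minimal simulation sketch under one hypothetical choice of $\pi_i$ and $g_i$ (not taken from the post):

```python
import numpy as np

rng = np.random.default_rng(2)
N = 500_000

pi = rng.uniform(0.2, 0.8, size=N)           # hypothetical probabilities, 0 < pi_i < 1
g = rng.uniform(0.1, 0.9, size=N) / pi       # satisfies 0 < g_i < 1/pi_i

theta_N = np.mean(pi * g)                    # target quantity theta_N
x = rng.binomial(1, pi)                      # independent Ber(pi_i), not identically distributed

print("theta_N          :", theta_N)
print("mean of X_i * g_i:", np.mean(x * g))  # close to theta_N (LLN for independent, non-i.i.d. terms)
```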
2 votes
1 answer
101 views

Let $(X_i, Y_i)_{i=1}^{\infty}$ be iid continuous random vectors with continuous joint density, where $X_1$ has support $\mathcal{X}$. Let $B_n\subset \mathcal{X}\subset\mathbb{R}$ be decreasing ...
Albert Paradek's user avatar
2 votes
0 answers
82 views

I'm trying to prove that: Given a sequence $(X_n)_{n \geq 1}$ of independent and identically distributed random variables, $E(X_i^2) < +\infty$ for all $i \geq 1$, then $$\frac{2}{n(n-1)}\sum\...
Konstante's user avatar
  • 121
5 votes
1 answer
610 views

Here are two popular principles in statistics: 1) Law of Large Numbers: Let $X$ be a random variable with a probability density function $f(x)$ and an expected value $E[X] = \mu$. If we take a sample ...
Uk rain troll's user avatar
3 votes
1 answer
125 views

Let us have an AR(1) model with an individual effect $$y_t = \alpha + \theta y_{t-1} + \varepsilon_t$$ with $|\theta|<1$ for stationarity and $\varepsilon_t$ i.i.d. from a distribution with mean $0$ and ...
David Paleček's user avatar
1 vote
0 answers
91 views

I am working with a function which has the form $f(X_1, \dots, X_n)$, where $\{X_n\}$ is an ergodic stationary process. Theorem 5.6 in "A First Course in Stochastic Processes" by Karlin &...
Kristan's user avatar
  • 11
1 vote
0 answers
51 views

This is a question about convergence in probability and boundedness in probability. Suppose $X_i \overset{\textrm{i.i.d.}}{\sim} (\mu, \sigma^2 )$ for $i=1,2, \cdots, n$. Denote $\overline{X}$ and $\...
d8g3n1v9's user avatar
  • 111
6 votes
3 answers
604 views

Let $X_i$ be iid random variables. How does one show that $$ \frac{1}{n(n-1)}\sum_{i\neq j}^n \sin\left(X_i X_j\right) $$ converges almost surely to a constant?
Yashaswi Mohanty's user avatar
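Not a proof of the almost-sure convergence, but a simulation sketch can make the claim concrete. Assuming, for illustration, standard normal $X_i$ (the question leaves the distribution unspecified), the statistic stabilizes as $n$ grows; by U-statistic / SLLN arguments the limit is $E[\sin(X_1 X_2)]$:

```python
import numpy as np

rng = np.random.default_rng(3)

def u_stat(x):
    """(1 / (n(n-1))) * sum over i != j of sin(x_i * x_j)."""
    n = x.size
    s = np.sin(np.outer(x, x))
    return (s.sum() - np.trace(s)) / (n * (n - 1))   # drop the i == j diagonal terms

for n in (100, 1_000, 3_000):
    x = rng.standard_normal(n)
    print(n, u_stat(x))    # settles near E[sin(X_1 X_2)], which is 0 for symmetric X
```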
1 vote
1 answer
125 views

Suppose we have observations $x_1, x_2, \ldots, x_n$ where $n$ is very large. Now we standardize the observations as $$y_i=\frac{x_i-\bar{x}}{\frac{s}{\sqrt{n}}},$$ where $s=\frac{\sum\limits_{i=1}^n(...
user2545's user avatar
  • 217
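The excerpt is truncated, but note that dividing by $s/\sqrt{n}$ (the standard error of the mean, not the standard deviation) inflates each $y_i$ by a factor of $\sqrt{n}$. A quick numerical check, assuming hypothetical standard normal data and the $n-1$ denominator for $s$ (the post's definition is cut off):

```python
import numpy as np

rng = np.random.default_rng(4)

n = 10_000
x = rng.standard_normal(n)                   # hypothetical data; the post doesn't give any
s = x.std(ddof=1)

y = (x - x.mean()) / (s / np.sqrt(n))        # standardizing by the standard error of the mean

print("spread of (x - xbar)/s:", ((x - x.mean()) / s).std())   # about 1
print("spread of y           :", y.std())                      # about sqrt(n) = 100
```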
1 vote
0 answers
56 views

Q1, parts A & B. I have so far $$\lim_{n\rightarrow\infty} \frac{1}{n}\sum_{i=1}^n I(T_i>x)$$ Since we are summing an indicator variable, we can say it has a Bernoulli distribution with ...
laxfan1212's user avatar
2 votes
1 answer
236 views

I recently watched an episode of "The Big Bang Theory" where Sheldon makes a comment about the Law of Large Numbers. In the episode, Sheldon realizes he needs eggs, and almost immediately ...
Daria's user avatar
  • 23
3 votes
1 answer
117 views

This has probably been asked before, as this is (I think) a fundamental result in statistical theory, but I don't know what it is called, hence I have not yet found an answer. Consider a box which ...
user3728501's user avatar
3 votes
2 answers
185 views

Both the CLT and the LLN are stated in terms of a fixed probability space that admits an infinite sequence of IID RVs. It is commonplace in many probability and statistics texts/notes that such a sequence ...
ac1501's user avatar
  • 105
3 votes
2 answers
256 views

I am a bit confused by the Classical CLT section of the central limit theorem article on Wikipedia. It basically says that as the sample size gets larger, the difference between the sample mean and the true mean ...
Sam's user avatar
  • 413
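One way to reconcile the two statements is to track the two scalings separately: $\bar X_n - \mu \to 0$ (LLN), while $\sqrt{n}(\bar X_n - \mu)$ keeps a stable, roughly normal spread (CLT). A minimal simulation sketch with hypothetical Exponential(1) data:

```python
import numpy as np

rng = np.random.default_rng(5)
mu = 1.0                                    # mean of the Exponential(1) distribution used below

for n in (100, 1_000, 10_000):
    # 1000 replicated sample means of size n
    xbar = rng.exponential(1.0, size=(1_000, n)).mean(axis=1)
    print(n,
          "sd of xbar - mu          :", round((xbar - mu).std(), 4),                 # shrinks like 1/sqrt(n)
          "sd of sqrt(n)*(xbar - mu):", round((np.sqrt(n) * (xbar - mu)).std(), 4))  # stays near sigma = 1
```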
2 votes
1 answer
394 views

Suppose $Y_i=X_i'\beta+\epsilon_i$ with $E(\epsilon_i|X_i)=0$. Consider the usual OLS estimator for $\beta$ using a random sample $\{X_i,Y_i\}_{i=1}^n$: $\widehat{\beta}=(\frac{1}{n}\sum_{i=1}^nX_iX_i'...
ExcitedSnail's user avatar
  • 3,090
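The consistency argument the excerpt is heading toward boils down to two applications of the LLN: $\frac{1}{n}\sum X_iX_i' \to E[X_iX_i']$ and $\frac{1}{n}\sum X_i\epsilon_i \to 0$. A small simulation sketch with a hypothetical two-regressor design (the design and coefficients below are not from the post):

```python
import numpy as np

rng = np.random.default_rng(6)
beta = np.array([2.0, -1.0])                 # hypothetical true coefficients

for n in (100, 10_000, 1_000_000):
    X = np.column_stack([np.ones(n), rng.standard_normal(n)])   # intercept + one regressor
    eps = rng.standard_normal(n)                                 # E[eps | X] = 0
    y = X @ beta + eps

    # beta_hat = (1/n sum X_i X_i')^{-1} (1/n sum X_i y_i)
    beta_hat = np.linalg.solve(X.T @ X / n, X.T @ y / n)
    print(n, beta_hat)                       # approaches beta as n grows
```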
1 vote
0 answers
54 views

Given a set of IID samples $X = \{x_i\}_{i=1}^n$ assumed to be from the density $p(\cdot)$, and the function $h:\mathbb{R} \to \mathbb{R}$, its expectation can be approximated as $$\mathbb{E}...
1809's user avatar
  • 11
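The approximation in the excerpt is the plain Monte Carlo estimator $\frac{1}{n}\sum_i h(x_i)$, justified by the LLN. A minimal sketch, assuming for illustration $p = \mathcal{N}(0,1)$ and $h(x) = x^2$ (so the true value is $1$); neither choice is specified in the post:

```python
import numpy as np

rng = np.random.default_rng(7)

def h(x):
    return x ** 2                    # hypothetical test function; E[h(X)] = 1 under N(0, 1)

for n in (100, 10_000, 1_000_000):
    x = rng.standard_normal(n)       # i.i.d. samples from p
    print(n, h(x).mean())            # Monte Carlo estimate of E[h(X)], approaching 1
```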
1 vote
0 answers
115 views

The answer to "Do we have to tune the number of trees in a random forest?" suggests using as large a number of trees in a forest as possible. Is there a rule-of-thumb for choosing this "...
noob's user avatar
  • 2,620
1 vote
0 answers
67 views

SLLN tells us that if $X_1,...,X_n$ are iid, with $X_1$ having finite mean $\mu$, then their sample average converges almost surely to $\mu$. Suppose instead we know that $X_1,...,X_n$ are iid and ...
Golden_Ratio's user avatar
3 votes
1 answer
294 views

WLLN tells us that if $X_1,...,X_n$ are iid, with $X_1$ having finite mean $\mu$, then their sample average converges in probability to $\mu$. Suppose instead we know that $X_1,...,X_n$ are iid and ...
Golden_Ratio's user avatar
0 votes
0 answers
130 views

Suppose we have a set $A$ that we split into multiple disjoint subsets $a_i$. We only have access to the $a_i$ sets; is there a way to compute the ECDF for the set $A$ without looking at it? If, for example, we ...
Amine boujida's user avatar
8 votes
2 answers
2k views

I was wondering if my intuition behind the weak law (WLLN) and strong law of large numbers (SLLN) is correct. The WLLN says that, if you consider a sequence $X_1, X_2, \ldots$ of i.i.d. random variables ...
John M.'s user avatar
  • 449
7 votes
3 answers
329 views

Can you tell me if my understanding of the CLT is correct? Maybe it's just a matter of notation. The classical CLT states: Let $X_1,...,X_i,...,X_n$ be a sequence of iid random variables drawn from a ...
John M.'s user avatar
  • 449
1 vote
0 answers
310 views

I am kind of new to this matrix notation and its properties, so I would like to see the algebraic part of the solution; it helps me understand, so I appreciate your patience. My question is basically: ...
Tatanik501's user avatar
0 votes
0 answers
53 views

I have a data set with lots of small integer values and occasional large integers. For instance 1,1,1,3,2,1,320,2,3,4. I would like to scale my outlier values such that I can perform regression on my ...
murage kibicho's user avatar
2 votes
1 answer
209 views

Let $X_1, X_2, \ldots $ be an infinite sequence of i.i.d. random variables with $E(X_i)=\mu$ and $\mbox{Var}(X_i) < \infty$. The law of large numbers states $\lim_{n \rightarrow \infty} \sum_{i=1}^{...
GCru's user avatar
  • 301
0 votes
0 answers
613 views

In Wikipedia, the law of large numbers is defined as follows: "The average of the results obtained from a large number of trials should be close to the expected value and tends to become closer ...
HYL's user avatar
  • 377
1 vote
0 answers
489 views

Intuitively, these two important statistical principles appear to describe two facets of the same phenomenon, namely that in the long run, any extreme occurrences get counter-balanced, and things tend ...
z8080's user avatar
  • 2,372
2 votes
0 answers
32 views

If a coin comes up Heads 90 times out of the first 100 tosses, should one expect Tails to make a comeback over the next 100? Reference from this page: https://www.financialwisdomforum.org/gummy-stuff/...
Ehsan's user avatar
  • 21
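For a fair coin the answer is no: the next tosses are expected to split roughly 50/50 regardless of the past, and the early excess of heads is diluted rather than compensated as the denominator grows. A quick simulation sketch of that dilution (the fair-coin assumption is mine; the excerpt doesn't say):

```python
import numpy as np

rng = np.random.default_rng(8)
heads_so_far = 90                    # the observed start: 90 heads in the first 100 tosses

for extra in (100, 900, 9_900):
    # number of heads in `extra` further fair tosses, replicated 10,000 times
    future_heads = rng.binomial(extra, 0.5, size=10_000)
    total = 100 + extra
    print(extra,
          "mean future heads :", future_heads.mean(),                             # about extra / 2 -- no comeback
          "mean overall prop.:", ((heads_so_far + future_heads) / total).mean())  # drifts toward 0.5 by dilution
```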
9 votes
2 answers
271 views

Edit Sep 19: this answer on MathOverflow matches simulation results. Suppose $x_i$ come from a 2-d standard normal centered at 0. What is the range of $a$ for which the following iteration converges ...
Yaroslav Bulatov's user avatar
0 votes
0 answers
209 views

I'm sorry for asking a newbie question. The Central Limit Theorem (CLT) states that when the sample size tends to infinity, the sample mean will be normally distributed, and the variance is decreasing ($\...
user900476's user avatar
1 vote
0 answers
71 views

Let $X \sim \mathrm{Bernoulli}(\vartheta)$ for some unknown $\vartheta \in (0,1)$, and let $(X_1, …, X_n)$ be a moderately large IID sample for $X$. Let $\vartheta_0 \in (0,1)$. I want to test $H_0 \...
Federico's user avatar
  • 111
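The excerpt stops right at the null hypothesis, but assuming the intended test is $H_0: \vartheta = \vartheta_0$ against a two-sided alternative, an exact binomial test is a standard tool at this sample size. A sketch using scipy; the data and the value of $\vartheta_0$ below are made up for illustration:

```python
import numpy as np
from scipy.stats import binomtest

rng = np.random.default_rng(9)

theta_true = 0.37                              # unknown in practice; used here only to fake data
theta_0 = 0.30                                 # hypothesized value under H0
x = rng.binomial(1, theta_true, size=400)      # moderately large IID Bernoulli sample

result = binomtest(k=int(x.sum()), n=x.size, p=theta_0, alternative="two-sided")
print("p-value:", result.pvalue)               # small p-value => evidence against H0: theta = theta_0
```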
1 vote
0 answers
707 views

This question stems from the WLLN and the Central Limit Theorem. Suppose we have $n$ iid random samples $X_1,\ldots,X_n$ with common mean $\mu$ and finite variance $\sigma^2$. Then the sample mean $\...
zt wang's user avatar
  • 131
0 votes
0 answers
104 views

Law of large numbers states that: If $X_1, \dots X_n \sim p(x)$ are IID, then $ \frac{1}{n} \sum_{i=1}^{n} X_i \rightarrow \mathbb{E}_{p(x)}\{X\}= \mu$, where $X \sim p(x)$. Below is what I'm having ...
Aditya Mehrotra's user avatar
2 votes
0 answers
61 views

I am trying to prove the following. So far I have used Kronecker's Lemma as follows: \begin{equation} \tag{1} \text{Since } \sum_{i=1}^{\infty} \frac{\sigma_i ^2}{B_i ^2} < \infty, \text{ then, } \...
Amanda_Sterling's user avatar
0 votes
0 answers
56 views

This question is specifically in reference to how statistics apply to Catan. For background, two fair dice are rolled and rolling a 7 is a bad event. Given that the average number of turns for a game ...
wya's user avatar
  • 1
0 votes
0 answers
71 views

I am reading an introductory econometrics book and I am having trouble understanding how they "directly applied the law of large numbers". Basically, they consider the case of simple linear ...
kuchejdatomas's user avatar
1 vote
0 answers
135 views

The question I'm working on says: Let $X_1, X_2, \cdots$ be iid random variables each with mean $\mu$ and variance $\sigma^2$. a) Determine $$ \lim\limits_{n \to \infty} \frac{X_1^2 + \cdots + X_n^2}{...
russloewe's user avatar
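Part (a) is a direct application of the SLLN to the i.i.d. sequence $X_1^2, X_2^2, \ldots$, whose common mean is $E[X_1^2] = \sigma^2 + \mu^2$. Assuming the truncated denominator in the display above is $n$ (the standard form of this exercise), a worked version reads:

$$\lim_{n \to \infty} \frac{X_1^2 + \cdots + X_n^2}{n} = E[X_1^2] = \sigma^2 + \mu^2 \quad \text{almost surely.}$$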
35 votes
5 answers
5k views

Suppose we have any kind of coin. Why should the relative frequency of getting heads converge to any value at all? One answer is that this is simply what we empirically observe to be the case, ...
MaximusIdeal's user avatar
1 vote
0 answers
81 views

Let $h$ be some bounded non-negative function. Let $\mu^N (h)$ be some random quantity with almost sure limit $\mu(h) > 0$. For instance, we could have $\mu^N(h) = N^{-1}...
yprobnoob's user avatar
  • 141
0 votes
1 answer
65 views

I was wondering whether this point estimate for the mean, $\frac{1}{n+1}\sum_{i = 1}^{n}x_i$, is biased. My first thought was that $\frac{1}{n+1}\sum_{i = 1}^{n}x_i \neq \frac{1}{n}\sum_{i = 1}^{n}x_i$, so then ...
Satan Lucifer's user avatar
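For what it's worth, a one-line expectation computation (assuming i.i.d. observations with mean $\mu$, as the excerpt implies) shows the estimator is biased for every finite $n$, though the bias vanishes as $n \to \infty$:

$$E\!\left[\frac{1}{n+1}\sum_{i=1}^{n}x_i\right] = \frac{n}{n+1}\,\mu = \mu - \frac{\mu}{n+1} \neq \mu \quad \text{whenever } \mu \neq 0.$$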