Questions tagged [moment-generating-functions]

For questions relating to moment-generating functions (m.g.f.), which provide a way to find moments such as the mean $(\mu)$ and the variance $(\sigma^2)$. Finding an m.g.f. for a discrete random variable involves summation; for continuous random variables, integration is used.
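As a quick illustration of the continuous case, here is a minimal SymPy sketch (not tied to any question below; the Exponential($\lambda$) m.g.f. $M(t)=\lambda/(\lambda-t)$ is an arbitrary example): differentiating at $t=0$ recovers the mean and variance.

import sympy as sp

t, lam = sp.symbols('t lambda', positive=True)
M = lam / (lam - t)                  # m.g.f. of Exponential(rate lambda), valid for t < lambda
m1 = sp.diff(M, t, 1).subs(t, 0)     # first moment E[X] = M'(0)
m2 = sp.diff(M, t, 2).subs(t, 0)     # second moment E[X^2] = M''(0)
print(sp.simplify(m1))               # 1/lambda, the mean
print(sp.simplify(m2 - m1**2))       # 1/lambda**2, the variance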

5 votes
1 answer
290 views

Let $X$ be a real-valued random variable, and define its moment generating function (MGF) as $$ M_X(s) = \mathbb{E}[e^{sX}], $$ where $\mathbb{E}[\cdot]$ denotes the expected value of the random ...
Anne • 191
0 votes
0 answers
35 views

Sub-Gaussian concentration for reversible Markov chains with spectral gap
Setup. Let $(X_i)_{i\ge1}$ be a stationary, $\pi$-reversible Markov chain on a measurable space with spectral gap $\gamma>0$...
ylefay • 178
1 vote
1 answer
103 views

Suppose a real random variable $X$ has an upper bound $c > 0$ but no lower bound, with $\mathbb{E}[X] = 0$ and $\mathbb{E}[X^2] = \sigma^2$ ($\mathbb{E}[\cdot]$ denotes the expectation of random ...
user31587575
4 votes
2 answers
118 views

This is exercise 1.1.4 from Tao's "Topics in Random Matrix Theory". It asks the reader to prove that a real-valued random variable $X$ is sub-Gaussian iff there exists $C>0$ such that $\...
Daigaku no Baku
0 votes
1 answer
146 views

Let $(X,Y)$ be a random vector with joint moment generating function $$M(t_1,t_2) = \frac{1}{(1-(t_1+t_2))(1-t_2)}$$ Let $Z=X+Y$. Then $\operatorname{Var}(Z)$ is equal to: (IIT JAM MS 2021, Q21) Using $M_{X+Y}(t) = ...
Starlight • 2,674
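Not part of the excerpt, but one hedged way to sanity-check a problem of this shape: since $Z = X+Y$, $M_Z(t) = M(t,t)$, and two derivatives at $t=0$ give $\operatorname{Var}(Z)$. A SymPy sketch:

import sympy as sp

t = sp.symbols('t')
M_Z = 1 / ((1 - 2*t) * (1 - t))      # M_Z(t) = M(t, t) = E[e^{t(X+Y)}]
mean = sp.diff(M_Z, t, 1).subs(t, 0)
second = sp.diff(M_Z, t, 2).subs(t, 0)
print(second - mean**2)              # Var(Z) = 5

The cumulant route agrees: $K_Z(t) = -\log(1-2t) - \log(1-t)$, so $K_Z''(0) = 4 + 1 = 5$.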
1 vote
0 answers
44 views

The method of moments Wikipedia page says that the odd moments of the sum $S_n$ of i.i.d. random variables $X_i$ (each with mean 0 and variance 1) vanish. While the even moments are all finite and have a closed form, ...
Andras Vanyolos
2 votes
0 answers
59 views

I am working with a multivariate normal distribution $\mathbf{x} = [x_1, x_2, \ldots, x_n] \sim \mathcal{N}(\mathbf{\mu}, \mathbf{\Sigma})$, and I need to compute the expectation $E[x_1 x_2 \cdots x_n]...
cloudmath
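For the zero-mean case, the standard tool is Isserlis' (Wick's) theorem: $E[x_1\cdots x_n]$ is the sum over all perfect pairings of $\{1,\dots,n\}$ of products of covariances, and is $0$ for odd $n$. A brute-force sketch under that zero-mean assumption; the function names are illustrative, and Sigma is assumed given as a nested list:

from math import prod

def pairings(idx):
    # Recursively enumerate all perfect matchings of the index list.
    if not idx:
        yield []
        return
    first, rest = idx[0], idx[1:]
    for k in range(len(rest)):
        remaining = rest[:k] + rest[k+1:]
        for match in pairings(remaining):
            yield [(first, rest[k])] + match

def gaussian_product_moment(Sigma):
    # Isserlis: E[x_1 ... x_n] = sum over pairings of products of Sigma[i][j].
    n = len(Sigma)
    if n % 2:
        return 0.0  # odd moments of a zero-mean Gaussian vanish
    return sum(prod(Sigma[i][j] for i, j in match)
               for match in pairings(list(range(n))))

# Example: for n = 4 this returns
# Sigma[0][1]*Sigma[2][3] + Sigma[0][2]*Sigma[1][3] + Sigma[0][3]*Sigma[1][2].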
2 votes
1 answer
251 views

e.g. Chebyshev's bound $P(|X-\mu| \geq a) \leqslant \frac{\sigma^2}{a^2}$ tells us an upper bound on the probability that $X$ deviates from its mean (in absolute value) by more than a certain amount. I can'...
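Bounds like this are often easiest to calibrate numerically; a minimal NumPy sketch (the standard normal is an arbitrary choice for illustration) comparing the empirical two-sided tail with Chebyshev's bound:

import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)    # mu = 0, sigma^2 = 1
for a in (1.5, 2.0, 3.0):
    empirical = (np.abs(x) > a).mean()  # estimate of P(|X - mu| > a)
    print(a, empirical, 1 / a**2)       # Chebyshev bound sigma^2 / a^2 is much larger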
0 votes
0 answers
48 views

Consider a random variable $X$ with full support and assume $$ \mathbb{E}(\max\{0,\exp(\beta X)\})^p<+\infty $$ for some $\beta>0$ and $p>0$. Does this imply that $$ \mathbb{E}(\max\{0,X\})^...
Star • 414
3 votes
1 answer
257 views

I am trying to understand the proof of why the Moment Generating Function (https://en.wikipedia.org/wiki/Moment-generating_function) of a random variable is unique. For example, consider the ...
stats_noob • 4,183
1 vote
1 answer
70 views

Suppose $X,Y$ are independent random variables, with respective moment generating functions (MGFs) $M_X(t)$ and $M_Y(t)$. It is known that the MGF of $XY$ is given by $$\int_{-\infty}^{\infty} \int_{-\...
Hello • 2,237
4 votes
0 answers
83 views

I don't think the Probability and Random Variables course I studied really touched on multivariate Moment-Generating Functions at all, and Wikipedia appears surprisingly silent on this question. The ...
user10478 • 2,184
1 vote
0 answers
61 views

I was reading the proof of the CLT by MGF in this answer. Let the $Y_i$ be i.i.d. random variables with mean 0 and variance 1; in the original answer, by Taylor expansion we have $$M_{Y_1}(s) = E[\exp(...
taylor • 925
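The limit that proof hinges on, $\left(1+\frac{s^2}{2n}\right)^n \to e^{s^2/2}$, can be checked symbolically; a small SymPy sketch (the $o(s^2/n)$ remainder, which is the delicate part of the full argument, is deliberately dropped here):

import sympy as sp

s, n = sp.symbols('s n', positive=True)
# M_{Y_1}(s/sqrt(n))^n with M_{Y_1}(u) ~ 1 + u^2/2 near 0 (mean 0, variance 1)
expr = (1 + s**2 / (2*n))**n
print(sp.limit(expr, n, sp.oo))      # exp(s**2/2), the standard normal m.g.f.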
2 votes
1 answer
57 views

I am working on the following problem, and I am having trouble understanding why (a) is not a moment-generating function, other than that it doesn't satisfy the general form of the MGF (i.e. $E[e^{tX}]$), and that (...
Harry Lofi
-1 votes
1 answer
122 views

Calculate the moment generating function of the Borel distribution given by $P(X=x; \mu) = \frac{e^{-\mu x} (\mu x)^{x-1}}{x!}$ with $\mu \in (0,1)$ and $x = 1, 2, 3,\ldots$.
André Ferrari Castanheiro
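For what it's worth, this m.g.f. has no elementary closed form: the Borel law is the total progeny of a Galton–Watson process with Poisson($\mu$) offspring, so its p.g.f. satisfies $G(s)=s\,e^{\mu(G(s)-1)}$, which solves to $G(s)=-W(-\mu s e^{-\mu})/\mu$ in terms of the Lambert $W$ function, and $M(t)=G(e^t)$. A numeric cross-check sketch (SciPy assumed available; $\mu=0.4$, $t=0.1$ are arbitrary illustrative values with $e^t$ inside the radius of convergence):

import numpy as np
from scipy.special import gammaln, lambertw

mu, t = 0.4, 0.1
closed = -lambertw(-mu * np.exp(t - mu)).real / mu   # M(t) = G(e^t) via Lambert W
x = np.arange(1, 400)                                # truncate the defining sum E[e^{tX}]
log_terms = t*x - mu*x + (x - 1)*np.log(mu*x) - gammaln(x + 1)
direct = np.exp(log_terms).sum()
print(closed, direct)                                # the two routes agree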
2 votes
1 answer
80 views

Let $W$ be a Brownian motion, $a>0$ and $b>0$ real constants. Let $\tau$ be the stopping time $$ \tau=\inf\left\{t\geq 0:W_t-W_0\geq +a\textrm{ or }W_t-W_0\leq -b\right\} $$ I need to compute $\...
AlmostSureUser
6 votes
0 answers
155 views

Let $(a_k)_{k\in\mathbb{N}}$ be iid random variables distributed uniformly in $(-1, 1)$, and consider, for some fixed $r \in [0, 1)$, the limiting random variable $$ X = \lim_{n \to \infty} \sum_{k=0}^...
smalldog • 1,883
2 votes
1 answer
447 views

Let $$X_t = X_0 + \mu t + \sigma W_t$$ where $W_t$ is a standard Brownian motion, $\mu \in \mathbb{R}$ and $\sigma > 0$. Let $\tau$ denote the first hitting time of an upward barrier, i.e.: $$ \...
rubikscube09 • 4,305
2 votes
1 answer
63 views

Definition. Let $X$ be a real-valued random variable. The moment-generating function is defined by $M(t):=\int_{\mathbb{R}}e^{tx}\,dF_X(x)$, where $F_X$ is the distribution function of $X$. Exercise: Suppose $X,Y$ are independent random variables. Show that $...
NTc5 • 1,257
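The truncated exercise presumably asks for $M_{X+Y}=M_X M_Y$; the one-line argument is worth recording:

$$M_{X+Y}(t) = \mathbb{E}\!\left[e^{t(X+Y)}\right] = \mathbb{E}\!\left[e^{tX}e^{tY}\right] = \mathbb{E}\!\left[e^{tX}\right]\mathbb{E}\!\left[e^{tY}\right] = M_X(t)\,M_Y(t),$$

where the third equality is exactly where independence of $X$ and $Y$ enters.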
1 vote
0 answers
65 views

Let $X$ be a continuous real random variable. For $t \in \mathbb{R}$ and $i^2=-1$, $\mathbb{E}[e^{itX}]$ is the characteristic function of $X$. For $s \in \mathbb{R}$, $\mathbb{E}[e^{sX}]$ is the ...
Thomas Steinke
3 votes
0 answers
43 views

Let $(X_n)$ be a sequence of random variables and let $X$ be another random variable. Assume that the MGFs of the $X_n$ converge pointwise to the MGF of $X$ at positive points. That is, for all $t > 0$...
mathematico
1 vote
1 answer
90 views

Let $S_n=\xi_1+\dots+\xi_n$ be a random walk. Suppose $\psi(\theta_0)=E[e^{\theta_0\xi_i}]=1$ for some negative $\theta_0$ and $\xi_i$ is not a constant. In this case, $X_n=\exp(\theta_0 S_n)$ is an ...
William Wang
0 votes
1 answer
99 views

From Blitzstein and Hwang (2nd edition): Let $X$ be the total from rolling 6 fair dice, and let $X_1, \ldots, X_6$ be the individual rolls. What is $P(X = 18)$? The PGF of $X$ lets us count the cases in a ...
monopoly • 223
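The counting the excerpt alludes to can be done mechanically by expanding the p.g.f.; a SymPy sketch:

import sympy as sp

s = sp.symbols('s')
die = sum(s**k for k in range(1, 7)) / 6   # p.g.f. of one fair die
pgf = sp.expand(die**6)                    # p.g.f. of the total X of 6 dice
print(pgf.coeff(s, 18))                    # P(X = 18) = 3431/46656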
0 votes
1 answer
110 views

If one plots a bootstrap distribution of "resample means" obtained by resampling a single, real sample from a population distribution, the mean of that plot gives no more information than ...
user10478 • 2,184
6 votes
1 answer
693 views

Suppose we have two bounded non-negative random variables $X,Y$. Suppose that $\mathbb{E}[X]=\mathbb{E}[Y]=1$ and their other moments obey $\mathbb{E}[X^m] \geq \mathbb{E}[Y^m]$ for all integer $m \...
nervxxx • 375
0 votes
1 answer
179 views

I am reading on Wikipedia that for a non-negative random variable $X$ we get $$\operatorname P \left(X > a\right) \geq \sup_{t > 0 \text{ and } M(t) \geq e^{ta}} \left( 1 - \frac{e^{ta}}{M(t)} \right)^...
Thoth • 975
1 vote
1 answer
63 views

For a Rademacher r.v. $X$ that is defined as $$\Pr[X = k]=\begin{cases}\frac{1}{2}, & \text{for } k=1 \\ \frac{1}{2}, & \text{for } k=-1 \\ 0, & \text{otherwise}\end{cases} \tag{1}$$ or alternatively as $$\Pr[...
Thoth • 975
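Since the excerpt is truncated, it may help to record the standard m.g.f. computation for a Rademacher variable:

$$M_X(t)=\mathbb{E}[e^{tX}]=\tfrac12 e^{t}+\tfrac12 e^{-t}=\cosh t=\sum_{k\ge 0}\frac{t^{2k}}{(2k)!}\le\sum_{k\ge 0}\frac{t^{2k}}{2^k k!}=e^{t^2/2},$$

using $(2k)! \ge 2^k k!$ termwise; this is the usual route to showing $X$ is sub-Gaussian with variance proxy 1.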
1 vote
0 answers
37 views

Given that the moment generating function of the random variable X is $M_X(t)=\frac{1}{1-2t}$, calculate the expectation of $Y^n,n\in \mathbb{N} $, where $Y=365 \times 0.3^X$. Here is my attempt. Use ...
Starlight • 2,674
1 vote
1 answer
76 views

Let $X_1, \ldots, X_n$ be i.i.d. random variables with a geometric distribution ($k \geq 1$) with parameter $p$. Define $Y_p := X_1 + \ldots + X_n$. Show that $p Y_p$ converges in distribution ...
Wellington Silva
0 votes
1 answer
97 views

From the Taylor series, if I need the $k$-th moment, I need to find the $k$-th derivative of the moment generating function. If I have $X \sim \text{Uniform}(0,1)$, the MGF $M_{X}(s)$ is: $$M_{X}(s) = \...
moseskabungo
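For this distribution, differentiating $M_X(s)=(e^s-1)/s$ at $s=0$ is awkward because of the removable singularity; reading the moments off the series is easier. A SymPy sketch (assuming that standard form of the Uniform(0,1) m.g.f.):

import sympy as sp

s = sp.symbols('s')
M = (sp.exp(s) - 1) / s                  # m.g.f. of Uniform(0,1)
series = sp.series(M, s, 0, 6).removeO()
for k in range(1, 5):
    print(k, sp.factorial(k) * series.coeff(s, k))  # E[X^k] = 1/(k+1)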
1 vote
1 answer
75 views

While doing a study on the growth metric RRRGR (Reverse of Relative of Relative Growth Curve), I'm stuck on the exact distribution of the growth metric $ W_{ji} = \ln \left( \dfrac{R_{ji}(t)}{R_{ji} ...
Pri • 11
1 vote
0 answers
68 views

Let $S$ be a Poisson Binomial random variable, i.e. the sum of $n$ independent Bernoulli random variables with parameters $p_1, ..., p_n$. I'd like to know the expected value of $\frac{1}{1+S}$, ...
crf • 5,699
0 votes
1 answer
204 views

At first, this looked like an easy task, starting with the usual expression for variance, such as $\operatorname{Var}(S^2)= E(S^4) - (E(S^2))^2$. Once again, here I want to define $S^2=\frac{\sum_1^n (...
Michael Dosier
0 votes
1 answer
50 views

$X$ and $Y$ are unbounded random variables. If $X^n$ and $Y^n$ are integrable for all $n$, and $E(X^n) = E(Y^n)$ for all $n\ge0$, must $X$ and $Y$ have the same distribution? My intuition says no. I've ...
edster101
0 votes
1 answer
42 views

Is the method of moments an if and only if statement when the limiting distribution is the standard normal and the moments of the sequence are all finite? That is, suppose we have a sequence $X_n$ ...
Iced Palmer
1 vote
0 answers
40 views

The generalized moment generating function (MGF) is defined (at least in some of the signal processing literature) by \begin{equation} M_X^{(r)}(t) \doteq {\mathbb E}\left(X^r\exp(tX)\right). \end{equation} The ...
user3236841
1 vote
0 answers
67 views

I am trying to prove the following inequality for an arbitrary random variable $X$ and a constant $c$: $$P(X\ge c)\le \min_{s\ge 0}e^{-sc}M_X(s),$$ where $M_X(s)$ is the moment generating function of $...
prob1 yuma
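The inequality follows from Markov's inequality applied to the monotone transform $x \mapsto e^{sx}$; the usually omitted step is

$$P(X\ge c)=P\!\left(e^{sX}\ge e^{sc}\right)\le \frac{\mathbb{E}[e^{sX}]}{e^{sc}}=e^{-sc}M_X(s)\quad\text{for every } s>0,$$

while $s=0$ gives the trivial bound $1 = e^{0}M_X(0)$; taking the infimum over $s\ge0$ (the minimum, when it is attained) yields the claim.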
1 vote
0 answers
56 views

Let $(X, Y)$ be an absolutely continuous random vector and $M_{X, Y}$ be the bivariate moment generating function of $(X, Y)$. I want to show that $X$ and $Y$ are independent if and only if $$M_{X, Y}\...
Cyclotomic Manolo
0 votes
1 answer
161 views

If $X$ is a sub-Gaussian random variable with variance proxy $\sigma^2$, i.e., $E(X) = 0$ and $E\{ \exp(s X) \} \leq \exp( \frac{\sigma^2 s^2}{2} )$ for all $s \in \mathbb{R}$, then how to show that ...
Zifeng Zhang
2 votes
0 answers
198 views

I am searching for the definition of the $5^{\text{th}}$ unbiased cumulant estimate. Let $K_j$ be the $j$-th unbiased cumulant estimate of a probability distribution, based on the sample moments. Let $m_j$ be ...
stirling
0 votes
1 answer
145 views

Let $X$ be a non-negative random variable bounded on $[0,1]$. Is it true that $\mathbb{E}[X^2]\leq k \mathbb{E}[X]^2$ for some constant $k$? If not, are there any minimal assumptions on $X$ where this ...
ryanriess • 437
2 votes
2 answers
252 views

I recently had an assignment where I had to prove that, if $X,Y,Z$ are all independent random variables and $X+Z$ has the same distribution as $Y+Z$ then $X$ has the same distribution as $Y$. These ...
bears • 123
8 votes
3 answers
797 views

For a random variable $X$, the cumulant generating function $\operatorname{CGF}_X$ is defined as $\operatorname{CGF}_X(t)=\log E e^{tX}$, and the $n$-th cumulant $k_n(X)$ is defined as the coefficient of $t^n/n!$ in the corresponding ...
Simon Segert • 5,809
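A quick way to experiment with that definition is to extract the $t^n/n!$ coefficients of $\log M_X(t)$ symbolically; a sketch using the Poisson($\lambda$) m.g.f. as an arbitrary example (all of its cumulants equal $\lambda$):

import sympy as sp

t, lam = sp.symbols('t lambda', positive=True)
M = sp.exp(lam * (sp.exp(t) - 1))            # m.g.f. of Poisson(lambda)
K = sp.series(sp.log(M), t, 0, 5).removeO()  # cumulant generating function, truncated
for n in range(1, 5):
    print(n, sp.factorial(n) * K.coeff(t, n))  # n-th cumulant = n! * [t^n] log M = lambda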
-1 votes
1 answer
102 views

I tried using the moment generating function. I got $M_{X}(t)=E(e^{tX})=\sum_{n=0}^{\infty} (n+1)(3t)^{n}$, but I don't understand which distribution it would be.
Tas • 135
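To identify the distribution, closing the sum helps: $\sum_{n\ge0}(n+1)x^n = (1-x)^{-2}$ for $|x|<1$, so $M_X(t)=(1-3t)^{-2}$, the m.g.f. of a Gamma distribution with shape 2 and scale 3. A SymPy check:

import sympy as sp

t = sp.symbols('t')
n = sp.symbols('n', integer=True, nonnegative=True)
M = sp.summation((n + 1) * (3*t)**n, (n, 0, sp.oo))
print(M)  # a Piecewise whose |3t| < 1 branch is 1/(1 - 3t)**2,
          # matching E[e^{tX}] for X ~ Gamma(shape=2, scale=3)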
0 votes
1 answer
75 views

Showing that $(e^{\lambda B_t - \frac{1}{2}\lambda^2t})_{t\geq 0}$ is an $\mathbb{R}$-valued martingale
Let $B$ be a standard $\mathbb{R}$-valued Brownian motion and let $\lambda\in\mathbb{R}$. From $...
Wilfred Montoya
1 vote
0 answers
105 views

Let $X \in \mathbb{R}^d$ be a zero-mean multivariate Gaussian with independent components, i.e., $X \sim \mathcal{N}(0,\Sigma)$, with $\Sigma = \operatorname{diag}(\sigma_1^2,\ldots,\sigma_d^2)$ a diagonal matrix. ...
funny_name
0 votes
1 answer
82 views

Let $X_n$ be the size of the $n$-th generation of a branching process, with family-size probability generating function $G(s)$. Let $X_0 = 1$. Suppose the family-size mass function is $P(X_1 = k) ...
Timothy Ho
0 votes
1 answer
57 views

Let $X$ be a non-negative r.v., prove that \begin{equation} \inf_{k\in\mathbb{Z}_+} \frac{\mathbb{E}[X^k]}{t^k}\leqslant \inf_{\lambda\geqslant 0}\frac{\mathbb{E}[e^{\lambda X}]}{e^{\lambda t}},\;\...
o.spectrum • 1,411
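One standard proof runs through the Poisson-weighted average hidden in the exponential: for $\lambda \ge 0$,

$$\frac{\mathbb{E}[e^{\lambda X}]}{e^{\lambda t}} = \sum_{k=0}^{\infty} \frac{(\lambda t)^k e^{-\lambda t}}{k!}\cdot\frac{\mathbb{E}[X^k]}{t^k} \;\ge\; \inf_{k\in\mathbb{Z}_+}\frac{\mathbb{E}[X^k]}{t^k},$$

since the Poisson$(\lambda t)$ weights sum to 1 and a weighted average dominates the infimum; taking the infimum over $\lambda\ge0$ preserves the inequality.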
1 vote
0 answers
135 views

Question Define $\mathbf{a} = (a_1, \ldots, a_p)$ where $p$ is a positive integer and the $a_l$ are i.i.d $\text{Uniform}(-1,1)$ random variables. Fix a unit vector $w \in \mathbb{R}^p$. Consider the ...
Balkys • 761
1 vote
0 answers
29 views

Suppose I have $N$ iid random vectors $\sigma_1,\ldots,\sigma_N$ that are uniformly distributed in $S^1$. Let $\bar{\sigma}_N:=N^{-1}(\sigma_1+\cdots+\sigma_N)$ denote the sample average. I am ...
Nik Quine • 571
