Questions tagged [moment-generating-functions]
For questions relating to moment-generating functions (m.g.f.), which provide a way to find moments such as the mean $(\mu)$ and the variance $(\sigma^2)$. Finding an m.g.f. for a discrete random variable involves summation; for continuous random variables, integration is used.
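As a quick reference (standard identities, assuming $M_X$ is finite on a neighborhood of $0$):
$$M_X(t) = \mathbb{E}\!\left[e^{tX}\right] = \sum_{k=0}^{\infty} \frac{t^k}{k!}\,\mathbb{E}[X^k], \qquad \mathbb{E}[X^k] = M_X^{(k)}(0).$$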
1,358 questions
5
votes
1
answer
290
views
Question regarding Jensen's inequality when it comes to the logarithm
Let $X$ be a real-valued random variable, and define its moment generating function (MGF) as
$$
M_X(s) = \mathbb{E}[e^{sX}],
$$
where $\mathbb{E}[\cdot]$ denotes the expected value of the random ...
0
votes
0
answers
35
views
Concentration for Markov chain with spectral gap
Sub-Gaussian concentration for reversible Markov chains with spectral gap
Setup.
Let $(X_i)_{i\ge1}$ be a stationary, $\pi$-reversible Markov chain on a measurable space with spectral gap $\gamma>0$...
1
vote
1
answer
103
views
An upper bound for the moment generating function $f(t) = \mathbb{E}[e^{tX}]$ when the mean, variance and upper bound of $X$ are given
Suppose a real random variable $X$ has an upper bound $c > 0$ but has no lower bound. $\mathbb{E}[X] = 0$ and $\mathbb{E}[X^2] = \sigma^2$ ($\mathbb{E}[\cdot]$ denotes the expectation of random ...
4
votes
2
answers
118
views
A random variable $X$ is sub-Gaussian iff $\exists C>0: \mathbb{E}[e^{Xt}]\le C\exp(Ct^2)$
This is exercise 1.1.4 from Tao's "Topics in Random Matrix Theory". It asks the reader to prove that a real-valued random variable $X$ is sub-Gaussian iff there exists such $C>0$ that $\...
0
votes
1
answer
146
views
Can two non-independent random variables be split into independent ones?
Let $(X,Y)$ be a random vector with joint moment generating function
$$M(t_1,t_2) = \frac{1}{(1-(t_1+t_2))(1-t_2)}$$ Let $Z=X+Y$. Then,
Var(Z) is equal to: (IIT JAM MS 2021, Q21)
Using $M_{X+Y}(t) = ...
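One standard route here, sketched under the assumption that the joint m.g.f. is finite near the origin: set $t_1 = t_2 = t$ to get $M_Z(t) = M(t,t) = \frac{1}{(1-2t)(1-t)}$, then differentiate the cumulant generating function:
$$K_Z(t) = \log M_Z(t) = -\log(1-2t) - \log(1-t), \qquad \operatorname{Var}(Z) = K_Z''(0) = 4 + 1 = 5.$$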
1
vote
0
answers
44
views
Moments of a sum of iid random variables with mean 0 and variance 1
The method of moments Wikipedia page says that the odd moments of the sum $S_n$ of iid random variables $X_i$ (each with mean 0 and variance 1) vanish. While the even moments are all finite and have a closed form, ...
2
votes
0
answers
59
views
What is the most efficient algorithm for computing $E[x_1 x_2 \cdots x_n]$ in a multivariate normal distribution? [closed]
I am working with a multivariate normal distribution $\mathbf{x} = [x_1, x_2, \ldots, x_n] \sim \mathcal{N}(\mathbf{\mu}, \mathbf{\Sigma})$, and I need to compute the expectation $E[x_1 x_2 \cdots x_n]...
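A natural starting point for this question is Isserlis' (Wick's) theorem: for a zero-mean Gaussian vector, $E[x_1 \cdots x_n]$ is the sum over all perfect matchings of the indices of products of covariance entries, and it vanishes for odd $n$. Below is a minimal sketch assuming zero mean; `pair_partitions` and `gaussian_product_moment` are illustrative names, and this brute-force enumeration visits all $(n-1)!!$ matchings, so it is not the efficient algorithm the question asks for:

```python
import numpy as np

def pair_partitions(idx):
    """Yield every perfect matching (list of disjoint pairs) of the indices in idx."""
    if not idx:
        yield []
        return
    first, rest = idx[0], idx[1:]
    for i, partner in enumerate(rest):
        remaining = rest[:i] + rest[i + 1:]
        for sub in pair_partitions(remaining):
            yield [(first, partner)] + sub

def gaussian_product_moment(cov):
    """E[x_1 * ... * x_n] for x ~ N(0, cov), via Isserlis' theorem."""
    n = cov.shape[0]
    if n % 2 == 1:
        return 0.0  # odd moments of a zero-mean Gaussian vanish
    return sum(
        np.prod([cov[i, j] for (i, j) in matching])
        for matching in pair_partitions(list(range(n)))
    )

# Sanity check: for n = 2 the product moment is just the covariance entry itself
print(gaussian_product_moment(np.array([[2.0, 0.5], [0.5, 1.0]])))  # 0.5
```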
2
votes
1
answer
251
views
What does the Chernoff bound want to point out? Intuition
For example, Chebyshev's bound $P(|X-\mu| \geq a) \leqslant \frac{\sigma^2}{a^2}$ gives an upper bound on the probability that $X$ deviates from its mean (in absolute value) by more than a certain amount.
I can'...
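A concrete instance may help with the intuition: for $X \sim N(0,1)$ we have $M_X(t) = e^{t^2/2}$, and optimizing the exponent gives
$$P(X \ge a) \le \inf_{t>0} e^{-ta} M_X(t) = \inf_{t>0} e^{t^2/2 - ta} = e^{-a^2/2},$$
which decays exponentially in $a^2$, while Chebyshev's $\sigma^2/a^2$ decays only polynomially; the Chernoff bound exploits the whole m.g.f. rather than just the second moment.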
0
votes
0
answers
48
views
On the finiteness of moments
Consider a random variable $X$ with full support and assume
$$
\mathbb{E}(\max\{0,\exp(\beta X)\})^p<+\infty
$$
for some $\beta>0$ and $p>0$.
Does this imply that
$$
\mathbb{E}(\max\{0,X\})^...
3
votes
1
answer
257
views
Proof that a moment generating function is unique
I am trying to understand the proof of why the Moment Generating Function (https://en.wikipedia.org/wiki/Moment-generating_function) of a random variable is unique.
For example, consider the ...
1
vote
1
answer
70
views
Moment Generating Function of a Product of Random Variables
Suppose $X,Y$ are independent random variables, with respective moment generating functions (MGFs) $M_X(t)$ and $M_Y(t)$. It is known that the MGF of $XY$ is given by
$$\int_{-\infty}^{\infty} \int_{-\...
4
votes
0
answers
83
views
What "methods" does a multivariate Moment-Generating Function provide?
I don't think the Probability and Random Variables course I studied really touched on multivariate Moment-Generating Functions at all, and Wikipedia appears surprisingly silent on this question. The ...
1
vote
0
answers
61
views
Proof of central limit theorem by moment generating functions and the assumption on $X_i$
I was reading the proof of the CLT by MGFs in this answer.
Let the $Y_i$'s be i.i.d. random variables with mean 0 and variance 1, and in the original answer, by Taylor expansion we have
$$M_{Y_1}(s) = E[\exp(...
2
votes
1
answer
57
views
Identifying a moment generating function.
I am working on the following problem, and I am having trouble understanding why (a) is not a moment-generating function other than that it doesn't satisfy the general form of the MGF (i.e. $E[e^{tX}]$), and that (...
-1
votes
1
answer
122
views
moment generating function of the Borel distribution [closed]
Calculate the moment generating function of the Borel distribution given by $P(X=x; \mu) = \frac{e^{-\mu x} (\mu x)^{x-1}}{x!}$
with $\mu \in (0,1)$ and $x = 1, 2, 3,\ldots$.
2
votes
1
answer
80
views
Compute $\mathbb{E}\left[\tau^2\right]$ when $\tau=\inf\{t>0|W_t-W_0\geq +a\textrm{ or }W_t-W_0\leq -b\}$
Let $W$ be a Brownian motion, $a>0$ and $b>0$ real constants. Let $\tau$ be the stopping time
$$
\tau=\inf\left\{t\geq 0:W_t-W_0\geq +a\textrm{ or }W_t-W_0\leq -b\right\}
$$
I need to compute $\...
6
votes
0
answers
155
views
What is the expectation of this power series of random variables?
Let $(a_k)_{k\in\mathbb{N}}$ be iid random variables distributed uniformly in $(-1, 1)$, and consider, for some fixed $r \in [0, 1)$, the limiting random variable
$$ X = \lim_{n \to \infty} \sum_{k=0}^...
2
votes
1
answer
447
views
Laplace Transform of Hitting Time of Brownian Motion with Negative Drift
Let $$X_t = X_0 + \mu t + \sigma W_t$$ where $W_t$ is a standard Brownian motion, $\mu \in \mathbb{R}$ and $\sigma > 0$. Let $\tau$ denote the first time of hitting an upward barrier, i.e.:
$$
\...
2
votes
1
answer
63
views
Exercise: Show a property of moment generating functions
Definition
Let $X$ be a real-valued random variable. The moment-generating function is defined by
$M(t):=\int_{\mathbb{R}}e^{tx}\,dF_X(x)$, i.e. $M(t)=\mathbb{E}[e^{tX}]$.
Exercise:
Suppose X,Y are independent random variables. Show that $...
1
vote
0
answers
65
views
Numerically computing CDF from MGF + CF
Let $X$ be a continuous real random variable.
For $t \in \mathbb{R}$ and $i^2=-1$, $\mathbb{E}[e^{itX}]$ is the characteristic function of $X$.
For $s \in \mathbb{R}$, $\mathbb{E}[e^{sX}]$ is the ...
3
votes
0
answers
43
views
Is pointwise convergence of MGFs at positive points enough to determine convergence in distribution?
Let $(X_n)$ be a sequence of random variables and let $X$ be another random variable. Assume that the MGF of $X_n$ converges pointwise to the MGF of $X$ at positive points. That is, for all $t > 0$...
1
vote
1
answer
90
views
Durrett Exercise 4.8.8 (Proving UI of exponential martingale with i.i.d. increment of general distribution)
Let $S_n=\xi_1+\dots+\xi_n$ be a random walk. Suppose $\psi(\theta_0)=E[e^{\theta_0\xi_i}]=1$ for some negative $\theta_0$ and $\xi_i$ is not a constant. In this case, $X_n=\exp(\theta_0 S_n)$ is an ...
0
votes
1
answer
99
views
Probability generating function to get the total of rolling 6 fair dice.
From Blitzstein and Hwang (2nd edition):
Let $X$ be the total from rolling 6 fair dice, and let $X_1, \ldots, X_6$ be the individual rolls. What is $P(X = 18)$?
The PGF of X lets us count the cases in a ...
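The counting alluded to here can be done mechanically by expanding the PGF $\left(\frac{s + s^2 + \cdots + s^6}{6}\right)^6$ and reading off the coefficient of $s^{18}$. A minimal sketch in Python (variable names are illustrative):

```python
import numpy as np

# PGF of one fair die: (s + s^2 + ... + s^6)/6, stored as coefficients of s^k
die = np.array([0.0] + [1.0] * 6) / 6.0

# PGF of the total of 6 independent dice = product of the six PGFs,
# i.e. repeated polynomial multiplication (coefficient convolution)
total = np.array([1.0])
for _ in range(6):
    total = np.convolve(total, die)

print(total[18])  # P(X = 18) = 3431/6^6 ≈ 0.0735
```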
0
votes
1
answer
110
views
Intuition for Why Resampling Should Create New Information
If one plots a bootstrap distribution of "resample means" obtained by resampling a single, real sample from a population distribution, the mean of that plot gives no more information than ...
6
votes
1
answer
693
views
Inequality involving fractional moments of distributions
Suppose we have two bounded non-negative random variables $X,Y$. Suppose that $\mathbb{E}[X]=\mathbb{E}[Y]=1$ and their other moments obey $\mathbb{E}[X^m] \geq \mathbb{E}[Y^m]$ for all integer $m \...
0
votes
1
answer
179
views
Lower bound on $P \left(X > a\right) $ using Paley–Zygmund inequality
I am reading on Wikipedia that for a non-negative random variable X we get
$$\operatorname P \left(X > a\right) \geq \sup_{t > 0 \text{ and } M(t) \geq e^{ta}} \left( 1 - \frac{e^{ta}}{M(t)} \right)^...
1
vote
1
answer
63
views
Conditional MGF on Rademacher distribution
For a Rademacher r.v. X that is defined as
$$\Pr[X = k]=\begin{cases}\frac{1}{2} & \text{for } k=1 \\ \frac{1}{2} & \text{for } k=-1 \\ 0 & \text{otherwise}\end{cases} \tag{1}$$ or alternatively as
$$Pr[...
1
vote
0
answers
37
views
Calculating expectation by manipulating into the MGF
Given that the moment generating function of the random variable X is
$M_X(t)=\frac{1}{1-2t}$, calculate the expectation of $Y^n,n\in \mathbb{N} $, where
$Y=365 \times 0.3^X$.
Here is my attempt. Use ...
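The manipulation this attempt is presumably aiming for is to rewrite the power as an exponential so the m.g.f. applies:
$$\mathbb{E}[Y^n] = 365^n\, \mathbb{E}\!\left[e^{(n \ln 0.3) X}\right] = 365^n M_X(n \ln 0.3) = \frac{365^n}{1 - 2n \ln 0.3},$$
valid since $n \ln 0.3 < 0 < \tfrac{1}{2}$ keeps the argument inside the domain of $M_X(t) = \frac{1}{1-2t}$.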
1
vote
1
answer
76
views
Convergence of sum of geometric random variables to a gamma distribution
Let $X_1, \ldots, X_n$ be i.i.d. random variables with a geometric distribution ($k \geq 1$) with parameter $p$. Define $Y_p := X_1 + \ldots + X_n$. Show that $p Y_p$ converges in distribution ...
0
votes
1
answer
97
views
Finding Expectation of a Uniform random variable from its moment generating function
From the Taylor series, to get the $k$-th moment I need to find the $k$-th derivative of the moment generating function.
If I have $X \sim \text{Uniform}(0,1)$, the MGF $M_{X}(s)$ is; $$M_{X}(s) = \...
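For this question the series route is usually cleanest, since differentiating $\frac{e^s - 1}{s}$ at $s = 0$ directly is awkward:
$$M_X(s) = \int_0^1 e^{sx}\,dx = \frac{e^s - 1}{s} = \sum_{k=0}^{\infty} \frac{s^k}{(k+1)!}, \qquad \mathbb{E}[X^k] = k! \cdot \frac{1}{(k+1)!} = \frac{1}{k+1},$$
so in particular $\mathbb{E}[X] = \tfrac{1}{2}$.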
1
vote
1
answer
75
views
Joint Central Moments of Bivariate Exponential Distribution
While doing a study on the growth metrics RRRGR (Reverse of Relative of Relative Growth Curve), I'm stuck on the exact distribution of the growth metrics $W_{ji} = \ln \left( \dfrac{R_{ji}(t)}{R_{ji} ...
1
vote
0
answers
68
views
Calculating Expected Value of 1/(S+1) for a Poisson Binomial Random Variable
Let $S$ be a Poisson Binomial random variable, i.e. the sum of $n$ independent Bernoulli random variables with parameters $p_1, ..., p_n$. I'd like to know the expected value of $\frac{1}{1+S}$, ...
0
votes
1
answer
204
views
What is the value of $\operatorname{Var}(S^2)$, when $S^2$ is given as sample variance with denominator $n$ instead of $(n-1)$?
At first, this looked like an easy task, starting with the usual expression for variance, such as:
$\operatorname{Var}(S^2)= E(S^4) - (E(S^2))^2$. Once again, herein I want to define $S^2=\frac{\sum_1^n (...
0
votes
1
answer
50
views
If X and Y are unbounded rvs that have the same moment generating function, do they have the same distribution?
$X$ and $Y$ are unbounded random variables; if $X^n$ and $Y^n$ are integrable for all $n$, and $E(X^n) = E(Y^n)$ for all $n\ge0$, must $X$ and $Y$ have the same distribution?
My intuition says no. I’ve ...
0
votes
1
answer
42
views
If $\left(X_{n}\right)_{n}\to N\left(0,1\right)$ in Distribution, then moments converge to moments of normal
Is the method of moments an if and only if statement when the limiting distribution is the standard normal and the moments of the sequence are all finite?
That is, suppose we have a sequence $X_n$ ...
1
vote
0
answers
40
views
generalized cumulant generating function?
The generalized moment generating function (MGF) is defined (at least in some of the signal processing literature) by
\begin{equation}
M_X^{(r)}(t) \doteq \mathbb{E}\left(X^r\exp(tX)\right).
\end{equation}
The ...
1
vote
0
answers
67
views
Proving an Inequality Involving the Moment Generating Function
I am trying to prove the following inequality for an arbitrary random variable $X$ and a constant $c$:
$$P(X\ge c)\le \min_{s\ge 0}e^{-sc}M_X(s),$$
where $M_X(s)$ is the moment generating function of $...
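The key step is Markov's inequality applied to $e^{sX}$: for any fixed $s > 0$,
$$P(X \ge c) = P\!\left(e^{sX} \ge e^{sc}\right) \le \frac{\mathbb{E}[e^{sX}]}{e^{sc}} = e^{-sc} M_X(s),$$
and taking the infimum over $s \ge 0$ gives the stated bound (at $s = 0$ the bound is the trivial $P(X \ge c) \le 1$).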
1
vote
0
answers
56
views
How to show that $X$ and $Y$ are independent without using the uniqueness of the MGF
Let $(X, Y)$ be an absolutely continuous random vector and $M_{X, Y}$ be the bivariate moment generating function of $(X, Y)$.
I want to show that $X$ and $Y$ are independent if and only if
$$M_{X, Y}\...
0
votes
1
answer
161
views
If $X$ is a sub-Gaussian random variable with variance proxy $\sigma^2$, how to show that $E\{ \exp( t X^2) \} \leq (1 - 2 t \sigma^2)^{-1/2}$?
If $X$ is a sub-Gaussian random variable with variance proxy $\sigma^2$, i.e., $E(X) = 0$ and $E\{ \exp(s X) \} \leq \exp( \frac{\sigma^2 s^2}{2} )$ for all $s \in \mathbb{R}$, then how to show that ...
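One standard route, sketched for $0 \le t < \frac{1}{2\sigma^2}$, is Gaussian linearization: with $g \sim N(0,1)$ independent of $X$, $e^{tx^2} = \mathbb{E}_g\!\left[e^{\sqrt{2t}\,x g}\right]$, so
$$\mathbb{E}\!\left[e^{tX^2}\right] = \mathbb{E}_g\, \mathbb{E}_X\!\left[e^{\sqrt{2t}\, g X}\right] \le \mathbb{E}_g\!\left[e^{\sigma^2 t g^2}\right] = (1 - 2\sigma^2 t)^{-1/2},$$
using the sub-Gaussian bound with $s = \sqrt{2t}\, g$ and the Gaussian identity $\mathbb{E}[e^{a g^2}] = (1-2a)^{-1/2}$ for $a < \tfrac{1}{2}$.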
2
votes
0
answers
198
views
Unbiased Cumulant Estimate - Fifth Cumulant
I am searching for the definition of the $5^{\text{th}}$ unbiased cumulant estimate.
Let $K_j$ be the $j$-th unbiased cumulant estimate of a probability distribution, based on the sample moments.
Let $m_j$ be ...
0
votes
1
answer
145
views
$\mathbb{E}[X^2]\leq k \mathbb{E}[X]^2$, upper bound second moment from first moment
Let $X$ be a non-negative random variable bounded on $[0,1]$. Is it true that $\mathbb{E}[X^2]\leq k \mathbb{E}[X]^2$ for some constant $k$? If not, are there any minimal assumptions on $X$ where this ...
2
votes
2
answers
252
views
Quick question about MGFs; Is $M_X(t)$ always strictly positive?
I recently had an assignment where I had to prove that, if $X,Y,Z$ are all independent random variables and $X+Z$ has the same distribution as $Y+Z$ then $X$ has the same distribution as $Y$. These ...
8
votes
3
answers
797
views
Are cumulants the only additive functions of independent random variables?
For a random variable $X$, the cumulant generating function $CGF_X$ is defined as $CGF_X(t)=\log Ee^{tX}$, and the nth cumulant $k_n(X)$ is defined as the coefficient of $t^n/n!$ in the corresponding ...
-1
votes
1
answer
102
views
Suppose that $X$ is a random variable with $E(X^{n})=3^{n}(n+1)!$ for $n \geq 1$. Find the distribution of $X$. [closed]
I tried using the moment generating function. I got:
$M_{X}(t)=E(e^{tX})=\sum_{n=0}^{\infty} (n+1)(3t)^{n}$. But I don't understand which distribution it would be.
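A hint toward closing the gap (a standard series fact, for $|3t| < 1$):
$$\sum_{n=0}^{\infty} (n+1)(3t)^n = \frac{1}{(1-3t)^2},$$
which is the m.g.f. of a $\mathrm{Gamma}(\text{shape } 2, \text{scale } 3)$ distribution, i.e. the sum of two independent exponentials with mean $3$.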
0
votes
1
answer
75
views
Is $(e^{i \lambda B_t + \frac{1}{2}\lambda^2t})_{t\geq 0}$ a martingale?
Showing that $(e^{\lambda B_t - \frac{1}{2}\lambda^2t})_{t\geq 0}$ is a $\mathbb{R}$-valued martingale
Let $B$ be a standard $\mathbb{R}$-valued Brownian motion and let $\lambda\in\mathbb{R}$. From $...
1
vote
0
answers
105
views
Moment generating function of squared norm of multivariate Gaussian
Let $X \in \mathbb{R}^d$ be a zero-mean multivariate Gaussian, with independent components, i.e., $X \sim \mathcal{N}(0,\Sigma)$, with $\Sigma = \operatorname{diag}(\sigma_1^2,\ldots,\sigma_d^2)$ a diagonal matrix. ...
0
votes
1
answer
82
views
Moment Generating Functions and Probability
Let $X_n$ be the size of the $n$-th generation of a branching process, with family-size probability generating function $G(s)$. Let $X_0 = 1$.
Suppose the family-size mass function is $P(X_1 = k) ...
0
votes
1
answer
57
views
An inequality involving infimums of the scaled $k$-th moment and of a scaled moment-generating function.
Let $X$ be a non-negative r.v.; prove that
\begin{equation}
\inf_{k\in\mathbb{Z}_+} \frac{\mathbb{E}[X^k]}{t^k}\leqslant \inf_{\lambda\geqslant 0}\frac{\mathbb{E}[e^{\lambda X}]}{e^{\lambda t}},\;\...
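A sketch of the usual argument, taking the convention that $0 \in \mathbb{Z}_+$ (the $k=0$ term is needed): expand the exponential and bound each term by the infimum,
$$\frac{\mathbb{E}[e^{\lambda X}]}{e^{\lambda t}} = e^{-\lambda t} \sum_{k=0}^{\infty} \frac{\lambda^k\, \mathbb{E}[X^k]}{k!} \ge \left(\inf_{j} \frac{\mathbb{E}[X^j]}{t^j}\right) e^{-\lambda t} \sum_{k=0}^{\infty} \frac{(\lambda t)^k}{k!} = \inf_{j} \frac{\mathbb{E}[X^j]}{t^j},$$
valid for $X \ge 0$, $t > 0$ and $\lambda \ge 0$; taking the infimum over $\lambda$ gives the claim.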
1
vote
0
answers
135
views
Bound on cumulant generating function of a weighted sum of uniform random variables
Question
Define $\mathbf{a} = (a_1, \ldots, a_p)$ where $p$ is a positive
integer and the $a_l$ are i.i.d $\text{Uniform}(-1,1)$ random
variables. Fix a unit vector $w \in \mathbb{R}^p$. Consider the
...
1
vote
0
answers
29
views
Convergence of MGF of squared norm of sum of iid unit vectors
Suppose I have $N$ iid random vectors $\sigma_1,\ldots,\sigma_N$ that are uniformly distributed in $S^1$. Let $\bar{\sigma}_N:=N^{-1}(\sigma_1+\cdots+\sigma_N)$ denote the sample average. I am ...