
Suppose $X_i$ are i.i.d. with density $f_\theta(x) = \frac{1}{\theta}$ for $x \in (\theta, 2\theta)$ (and $0$ otherwise), where $\theta > 0$.

  • Is $(\min_iX_i, \max_iX_i)$ a sufficient statistic for $\theta$?

To apply the Rao-Blackwell theorem, I start with an unbiased estimator of the form $aX_i$, so I need $\mathbb{E}[aX_i] = \theta$.

$$ \begin{align} \theta &= a\int_\theta^{2\theta}x\frac{1}{\theta}\mathrm{d}x \\ &= \frac{a}{2\theta}[x ^ 2]_\theta^{2\theta} \\ &= \frac{3a}{2}\theta \\ a &= \frac{2}{3} \end{align} $$
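As a quick sanity check (my own addition, not part of the original question), a short simulation confirms that $\frac{2}{3}X_i$ is unbiased; the value $\theta = 3$ is an arbitrary choice for illustration:

```python
import random
import statistics

# Monte Carlo check that E[(2/3) X] = theta when X ~ Uniform(theta, 2*theta).
# theta = 3.0 is an arbitrary illustration value.
random.seed(0)
theta = 3.0
draws = [(2.0 / 3.0) * random.uniform(theta, 2.0 * theta)
         for _ in range(200_000)]
est = statistics.fmean(draws)  # should be close to theta up to Monte Carlo error
```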

  • Does applying the Rao-Blackwell theorem then amount to computing $\frac{2}{3}\mathbb{E}(X_j \mid \min_iX_i, \max_iX_i)$?
  • Do I integrate between $\min_iX_i$ and $2\min_iX_i$, or between $\frac{1}{2}\max_iX_i$ and $\max_iX_i$? Is the idea that the integral should be free of $\theta$? Otherwise the estimator would be a function of $\theta$ as well as of $(\min_iX_i, \max_iX_i)$.
  • It looks like the title is not consistent with what you are really looking for (finding the UMVUE of $\theta$ using the Rao-Blackwell theorem?). Commented Dec 2, 2023 at 13:32
  • @Zhanxiong Because $(X_{(1)},X_{(n)})$ is not complete, talking about the UMVUE here is problematic. Commented Dec 2, 2023 at 19:18
  • This was also answered at stats.stackexchange.com/q/395745/119261. Commented Dec 2, 2023 at 19:20
  • @StubbornAtom I think you are right, so I modified my answer. Do you have a link showing that $(X_{(1)}, X_{(n)})$ is not complete? Commented Dec 2, 2023 at 19:31
  • @Zhanxiong I think computing their expectations would easily give a nonzero function of the min and max that has mean $0$. Commented Dec 2, 2023 at 19:39

1 Answer


Your core question is to determine $E[X_j | X_{(1)}, X_{(n)}]$, where $X_{(1)} = \min_{1 \leq i \leq n}X_i$, $X_{(n)} = \max_{1 \leq i \leq n}X_i$.

There is an intuitive argument (cf. Example 34.4 of Billingsley, Probability and Measure, 3rd edition): conditional on $X_{(1)}$ and $X_{(n)}$, for any $j \in \{1, \ldots, n\}$, $X_j$ equals $X_{(1)}$ with probability $n^{-1}$, equals $X_{(n)}$ with probability $n^{-1}$, and with probability $1 - \frac{2}{n}$ is uniformly distributed on the interval $[X_{(1)}, X_{(n)}]$; therefore \begin{align*} E[X_j|X_{(1)}, X_{(n)}] = \frac{1}{n}X_{(1)} + \frac{1}{n}X_{(n)} + \left(1 - \frac{2}{n}\right)\times \frac{X_{(1)} + X_{(n)}}{2} = \frac{X_{(1)} + X_{(n)}}{2}. \tag{1}\label{1} \end{align*}

For a rigorous proof of $\eqref{1}$, note that for $s, t$ such that $\theta < s < t < 2\theta$, we have \begin{align*} & \int_{X_{(1)} > s, X_{(n)} \leq t}X_jdP_\theta \\ =& \int_0^\infty P_\theta\left[X_j \geq u, X_{(1)} > s, X_{(n)} \leq t\right]du \\ =& \int_0^s\left(\frac{t - s}{\theta}\right)^ndu + \int_s^t \frac{t - u}{\theta}\left(\frac{t - s}{\theta}\right)^{n - 1}du \\ =& s\frac{(t - s)^{n}}{\theta^n} + \frac{(t - s)^{n + 1}}{2\theta^n}. \tag{2}\label{2} \end{align*} On the other hand, since the joint density of $(X_{(1)}, X_{(n)})$ is \begin{align*} f_{1n}(x, y) = \begin{cases} n(n - 1)\left(\frac{y - x}{\theta}\right)^{n - 2}\frac{1}{\theta^2}, & \theta < x < y < 2\theta; \\ 0 & \text{ otherwise }, \end{cases} \end{align*} it follows that \begin{align*} & \int_{X_{(1)} > s, X_{(n)} \leq t}(X_{(1)} + X_{(n)})dP_\theta \\ =& \iint_{(x, y): s < x < y \leq t}(x + y)n(n - 1)\left(\frac{y - x}{\theta}\right)^{n - 2}\frac{1}{\theta^2} dxdy \\ =& \frac{(t - s)^{n + 1}}{\theta^n} + \frac{2s(t - s)^{n}}{\theta^n}. \tag{3}\label{3} \end{align*}

$\eqref{2}$ and $\eqref{3}$ together imply that $\int_G X_jdP_\theta = \int_G\frac{X_{(1)} + X_{(n)}}{2}dP_\theta$ for every $G \in \sigma(X_{(1)}, X_{(n)})$, hence $\eqref{1}$ holds.
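The identity behind $\eqref{2}$ and $\eqref{3}$ can also be checked numerically. The sketch below (my addition; the values of $\theta$, $n$, $s$, $t$ are arbitrary, subject only to $\theta < s < t < 2\theta$) compares Monte Carlo estimates of $\int_{X_{(1)} > s,\, X_{(n)} \leq t} X_1\,dP_\theta$ and $\int_{X_{(1)} > s,\, X_{(n)} \leq t} \frac{X_{(1)} + X_{(n)}}{2}\,dP_\theta$ against the closed form in $\eqref{2}$:

```python
import random

# Compare E[X_1; A] and E[(X_(1)+X_(n))/2; A] on A = {X_(1) > s, X_(n) <= t}
# with the closed form s(t-s)^n / theta^n + (t-s)^(n+1) / (2 theta^n) from (2).
random.seed(1)
theta, n, s, t = 1.0, 5, 1.2, 1.7  # arbitrary values with theta < s < t < 2*theta
reps = 400_000
lhs = rhs = 0.0
for _ in range(reps):
    x = [random.uniform(theta, 2.0 * theta) for _ in range(n)]
    lo, hi = min(x), max(x)
    if lo > s and hi <= t:  # the event A
        lhs += x[0]
        rhs += (lo + hi) / 2.0
lhs /= reps
rhs /= reps
closed = s * (t - s) ** n / theta ** n + (t - s) ** (n + 1) / (2.0 * theta ** n)
# lhs, rhs, and closed should agree up to Monte Carlo error
```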

Substituting $s = \theta$ and $t = 2\theta$ into $\eqref{3}$ yields $E[X_{(1)} + X_{(n)}] = 3\theta$. By $\eqref{1}$ and the Rao-Blackwell theorem, $\frac{1}{3}(X_{(1)} + X_{(n)})$ is an unbiased estimator of $\theta$ with smaller variance than the pre-conditioning unbiased estimator $\frac{2}{3}X_j$.
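A small simulation (again my addition, with arbitrary illustration values of $\theta$ and $n$) shows the Rao-Blackwell improvement: both estimators average to $\theta$, but $\frac{1}{3}(X_{(1)} + X_{(n)})$ has markedly smaller variance than $\frac{2}{3}X_1$:

```python
import random
import statistics

# Compare the crude unbiased estimator (2/3) X_1 with the Rao-Blackwellized
# estimator (X_(1) + X_(n)) / 3; both should average to theta.
random.seed(2)
theta, n, reps = 2.0, 5, 200_000  # arbitrary illustration values
crude, rb = [], []
for _ in range(reps):
    x = [random.uniform(theta, 2.0 * theta) for _ in range(n)]
    crude.append(2.0 / 3.0 * x[0])
    rb.append((min(x) + max(x)) / 3.0)
# Both sample means are near theta; the Rao-Blackwellized variance is much smaller.
```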
