
Let $a$ and $p$ be positive integers, and consider the polynomial $$(1+x+\cdots+x^{p-1})^a = \sum_{i=0}^{a(p-1)} a_ix^i.$$ I'm looking for an asymptotic estimate of $\sum_{i=0}^{b} a_i$. If $p=2$, this is just the sum of the binomial coefficients $ {a \choose i} $ for $i$ from $0$ to $b$, and there is a known asymptotic estimate for this sum (see Exercise 9.42 in the book R. Graham, D. Knuth, O. Patashnik, Concrete Mathematics).

  • Asymptotic under what conditions on $p,a,b$? – Commented May 13 at 12:20
  • $p$ and $a/b$ are constant, and $a$ goes to infinity. – Commented May 13 at 13:26
  • Let $X_1, X_2, \dots, X_a$ be iid and drawn uniformly at random from $\{0,1,\dots, p-1\}$. What is the probability that $\sum X_i\le Ca$? In the $a\to \infty$ limit you get a normal distribution, and so theorems like the Berry-Esseen theorem will give you a precise asymptotic. – Commented May 13 at 13:33
  • Btw, $$a_i = \sum_{k=0}^{\lfloor i/p\rfloor} (-1)^{i+k(p+1)} \binom{a}{k}\binom{-a}{i-pk}.$$ (A quick numerical check of this formula is sketched below.) – Commented May 13 at 15:49
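
The closed form in the last comment is easy to check numerically. Below is a minimal Python sketch (the function names are mine, not from any library); it uses the equivalent expression $a_i=\sum_{k=0}^{\lfloor i/p\rfloor}(-1)^k\binom{a}{k}\binom{a+i-pk-1}{i-pk}$, obtained via the identity $\binom{-a}{j}=(-1)^j\binom{a+j-1}{j}$, so that only nonnegative binomial arguments appear.

```python
# Check the closed form for a_i against direct expansion of (1 + x + ... + x^(p-1))^a.
# Helper names are illustrative; only the standard library is used.
from math import comb

def coeffs_by_expansion(p, a):
    """Coefficients of (1 + x + ... + x^(p-1))^a via repeated polynomial multiplication."""
    poly = [1]
    for _ in range(a):
        new = [0] * (len(poly) + p - 1)
        for i, c in enumerate(poly):
            for j in range(p):
                new[i + j] += c
        poly = new
    return poly

def coeff_closed_form(p, a, i):
    """a_i = sum_{k=0}^{floor(i/p)} (-1)^k * C(a, k) * C(a + i - p*k - 1, i - p*k)."""
    return sum((-1) ** k * comb(a, k) * comb(a + i - p * k - 1, i - p * k)
               for k in range(i // p + 1))

p, a = 3, 5
expanded = coeffs_by_expansion(p, a)
assert all(coeff_closed_form(p, a, i) == c for i, c in enumerate(expanded))
print(expanded)  # [1, 5, 15, 30, 45, 51, 45, 30, 15, 5, 1]
```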

1 Answer


As mentioned in my comment, the situation is equivalent to the following: we have $X_1, \dots, X_a$ drawn iid uniformly from $\{0,1,\dots, p-1\}$, and since $\sum_{i=0}^{b} a_i = p^a\,\mathbb{P}(\sum X_i\le b)$, we wish to estimate $\mathbb{P}(\sum X_i\le Ca)$ for a constant $C$ (with $b=Ca$).

Let $Y_a=X_1+\cdots+X_a$ be the sum of the $X_i$. The Berry-Esseen theorem tells us, with $Z_a(x)$ the cumulative distribution function of $$\frac{Y_a-a\mu}{\sigma\sqrt{a}},$$ that $$|Z_a(x)-\Phi(x)|\le K\frac{\rho}{\sigma^3\sqrt{a}}$$ for some absolute constant $K$, where $\mu=\mathbb{E}(X_1)$, $\sigma^2=\operatorname{Var}(X_1)$, $\rho=\mathbb{E}(|X_1-\mu|^3)$, and $\Phi(x)$ is the cumulative distribution function of the standard normal. For $X_1$ uniform on $\{0,1,\dots,p-1\}$ we have $\mu=(p-1)/2$ and $\sigma^2=(p^2-1)/12$, so none of these quantities is hard to calculate, and the estimate we want, $\mathbb{P}(Y_a\le Ca)\approx\Phi\!\left(\frac{Ca-a\mu}{\sigma\sqrt{a}}\right)$, follows.
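
For concreteness, here is a minimal Python sketch (function names are mine) comparing the exact partial sum $\sum_{i\le b} a_i$ with the normal approximation $p^a\,\Phi\big((b+\tfrac12-a\mu)/(\sigma\sqrt{a})\big)$ for $b=Ca$; the continuity correction $+\tfrac12$ is a refinement I've added, not part of the theorem.

```python
# Compare the exact partial sum of coefficients with its normal approximation.
# Names are illustrative; only the standard library is used.
from math import comb, erf, sqrt

def coeffs(p, a):
    """Coefficients a_i of (1 + x + ... + x^(p-1))^a, via the closed form above."""
    return [sum((-1) ** k * comb(a, k) * comb(a + i - p * k - 1, i - p * k)
                for k in range(i // p + 1)) for i in range(a * (p - 1) + 1)]

def Phi(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def partial_sum_exact(p, a, b):
    return sum(coeffs(p, a)[: b + 1])

def partial_sum_normal(p, a, b):
    """Approximate sum_{i <= b} a_i by p^a * Phi((b + 1/2 - a*mu) / (sigma*sqrt(a)))."""
    mu = (p - 1) / 2                  # mean of a uniform variable on {0, ..., p-1}
    sigma = sqrt((p * p - 1) / 12)    # its standard deviation
    return p ** a * Phi((b + 0.5 - a * mu) / (sigma * sqrt(a)))

p, a = 3, 200
b = int(0.45 * a * (p - 1))           # a threshold of the form b = C*a
exact = partial_sum_exact(p, a, b)
approx = partial_sum_normal(p, a, b)
print(approx / exact)                 # close to 1 for large a
```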

  • If $C>\mu$, then the limit of the probability is $1$, by the law of large numbers. If $C=\mu$, then the limit is $1/2$, by the central limit theorem. If $C<\mu$, then the asymptotic of the probability is best described, not by the Berry--Esseen theorem, but by Petrov's extension of the Cramér theorem (see Theorem 6 in Petrov's paper). (A numerical illustration of these three regimes is sketched below.) – Commented May 13 at 14:17
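
As a rough numerical illustration of these three regimes (this is only a sketch using exact coefficient sums, not an implementation of Petrov's or Cramér's theorem; the function names are mine):

```python
# Exact P(X_1 + ... + X_a <= C*a) for C above, at, and below mu = (p-1)/2.
from math import comb

def coeffs(p, a):
    """Coefficients a_i of (1 + x + ... + x^(p-1))^a."""
    return [sum((-1) ** k * comb(a, k) * comb(a + i - p * k - 1, i - p * k)
                for k in range(i // p + 1)) for i in range(a * (p - 1) + 1)]

def prob_le(p, a, b):
    """P(X_1 + ... + X_a <= b) for X_i iid uniform on {0, ..., p-1}."""
    return sum(coeffs(p, a)[: b + 1]) / p ** a

p = 3  # so mu = 1
for a in (50, 100, 200):
    print(f"a={a}:",
          f"C>mu -> {prob_le(p, a, int(1.2 * a)):.4f}",   # tends to 1
          f"C=mu -> {prob_le(p, a, a):.4f}",               # tends to 1/2
          f"C<mu -> {prob_le(p, a, int(0.8 * a)):.2e}")    # decays exponentially in a
```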
