Consider the following random variable $$ Z=\min_{-n\leq i\leq n}\{X_i+Y_i\}, $$ where $X_i\overset{\mathrm{iid}}{\sim}\text{Exp}(\lambda)$, the $Y_i\sim\text{Erlang}(|i|,\gamma)$ are independent across $i$ (with the convention $Y_0\equiv 0$, since an Erlang distribution with shape $0$ is degenerate at zero), and $X_i \perp Y_i$ for all $i$.
One can think of $Z$ as the time of the first wave arrival at $i=0$, with incoming waves starting from different locations $i$. For each $i$, we assume it takes time $X_i$ to initiate a wave at point $i$ and time $Y_i$ for that wave to reach $i=0$; $Z$ is the minimum over all possible arrival times.
What can we say about $\mathbb{E}[Z]$?
Some ideas: if we replace each $Y_i$ by its expected value, $\mathbb{E}[Y_i]=|i|/\gamma$, we can actually compute a closed formula for $\mathbb{E}[Z]$, assuming non-negative time: $$\tag{1} \mathbb{E}[Z]=\mathbb{E}\left[\min_i\{X_i+|i|/\gamma\}\right]=\int_0^\infty \prod_{i=-n}^{n} \min \{ 1,\exp(-\lambda(t-|i|/\gamma)) \} \,dt, $$ which can then be split into a series of different integrals. For full derivation details, see this answer in the context of an activation process.
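As a sanity check, expression (1) can be evaluated numerically; the parameter values below ($\lambda=1$, $\gamma=2$, $n=5$) are illustrative assumptions, not values from the problem:

```python
import numpy as np

# Illustrative parameters (assumed, not from the post)
lam, gam, n = 1.0, 2.0, 5

def survival_det(t):
    """Integrand of (1): prod_{i=-n}^{n} min{1, exp(-lam*(t - |i|/gam))}."""
    i = np.arange(-n, n + 1)
    terms = np.minimum(1.0, np.exp(-lam * (t[:, None] - np.abs(i) / gam)))
    return terms.prod(axis=1)

# Truncate the integral at t = 20 (the integrand decays at least like e^{-lam*t})
t = np.linspace(0.0, 20.0, 20001)
y = survival_det(t)
EZ_det = float(np.sum((y[1:] + y[:-1]) * np.diff(t)) / 2.0)  # trapezoid rule
print(EZ_det)
```

Since $Z\leq X_0$ here, the result should land strictly between $0$ and $\mathbb{E}[X_0]=1/\lambda$.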
If this "wave arrival time" $Y_i$ instead follows an Erlang distribution, the problem becomes slightly more complex. However, averaged simulations of the activation process under these distributional assumptions suggest that expression (1) remains a good enough approximation, so there appears to be little difference between using the random variables themselves and their expected values. I would like to show this mathematically, either by deriving a similar expression for Erlang-distributed $Y_i$, or by other means. Any ideas?
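The claimed agreement can be probed with a quick Monte Carlo sketch (parameters $\lambda=1$, $\gamma=2$, $n=5$ are assumed for illustration). Note that, for fixed $X$, the minimum is a concave function of the vector $(Y_i)$, so by Jensen's inequality randomness in $Y$ can only pull $\mathbb{E}[Z]$ down relative to (1):

```python
import numpy as np

rng = np.random.default_rng(0)
lam, gam, n = 1.0, 2.0, 5           # assumed illustrative parameters
N = 200_000                         # Monte Carlo replications
m = np.abs(np.arange(-n, n + 1))    # Erlang shapes |i|

X = rng.exponential(1.0 / lam, size=(N, 2 * n + 1))
# Erlang(m, gam) = Gamma(shape=m, scale=1/gam); take Y_0 = 0 at i = 0
Y = np.where(m > 0, rng.gamma(np.maximum(m, 1), 1.0 / gam, size=X.shape), 0.0)

EZ_rand = float((X + Y).min(axis=1).mean())       # Y_i ~ Erlang(|i|, gam)
EZ_det = float((X + m / gam).min(axis=1).mean())  # Y_i replaced by |i|/gam
print(EZ_rand, EZ_det)
```

Reusing the same draws of $X$ for both estimates couples them and reduces the Monte Carlo noise in the comparison.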
Further insights: From this paper, we have the following result
Corollary 6.2. Let $Z = Y_1 + \cdots + Y_k$ be the sum of $k$ independent random variables having the Erlang distribution with parameters $(m_j, \gamma_j)$ for $j = 1, \dots, k$, and $\gamma_j \neq \gamma_p$ for $j \neq p$. Then the density for $Z$, $f_Z(t)$, for $t \geq 0$ is given by $$ f_Z(t) = \prod_{j=1}^k \gamma_j^{m_j} \left\{ \sum_{j=1}^k e^{-\gamma_j t} (-1)^{m_j-1} \times \sum_{\sum_{p=1}^k r_p = m_j-1, \, r_p \geq 0} \frac{(-t)^{r_j}}{r_j!} \prod_{q=1, q \neq j}^k \frac{(m_q + r_q - 1)!}{(\gamma_q - \gamma_j)^{m_q + r_q}} \right\}. $$ In our case, let $Z_i\equiv X_i+Y_i$ and set $k=2$, $(m_1,\gamma_1)=(1,\lambda)$, and $(m_2,\gamma_2)=(|i|,\gamma)$, with $\lambda\neq\gamma$ (for $i=0$ we simply have $Z_0=X_0\sim\text{Exp}(\lambda)$). Writing $m=|i|$ and integrating $f_{Z_i}$, the CDF $F_{Z_i}(t)$ is given by $$ F_{Z_i}(t) = \frac{(m-1)!\,\gamma^{m}}{(\gamma-\lambda)^{m}}\left(1-e^{-\lambda t}\right) - \lambda \gamma^{m} \sum_{r_2=0}^{m-1} \frac{r_1!}{r_2!\,(\gamma - \lambda)^{1 + r_1}} \left( \frac{\Gamma(r_2+1,0) - \Gamma(r_2+1, \gamma t)}{\gamma^{r_2+1}} \right), \qquad r_1 = m-1-r_2, $$ where $\Gamma$ is the upper incomplete Gamma function, given by $$ \Gamma(s, z) = \int_z^\infty x^{s-1} e^{-x} \, dx, \quad \text{for } s > 0 \text{ and } z \geq 0 $$ (note that $\Gamma(r_2+1,0)=r_2!$). From this question, if the $Z_i$ were iid, the CDF of the minimum would be $F_Z(t)=1-[1-F_{Z_1}(t)]^{2n+1}$; here, however, the $Z_i$ are independent but not identically distributed (the Erlang shape depends on $|i|$), so the survival functions multiply instead: $1-F_Z(t)=\prod_{i=-n}^{n}[1-F_{Z_i}(t)]$. Hence, $$ \begin{align} \mathbb{E}[Z]&=\int_0^\infty (1-F_Z(t))\,dt\\ &= \int_0^\infty \prod_{i=-n}^{n} \bigl(1-F_{Z_i}(t)\bigr) \, dt. \end{align} $$ I wonder if there is a simplification somewhere.