Here is a basic state space model (a state equation for $x_t$, an observation equation for $y_t$, and an initial distribution for the state):
$$x_t = x_{t-1} + w_t, \quad w_t \sim N(0, \sigma^2_w)$$ $$y_t = x_t + v_t, \quad v_t \sim N(0, \sigma^2_v)$$ $$x_0 \sim N(m_0, \sigma^2_0)$$
I want to add a random effect $\alpha_j$ to this model, for subjects $j = 1, \dots, J$ observed at times $t = 1, \dots, T$:
$$y_{jt} = \mu + \alpha_j + x_{jt} + v_{jt}, \quad v_{jt} \sim N(0, \sigma^2_v)$$ $$x_{jt} = x_{j,t-1} + w_{jt}, \quad w_{jt} \sim N(0, \sigma^2_w)$$ $$\alpha_j \sim N(0, \sigma^2_\alpha)$$ $$x_{j0} \sim N(m_0, \sigma^2_0)$$
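To make the setup concrete, here is a minimal simulation of the random-effects model (a sketch; `J`, `T`, and all parameter values below are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions and parameter values (all made up)
J, T = 10, 50                                  # subjects, time points
mu, m0 = 2.0, 0.0                              # fixed intercept, initial state mean
sd_w, sd_v, sd_a, sd_0 = 0.3, 0.5, 1.0, 1.0    # state, observation, random effect, initial sd

alpha = rng.normal(0.0, sd_a, size=J)          # alpha_j ~ N(0, sigma_alpha^2)
x = np.empty((J, T + 1))
x[:, 0] = rng.normal(m0, sd_0, size=J)         # x_{j0} ~ N(m_0, sigma_0^2)
for t in range(1, T + 1):                      # random walk: x_{jt} = x_{j,t-1} + w_{jt}
    x[:, t] = x[:, t - 1] + rng.normal(0.0, sd_w, size=J)

# Observation equation: y_{jt} = mu + alpha_j + x_{jt} + v_{jt}
y = mu + alpha[:, None] + x[:, 1:] + rng.normal(0.0, sd_v, size=(J, T))
```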
I am having difficulty finding references that show whether this is possible and how to do it, so I am trying to derive it myself, treating the problem the way I would a linear mixed effects regression.
For parameters $\theta = (\mu, \sigma^2_w, \sigma^2_v, \sigma^2_\alpha, m_0, \sigma^2_0)$, the complete-data density should factor as:
$$p(\mathbf{y}, \mathbf{x}, \boldsymbol{\alpha} | \theta) = p(\mathbf{y}|\mathbf{x}, \boldsymbol{\alpha}, \theta) \cdot p(\mathbf{x}|\theta) \cdot p(\boldsymbol{\alpha}|\theta)$$
For the observation equation, since $y_{jt} \mid x_{jt}, \alpha_j \sim N(\mu + \alpha_j + x_{jt}, \sigma^2_v)$, the conditional likelihood is:
$$p(\mathbf{y}|\mathbf{x}, \boldsymbol{\alpha}, \theta) = (2\pi \sigma^2_v)^{-JT/2} \exp\left\{-\frac{1}{2\sigma^2_v}\sum_{j=1}^{J}\sum_{t=1}^{T}(y_{jt} - \mu - \alpha_j - x_{jt})^2\right\}$$
For the state equation, since $x_{jt} | x_{j,t-1} \sim N(x_{j,t-1}, \sigma^2_w)$ and $x_{j0} \sim N(m_0, \sigma^2_0)$:
$$p(\mathbf{x}|\theta) = (2\pi \sigma^2_0)^{-J/2} \exp\left\{-\frac{1}{2\sigma^2_0}\sum_{j=1}^{J}(x_{j0} - m_0)^2\right\} \times$$ $$(2\pi \sigma^2_w)^{-JT/2} \exp\left\{-\frac{1}{2\sigma^2_w}\sum_{j=1}^{J}\sum_{t=1}^{T}(x_{jt} - x_{j,t-1})^2\right\}$$
The random effects density is:
$$p(\boldsymbol{\alpha}|\theta) = (2\pi\sigma^2_\alpha)^{-J/2} \exp\left\{-\frac{1}{2\sigma^2_\alpha}\sum_{j=1}^{J}\alpha_j^2\right\}$$
Putting everything together:
$$p(\mathbf{y}, \mathbf{x}, \boldsymbol{\alpha} | \theta) = (2\pi)^{-J(T+1)} (\sigma^2_v)^{-JT/2} (\sigma^2_w)^{-JT/2} (\sigma^2_0)^{-J/2} (\sigma^2_\alpha)^{-J/2} \times$$ $$\exp\left\{-\frac{1}{2}\left[\frac{1}{\sigma^2_v}\sum_{j=1}^{J}\sum_{t=1}^{T}(y_{jt} - \mu - \alpha_j - x_{jt})^2 + \frac{1}{\sigma^2_w}\sum_{j=1}^{J}\sum_{t=1}^{T}(x_{jt} - x_{j,t-1})^2\right.\right.$$ $$\left.\left.+ \frac{1}{\sigma^2_0}\sum_{j=1}^{J}(x_{j0} - m_0)^2 + \frac{1}{\sigma^2_\alpha}\sum_{j=1}^{J}\alpha_j^2\right]\right\}$$
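As a numerical cross-check of the algebra above, the complete-data log-density in code (a sketch; the array layout, with `x[:, 0]` holding $x_{j0}$, is my own convention):

```python
import numpy as np

def complete_data_logdensity(y, x, alpha, mu, s2_w, s2_v, s2_a, m0, s2_0):
    """log p(y, x, alpha | theta) for the random-effects local level model.

    y:     (J, T) observations
    x:     (J, T+1) latent states, with column 0 holding x_{j0}
    alpha: (J,) random intercepts
    """
    J, T = y.shape
    resid_obs = y - mu - alpha[:, None] - x[:, 1:]   # y_{jt} - mu - alpha_j - x_{jt}
    resid_state = np.diff(x, axis=1)                 # x_{jt} - x_{j,t-1}
    ll  = -0.5 * J * T * np.log(2 * np.pi * s2_v) - 0.5 * np.sum(resid_obs**2) / s2_v
    ll += -0.5 * J * T * np.log(2 * np.pi * s2_w) - 0.5 * np.sum(resid_state**2) / s2_w
    ll += -0.5 * J * np.log(2 * np.pi * s2_0) - 0.5 * np.sum((x[:, 0] - m0) ** 2) / s2_0
    ll += -0.5 * J * np.log(2 * np.pi * s2_a) - 0.5 * np.sum(alpha**2) / s2_a
    return ll
```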
The marginal likelihood requires integrating out both the latent states and the random effects (I haven't worked everything out yet, but I suspect the result is multivariate normal):
$$p(\mathbf{y}|\theta) = \int \int p(\mathbf{y}, \mathbf{x}, \boldsymbol{\alpha} | \theta) \, d\mathbf{x} \, d\boldsymbol{\alpha}$$
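A sanity check on the "it might be normal" guess (my own sketch, not from a reference): the random intercept can be absorbed into an augmented state with an identity transition,

$$\tilde{x}_{jt} = \begin{pmatrix} x_{jt} \\ \alpha_j \end{pmatrix}, \qquad \tilde{x}_{jt} = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}\tilde{x}_{j,t-1} + \begin{pmatrix} w_{jt} \\ 0 \end{pmatrix}, \qquad y_{jt} = \mu + \begin{pmatrix} 1 & 1 \end{pmatrix}\tilde{x}_{jt} + v_{jt}$$

$$\tilde{x}_{j0} \sim N\!\left(\begin{pmatrix} m_0 \\ 0 \end{pmatrix}, \begin{pmatrix} \sigma^2_0 & 0 \\ 0 & \sigma^2_\alpha \end{pmatrix}\right)$$

This is again a linear Gaussian state space model, so $(\mathbf{y}, \mathbf{x}, \boldsymbol{\alpha})$ is jointly Gaussian, the marginal $p(\mathbf{y}|\theta)$ is multivariate normal, and the Kalman filter would still apply if I wanted it.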
From here, the plan is to estimate the posterior distributions of all parameters with a Bayesian approach: after placing priors on all parameters, I would use this likelihood along with some version of MCMC to estimate the posteriors (and completely bypass the Kalman filter).
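For the MCMC step, a minimal sketch of how I imagine coding this up, assuming PyMC (the prior choices and the `GaussianRandomWalk` usage reflect my assumptions about the current API, and I pin the initial-state hyperparameters for simplicity):

```python
import numpy as np
import pymc as pm

# y: (J, T) array of observations (e.g., from the simulation above)
J, T = y.shape
m0, sd_0 = 0.0, 1.0          # initial-state hyperparameters pinned for simplicity;
                             # they could be given priors as well

with pm.Model() as model:
    # Priors on theta (illustrative choices, not recommendations)
    mu = pm.Normal("mu", 0.0, 10.0)
    sigma_v = pm.HalfNormal("sigma_v", 1.0)
    sigma_w = pm.HalfNormal("sigma_w", 1.0)
    sigma_a = pm.HalfNormal("sigma_a", 1.0)

    # Random intercepts and latent random walks (time on the last axis)
    alpha = pm.Normal("alpha", 0.0, sigma_a, shape=J)
    x = pm.GaussianRandomWalk(
        "x", sigma=sigma_w,
        init_dist=pm.Normal.dist(m0, sd_0),
        shape=(J, T + 1),
    )

    # Observation equation: y_{jt} = mu + alpha_j + x_{jt} + v_{jt}
    pm.Normal("y_obs", mu=mu + alpha[:, None] + x[:, 1:], sigma=sigma_v, observed=y)

    idata = pm.sample()      # NUTS samples theta, x, and alpha jointly
```

This targets the joint posterior $p(\theta, \mathbf{x}, \boldsymbol{\alpha} \mid \mathbf{y})$ directly, so the marginal-likelihood integral (and the Kalman filter) never has to be computed explicitly.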
Is this a statistically valid estimation process for adding random effects to a model like this?