
I aim to solve the following Bellman equation:

\begin{equation} v(\vec{s}) = \min_{\vec{x} \in \Xi_{\vec{s}}} \big\{c(\vec{s}, \vec{x}) + \lambda \times \sum_{\vec{s}^{'}\in S} p(\vec{s}^{'} | \vec{s}, \vec{x}) \times v(\vec{s}^{'})\big\} \hspace{2cm} \forall \vec{s} \in S \end{equation}

I use the linear programming approach to solve this equation, as follows:

Model 1: \begin{equation} \max \Big\{\sum_{\vec{s} \in S} \eta(\vec{s}) \times v(\vec{s})\Big\} \end{equation} subject to: \begin{equation} c(\vec{s}, \vec{x}) + \lambda \times \sum_{\vec{s}^{'} \in S} p(\vec{s}^{'} | \vec{s}, \vec{x}) \times v(\vec{s}^{'}) \ge v(\vec{s}) \hspace{1cm} \forall \vec{s} \in S; \vec{x} \in \Xi_{\vec{s}} \end{equation} \begin{equation} v(\vec{s}) \in \mathbb{R}_{\geq0} \end{equation}
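As a sanity check, Model 1 can be solved directly for a toy instance with an off-the-shelf LP solver. The sketch below uses entirely hypothetical data (2 states, 2 actions) and scipy's `linprog`; the constraint is rearranged as $v(\vec{s}) - \lambda \sum_{\vec{s}^{'}} p(\vec{s}^{'} | \vec{s}, \vec{x}) \, v(\vec{s}^{'}) \le c(\vec{s}, \vec{x})$ and the objective is negated, since `linprog` minimizes:

```python
import numpy as np
from scipy.optimize import linprog

# Toy MDP with 2 states and 2 actions per state (all data hypothetical).
n_s, n_x = 2, 2
lam = 0.9                                  # discount factor lambda
eta = np.array([0.5, 0.5])                 # state-relevance weights eta(s)
c = np.array([[1.0, 2.0],                  # immediate costs c(s, x)
              [0.5, 3.0]])
p = np.array([[[0.8, 0.2], [0.1, 0.9]],    # p[s, x, s'] = p(s' | s, x)
              [[0.5, 0.5], [0.9, 0.1]]])

# Model 1: max eta.v  s.t.  v(s) - lam * sum_{s'} p(s'|s,x) v(s') <= c(s,x)
# for every pair (s, x).
A_ub, b_ub = [], []
for s in range(n_s):
    for x in range(n_x):
        row = -lam * p[s, x]
        row[s] += 1.0
        A_ub.append(row)
        b_ub.append(c[s, x])

res = linprog(-eta, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
              bounds=[(0, None)] * n_s)
v = res.x  # with eta > 0, v coincides with the Bellman fixed point
```

At the optimum, each $v(\vec{s})$ attains the minimum over actions of the right-hand side, which is exactly the Bellman equation above.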

Due to the curse of dimensionality, I approximate the value function with basis functions and solve the dual of the above problem using column generation (further details can be found in doi:10.1016/j.ejor.2012.06.046). The column generation algorithm comprises a master problem and a subproblem, as follows:

Model 2, the master problem, which the algorithm solves iteratively; at optimality, the shadow prices of its constraints equal the optimal approximation coefficients.

\begin{equation} \min \sum_{\vec{s} \in S} \sum_{\vec{x} \in \Xi_{\vec{s}}} c(\vec{s}, \vec{x}) \times \chi(\vec{s}, \vec{x}) \end{equation}

subject to:

\begin{equation} (1 - \lambda) \times \sum_{\vec{s} \in S} \sum_{\vec{x} \in \Xi_{\vec{s}}} \chi(\vec{s}, \vec{x}) = 1 \end{equation} \begin{equation} \sum_{\vec{s} \in S} \sum_{\vec{x} \in \Xi_{\vec{s}}} \mu_{koi}(\vec{s}, \vec{x}) \times \chi(\vec{s}, \vec{x}) \geq \mathbb{E}_{\eta} [p^w_{koi}] \hspace{1cm} \forall k \in \mathcal{K}; o \in \mathcal{O}; i \in \mathcal{I} \end{equation} \begin{equation} \sum_{\vec{s} \in S} \sum_{\vec{x} \in \Xi_{\vec{s}}} \delta_{kti}(\vec{s}, \vec{x}) \times \chi(\vec{s}, \vec{x}) \geq \mathbb{E}_{\eta} [p^m_{kti}] \hspace{1cm} \forall k \in \mathcal{K}; t \in \mathcal{T}; i \in \mathcal{I} \end{equation} \begin{equation} \chi(\vec{s}, \vec{x}) \in \mathbb{R}_{\geq0} \end{equation}
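For a finite set of generated columns, the restricted version of Model 2 is itself a small LP whose dual prices supply the coefficients $V^0$, $V^w_{koi}$ and $V^m_{kti}$ used by the subproblem. Below is a minimal sketch with hypothetical data (three columns, one aggregated $\mu$ row and one $\delta$ row); the duals are read from scipy's HiGHS backend (scipy >= 1.7), whose `marginals` sign conventions are noted in the comments:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical restricted master for Model 2: 3 columns (state-decision
# pairs), one normalization row, one mu row, one delta row.
lam = 0.9
cost = np.array([4.0, 2.5, 3.0])                 # c(s, x) per column
A_eq = np.array([[1 - lam, 1 - lam, 1 - lam]])   # (1-lam) * sum chi = 1
b_eq = np.array([1.0])
A_ge = np.array([[1.0, 0.0, 2.0],                # mu_koi(s, x) coefficients
                 [0.0, 1.0, 1.0]])               # delta_kti(s, x) coefficients
b_ge = np.array([3.0, 2.0])                      # E_eta[p^w], E_eta[p^m]

# linprog takes <= rows, so the >= rows are negated.
res = linprog(cost, A_ub=-A_ge, b_ub=-b_ge, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * 3)
chi = res.x
V0 = res.eqlin.marginals[0]      # dual of the normalization constraint
Vw, Vm = -res.ineqlin.marginals  # duals of the >= rows (nonnegative)
```

These duals are then plugged into the objective of Model 3 to price out a new state-decision pair.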

Model 3, the sub-problem, which finds feasible state-decision pairs to add to the master problem. \begin{equation} \max \Big\{(1 - \lambda) \times V^0 + \sum_{k \in \mathcal{K}} \sum_{o \in \mathcal{O}} \sum_{i \in \mathcal{I}} V^w_{koi} \times \mu_{koi}(\vec{s}, \vec{x}) + \sum_{k \in \mathcal{K}} \sum_{t \in \mathcal{T}} \sum_{i \in \mathcal{I}} V^m_{kti} \times \delta_{kti}(\vec{s}, \vec{x}) - c(\vec{s}, \vec{x})\Big\} \end{equation}

subject to:

\begin{equation} \sum_{p \in \mathcal{P}^w} \sum_{j \in \mathcal{J}} \sum_{t \in \mathcal{T}} \sum_{\substack{b \in \mathcal{B}_c \\ c \in \mathcal{C}}} z_{pjtib} \times R^1_{pk} \times R^1_{po} \leq p^w_{koi} \hspace{1cm} \forall k \in \mathcal{K}; o \in \mathcal{O}; i \in \mathcal{I} \end{equation} \begin{equation} \sum_{p \in \mathcal{P}^m} \sum_{j \in \mathcal{J}} \sum_{\substack{b \in \mathcal{B}_c \\ c \in \mathcal{C}}} z_{pjtib} \times R^1_{pk} = p^m_{kti} \hspace{1cm} \forall k \in \mathcal{K}; t \in \mathcal{T}; i \in \mathcal{I} \end{equation} \begin{equation} (\vec{s}, \vec{x}) \in S \times \Xi_{\vec{s}} \end{equation}

My problem starts here! To initiate the column generation algorithm, I need at least one feasible state-decision pair for Model 2. As suggested by Saure et al. (2012), I iteratively solve Models 4 and 3 until the objective function of Model 4 reaches zero (meaning that a set of feasible state-decision pairs has been found for Model 2, so I can start the column generation). Starting from an initial hypothesized state, I solve Model 4, find the shadow prices of its constraints, update $V^0$, $V^w_{koi}$ and $V^m_{kti}$, and then solve Model 3 (to update $P$ for Model 4). For the first few iterations of this loop, the objective of Model 4 improves. However, Model 3 then stops producing new values for the decision variables (it keeps generating an identical solution).
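For reference, the generic shape of this phase-1 loop can be sketched as below. This is a skeleton, not the exact Model 4: the normalization row is folded into a generic right-hand side `rhs`, and `price_column` is a hypothetical stand-in for Model 3 (a toy oracle pricing columns from a fixed pool). The skeleton makes the stopping conditions explicit, including the case where the subproblem returns a column already present in the restricted master:

```python
import numpy as np
from scipy.optimize import linprog

def phase1(columns, rhs, price_column, tol=1e-9, max_iter=50):
    """Generic phase-1 column generation: drive the artificial variables
    U to zero. `price_column(duals)` is a hypothetical stand-in for the
    pricing subproblem; it returns a candidate column (array of constraint
    coefficients) or None. Duals come from scipy's HiGHS backend."""
    m = len(rhs)
    for _ in range(max_iter):
        n = len(columns)
        # Restricted master: min sum(U)  s.t.  A*chi + I*U >= rhs,
        # written as -[A | I] [chi; U] <= -rhs for linprog.
        A = np.hstack([np.column_stack(columns) if n else np.empty((m, 0)),
                       np.eye(m)])
        cost = np.concatenate([np.zeros(n), np.ones(m)])
        res = linprog(cost, A_ub=-A, b_ub=-np.asarray(rhs, float),
                      bounds=[(0, None)] * (n + m))
        if res.fun < tol:
            return columns, True              # feasible start found
        duals = -res.ineqlin.marginals        # duals of the >= constraints
        new_col = price_column(duals)
        if new_col is None or any(np.allclose(new_col, a) for a in columns):
            return columns, False             # subproblem repeats: stuck
        columns.append(new_col)
    return columns, False

# Tiny demo: the toy oracle prices columns from a fixed pool.
def price(duals):
    pool = [np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([1.0, 1.0])]
    best = max(pool, key=lambda a: float(duals @ a))
    return best if duals @ best > 1e-9 else None

cols, feasible = phase1([], [1.0, 1.0], price)
```

In the demo, one pricing round adds the column that covers both rows, after which the artificials drop to zero and phase 1 terminates.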

What would be the reason for this issue?

Let me know if you need further information.

Model 4:

\begin{equation} \min \Big\{U^0 + \sum_{k \in \mathcal{K}} \sum_{o \in \mathcal{O}} \sum_{i \in \mathcal{I}} U^w_{koi} + \sum_{k \in \mathcal{K}} \sum_{t \in \mathcal{T}} \sum_{i \in \mathcal{I}} U^m_{kti}\Big\} \end{equation}

Subject to:

\begin{equation} (1 - \lambda) \times \sum_{\vec{s} \in P} \sum_{\vec{x} \in \Xi_{\vec{s}}} \chi(\vec{s}, \vec{x}) + U^0 = 1 \end{equation} \begin{equation} \sum_{\vec{s} \in P} \sum_{\vec{x} \in \Xi_{\vec{s}}} \mu_{koi}(\vec{s}, \vec{x}) \times \chi(\vec{s}, \vec{x}) + U^w_{koi} \geq \mathbb{E}_{\eta} [p^w_{koi}] \hspace{1cm} \forall k \in \mathcal{K}; o \in \mathcal{O}; i \in \mathcal{I} \end{equation} \begin{equation} \sum_{\vec{s} \in P} \sum_{\vec{x} \in \Xi_{\vec{s}}} \delta_{kti}(\vec{s}, \vec{x}) \times \chi(\vec{s}, \vec{x}) + U^m_{kti} \geq \mathbb{E}_{\eta} [p^m_{kti}] \hspace{1cm} \forall k \in \mathcal{K}; t \in \mathcal{T}; i \in \mathcal{I} \end{equation} \begin{equation} U^0, U^w_{koi}, U^m_{kti} \in \mathbb{R}_{\geq0} \end{equation}
