Stochastic modeling for inventory and production planning in the paper industry



Problem formulations and solution procedures for production planning and inventory management in manufacturing systems under uncertainty are discussed. Markov decision processes and controlled Markovian dynamic systems are used in the models. Formulating an inventory problem in discrete time as a finite-state Markov chain leads to a Markov decision process model, and applying the policy-improvement algorithm yields the optimal inventory policy. In controlled dynamic system modeling, the random demand and capacity processes involved in planning are described by two finite-state, continuous-time Markov chains. This approach embeds the randomness directly in the differential equations of the system. The optimal production rates that minimize an expected cost are obtained by numerically solving the corresponding Hamilton–Jacobi–Bellman (HJB) equations. To overcome the so-called curse of dimensionality frequently encountered in this computation, we resort to a hierarchical approach. Illustrative examples using data collected from a large paper manufacturer are provided. © 2004 American Institute of Chemical Engineers AIChE J, 50: 2877–2890, 2004
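The discrete-time formulation described above can be illustrated with a minimal sketch of the policy-improvement (policy iteration) algorithm for a finite-state inventory Markov decision process. All quantities below (storage capacity, demand distribution, holding/shortage/ordering costs, discount factor) are illustrative assumptions for exposition, not data or parameters from the paper:

```python
import numpy as np

# Minimal policy-improvement sketch for a finite-state inventory MDP.
# States are inventory levels 0..M; actions are order quantities.
# All numeric values here are assumed for illustration only.

M = 5                                   # maximum inventory level (assumed)
gamma = 0.95                            # discount factor (assumed)
demand_pmf = {0: 0.3, 1: 0.4, 2: 0.3}   # assumed one-period demand distribution

holding_cost = 1.0                      # cost per unit held (assumed)
shortage_cost = 4.0                     # cost per unit of unmet demand (assumed)
order_cost = 2.0                        # cost per unit ordered (assumed)

states = range(M + 1)

def actions(s):
    """Feasible order quantities: on-hand plus order cannot exceed capacity."""
    return range(M - s + 1)

def transition(s, a):
    """Yield (next_state, probability, immediate_cost) for ordering a in state s."""
    for d, p in demand_pmf.items():
        on_hand = s + a
        next_s = max(on_hand - d, 0)
        cost = (order_cost * a
                + holding_cost * next_s
                + shortage_cost * max(d - on_hand, 0))
        yield next_s, p, cost

def evaluate(policy):
    """Policy evaluation: solve V = c_pi + gamma * P_pi V as a linear system."""
    n = M + 1
    P = np.zeros((n, n))
    c = np.zeros(n)
    for s in states:
        for next_s, p, cost in transition(s, policy[s]):
            P[s, next_s] += p
            c[s] += p * cost
    return np.linalg.solve(np.eye(n) - gamma * P, c)

def policy_iteration():
    """Alternate evaluation and greedy improvement until the policy is stable."""
    policy = [0] * (M + 1)              # start from the "never order" policy
    while True:
        V = evaluate(policy)
        improved = [min(actions(s),
                        key=lambda a: sum(p * (cost + gamma * V[ns])
                                          for ns, p, cost in transition(s, a)))
                    for s in states]
        if improved == policy:          # stable policy is optimal
            return policy, V
        policy = improved

policy, V = policy_iteration()
print("optimal order quantity per inventory level:", policy)
```

Since the state and action sets are finite, policy iteration terminates in finitely many steps; the same structure underlies the model in the paper, although the actual cost and demand data there come from the paper manufacturer's operations.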