This volume provides a general overview of discrete- and continuous-time Markov control processes and stochastic games, along with a look at the range of applications of stochastic control and some of its recent theoretical developments. These topics include various aspects of dynamic programming, approximation algorithms, and infinite-dimensional linear programming. In all, the work comprises 18 carefully selected papers written by experts in their respective fields. Optimization, Control, and Applications of Stochastic Systems will be a valuable resource for practitioners, researchers, and professionals in applied mathematics and operations research who work in the areas of stochastic control, mathematical finance, queueing theory, and inventory systems. It may also serve as a supplemental text for graduate courses in optimal control and dynamic games.