Publications by Year: 2003

Ballou, R.H. & Burnetas, A., 2003. Planning multiple location inventories. Journal of Business Logistics, 24, pp.65-89.
Burnetas, A.N. & Katehakis, M.N., 2003. Asymptotic Bayes analysis for the finite-horizon one-armed-bandit problem. Probability in the Engineering and Informational Sciences, 17, pp.53-82.
Abstract: The multiarmed-bandit problem is often taken as a basic model for the trade-off between the exploration and utilization required for efficient optimization under uncertainty. In this article, we study the situation in which the unknown performance of a new bandit is to be evaluated and compared with that of a known one over a finite horizon. We assume that the bandits represent random variables with distributions from the one-parameter exponential family. When the objective is to maximize the Bayes expected sum of outcomes over a finite horizon, it is shown that optimal policies tend to simple limits when the length of the horizon is large.
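To make the finite-horizon trade-off described in the abstract concrete, the following minimal sketch (Python, not taken from the paper) solves a small Bernoulli instance of the one-armed-bandit problem by backward induction: one arm pays a known mean reward, the other carries a Beta prior, and the Bayes expected sum of outcomes is maximized over the remaining pulls. All names and numbers here (LAMBDA, the Beta(1, 1) prior, the horizon of 20) are illustrative assumptions; the paper itself treats the general one-parameter exponential family and studies the large-horizon limit of such policies.

    # Illustrative sketch only: Bayes-optimal play for a finite-horizon
    # one-armed bandit via dynamic programming (assumed Bernoulli/Beta setup).
    from functools import lru_cache

    LAMBDA = 0.6  # known arm's expected reward per pull (assumed value)

    @lru_cache(maxsize=None)
    def value(a: int, b: int, t: int) -> float:
        """Bayes expected sum of rewards over t remaining pulls,
        given a Beta(a, b) posterior on the unknown arm's success probability."""
        if t == 0:
            return 0.0
        # Option 1: pull the known arm; the posterior does not change.
        known = LAMBDA + value(a, b, t - 1)
        # Option 2: pull the unknown arm; p is the posterior predictive
        # probability of a success, which updates the Beta parameters.
        p = a / (a + b)
        unknown = (p * (1.0 + value(a + 1, b, t - 1))
                   + (1.0 - p) * value(a, b + 1, t - 1))
        return max(known, unknown)

    def optimal_first_action(a: int, b: int, horizon: int) -> str:
        """Arm the Bayes-optimal policy pulls first from posterior state (a, b)."""
        known = LAMBDA + value(a, b, horizon - 1)
        p = a / (a + b)
        unknown = (p * (1.0 + value(a + 1, b, horizon - 1))
                   + (1.0 - p) * value(a, b + 1, horizon - 1))
        return "unknown arm" if unknown >= known else "known arm"

    if __name__ == "__main__":
        # Uniform Beta(1, 1) prior on the unknown arm, horizon of 20 pulls.
        print(value(1, 1, 20))                 # Bayes expected total reward
        print(optimal_first_action(1, 1, 20))  # which arm to try first

With a uniform prior the policy typically experiments with the unknown arm early and retires to the known arm once the posterior mean falls sufficiently below LAMBDA; the simple limiting form of such policies as the horizon grows is the subject of the paper's asymptotic analysis.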