%0 Journal Article
%J Mathematics of Operations Research
%D Forthcoming
%T Two-Armed Restless Bandits with Imperfect Information: Stochastic Control and Indexability
%A Roland Fryer
%A Philipp Harms
%X We present a two-armed bandit model of decision making under uncertainty in which the expected return to investing in the "risky arm" increases when that arm is chosen and decreases when the "safe" arm is chosen. These dynamics are natural in applications such as human capital development, job search, and occupational choice. Using new insights from stochastic control, together with a monotonicity condition on the payoff dynamics, we show that optimal strategies in our model are stopping rules that can be characterized by an index which formally coincides with Gittins' index. Our result implies the indexability of a new class of restless bandit models.
%B Mathematics of Operations Research
%G eng