# Context Attentive Bandits: Contextual Bandit with Restricted Context

10 May 2017 · Djallel Bouneffouf, Irina Rish, Guillermo A. Cecchi, Raphael Feraud

We consider a novel formulation of the multi-armed bandit model, which we call the contextual bandit with restricted context, where only a limited number of features can be accessed by the learner at every iteration. This formulation is motivated by various online problems arising in clinical trials, recommender systems, and attention modeling.
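To make the setting concrete, here is a minimal toy simulation of a contextual bandit with restricted context: at each round the learner may observe only `k` of the `d` context features before choosing an arm. Everything below (the sparse linear reward model, the two-stage epsilon-greedy policy, and all names) is an illustrative assumption, not the algorithm proposed in the paper.

```python
import random

def restricted_context_bandit(num_rounds=5000, d=5, k=2, n_arms=3, eps=0.1, seed=1):
    """Toy simulation: the learner picks k of d features to observe,
    then picks an arm using only those coordinates.
    Illustrative sketch only; reward model is an assumption."""
    rng = random.Random(seed)
    # Hidden sparse linear weights per arm: arm a's reward depends on
    # a single feature, so a small observed subset can suffice.
    w = [[0.0] * d for _ in range(n_arms)]
    for a in range(n_arms):
        w[a][a % d] = 1.0
    # Running per-(arm, feature) reward statistics, updated only for
    # features the learner actually observed.
    n = [[1.0] * d for _ in range(n_arms)]
    s = [[0.0] * d for _ in range(n_arms)]
    total = 0.0
    for _ in range(num_rounds):
        x = [rng.random() for _ in range(d)]  # full context (hidden)
        # Stage 1: choose which k features to observe (epsilon-greedy).
        if rng.random() < eps:
            subset = rng.sample(range(d), k)
        else:
            rank = sorted(
                range(d),
                key=lambda f: max(s[a][f] / n[a][f] for a in range(n_arms)),
                reverse=True,
            )
            subset = rank[:k]
        # Stage 2: choose an arm using only the observed coordinates.
        if rng.random() < eps:
            arm = rng.randrange(n_arms)
        else:
            est = [sum((s[a][f] / n[a][f]) * x[f] for f in subset)
                   for a in range(n_arms)]
            arm = max(range(n_arms), key=lambda a: est[a])
        # Environment draws a Bernoulli reward from the hidden linear score.
        p = min(1.0, sum(w[arm][f] * x[f] for f in range(d)))
        r = 1.0 if rng.random() < p else 0.0
        total += r
        # Update statistics only for the features that were observed.
        for f in subset:
            n[arm][f] += x[f]
            s[arm][f] += r * x[f]
    return total / num_rounds
```

The key restriction is that the update step touches only the observed subset, so the learner must trade off exploring which features are informative against exploiting them to pick good arms.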
