Rethinking the gold standard with multi-armed bandits: Machine learning allocation algorithms for experiments

Kaibel, Chris ; Biemann, Torsten

Document Type: Article
Year of online publication: 2019
Journal / publication series: Organizational Research Methods (ORM)
Volume: tba
Page range: 109442811985415
Place of publication: Thousand Oaks, CA
Publishing house: Sage
ISSN: 1094-4281, 1552-7425
Publication language: English
Institution: Business School > ABWL, Personalmanagement u. Führung (Biemann)
Subject: 330 Economics
Keywords (English): experiments, randomized controlled trial, multi-armed bandit, exploration versus exploitation, machine learning, ethics in research
Abstract: In experiments, researchers commonly allocate subjects randomly and equally to the different treatment conditions before the experiment starts. While this approach is intuitive, it means that new information gathered during the experiment is not utilized until after the experiment has ended. Based on methodological approaches from other scientific disciplines such as computer science and medicine, we suggest machine learning algorithms for subject allocation in experiments. Specifically, we discuss a Bayesian multi-armed bandit algorithm for randomized controlled trials and use Monte Carlo simulations to compare its efficiency with randomized controlled trials that have a fixed and balanced subject allocation. Our findings indicate that a randomized allocation based on Bayesian multi-armed bandits is more efficient and ethical in most settings. We develop recommendations for researchers and discuss the limitations of our approach.
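The abstract describes allocating subjects with a Bayesian multi-armed bandit instead of a fixed, balanced randomization. The record does not specify the algorithm's details, but a common instance of this idea is Thompson sampling with Bernoulli outcomes and Beta(1, 1) priors; the sketch below (function names and parameters are illustrative, not taken from the paper) shows how allocation shifts toward the better-performing treatment arm as outcomes accumulate.

```python
import random

def thompson_allocate(successes, failures):
    """Sample each arm's Beta posterior and allocate to the arm with the largest draw."""
    samples = [random.betavariate(1 + s, 1 + f)
               for s, f in zip(successes, failures)]
    return max(range(len(samples)), key=lambda i: samples[i])

def run_trial(true_rates, n_subjects, seed=0):
    """Allocate subjects one at a time via Thompson sampling.

    true_rates: hypothetical true success probability per treatment condition.
    Returns per-arm (successes, failures) counts after n_subjects allocations.
    """
    random.seed(seed)
    k = len(true_rates)
    successes, failures = [0] * k, [0] * k
    for _ in range(n_subjects):
        arm = thompson_allocate(successes, failures)
        if random.random() < true_rates[arm]:  # simulate the subject's outcome
            successes[arm] += 1
        else:
            failures[arm] += 1
    return successes, failures

# Two arms with success rates 0.3 and 0.6: allocation drifts toward arm 1,
# so fewer subjects receive the inferior treatment than under a 50/50 split.
succ, fail = run_trial([0.3, 0.6], 500)
```

Unlike a fixed and balanced design, this adaptive allocation uses information gathered during the experiment, which is the efficiency and ethics argument the abstract makes.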

This entry is part of the university bibliography.

This publication has so far appeared online only.




ORCID: Kaibel, Chris (0000-0003-2123-9232); Biemann, Torsten



