pith. machine review for the scientific record.

arxiv: 2602.00241 · v2 · submitted 2026-01-30 · 💻 cs.HC · cs.CY


Does Algorithmic Uncertainty Sway Human Experts? Evidence from a Field Experiment in Selective College Admissions

classification 💻 cs.HC cs.CY
keywords: algorithmic · models · score · admissions · arbitrary · favorable · human · predictions
original abstract

Algorithmic predictions are inherently uncertain: even models with similar aggregate accuracy can produce different predictions for the same individual, raising concerns that high-stakes decisions may become sensitive to arbitrary modeling choices. In this paper, we define \emph{algorithmic sensitivity} as the extent to which arbitrary modeling choices propagate into human decisions: how much a decision outcome shifts when a more favorable versus less favorable algorithmic prediction is presented to the decision-maker for the same individual. We estimate this in a randomized field experiment ($n=19{,}545$) embedded in a selective U.S. college admissions cycle, in which admissions officers reviewed each application alongside an algorithmic score while we randomly varied whether the score came from one of two similarly accurate prediction models. Although the two models performed similarly in aggregate, they frequently assigned different scores to the same applicant, creating exogenous variation in the score shown. Surprisingly, we find little evidence of algorithmic sensitivity: presenting a more favorable score does not meaningfully increase an applicant's probability of admission on average, even when the models disagree substantially. These findings suggest that, in this expert, high-stakes setting, human decision-making is largely invariant to arbitrary variation in algorithmic predictions, underscoring the role of professional discretion and institutional context in mediating the downstream effects of algorithmic uncertainty.
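The quantity the abstract defines — how much an admission outcome shifts when the randomly assigned model happens to show the more favorable of two similarly accurate scores — can be illustrated with a small simulation. This is a hypothetical sketch on fully simulated data, not the authors' code or data; the estimator (a simple difference in admission rates between applicants shown the more versus less favorable score) and all parameters are illustrative assumptions, with officer decisions deliberately driven by applicant quality rather than the displayed score to mirror the paper's null result.

```python
# Hypothetical illustration of "algorithmic sensitivity" on simulated data.
# Not the authors' implementation; all distributions and thresholds are
# assumptions chosen only to make the estimator concrete.
import random

random.seed(0)
n = 19545  # sample size reported in the abstract

records = []
for _ in range(n):
    quality = random.random()                  # latent applicant strength
    score_a = quality + random.gauss(0, 0.1)   # model A's prediction
    score_b = quality + random.gauss(0, 0.1)   # model B's prediction (similar accuracy)
    shown = random.choice(["A", "B"])          # randomized model assignment
    shown_score = score_a if shown == "A" else score_b
    # Was the applicant shown the more favorable of the two scores?
    favorable = shown_score >= max(score_a, score_b)
    # Simulated officer decision: depends on quality, not on the shown score,
    # so the true sensitivity in this simulation is zero.
    admit = quality + random.gauss(0, 0.2) > 0.8
    records.append((favorable, admit))

def admit_rate(rows):
    """Fraction of applicants admitted in a subgroup."""
    return sum(admit for _, admit in rows) / len(rows)

favorable_group = [r for r in records if r[0]]
unfavorable_group = [r for r in records if not r[0]]

# Difference in admission rates across the randomized favorable/unfavorable
# split: one simple estimate of algorithmic sensitivity.
sensitivity = admit_rate(favorable_group) - admit_rate(unfavorable_group)
print(f"estimated algorithmic sensitivity: {sensitivity:+.4f}")
```

Because the simulated decisions ignore the displayed score, the estimated sensitivity lands near zero (up to sampling noise), which is the pattern the experiment reports for actual admissions officers.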

This paper has not been read by Pith yet.

discussion (0)
