pith. machine review for the scientific record.

arxiv: 2603.24995 · v2 · submitted 2026-03-26 · 💻 cs.HC

Recognition: no theorem link

Framing Data Choices: How Pre-Donation Exploration Designs Influence Data Donation Behavior and Decision-Making


Pith reviewed 2026-05-15 01:06 UTC · model grok-4.3

classification 💻 cs.HC
keywords: data donation · choice framing · pre-donation exploration · user behavior · public sector research · privacy concerns · participation rates · decision-making

The pith

Social comparison framing in pre-donation data exploration increases donation participation rates compared to self-focused or collective-only approaches.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

This paper examines how the presentation of data choices during the pre-donation exploration stage shapes whether people actually donate their data for public sector research. It tested three framing interventions in a real-world study with 24 participants, eight per condition. The social comparison design, which highlights how one's data relates to others, produced an 87.5 percent donation rate. The self-focused view reached 62.5 percent, while the collective-only frame fell to 37.5 percent and triggered perspective confusion plus privacy concerns. These results indicate that design choices at the exploration stage can narrow the gap between willingness to donate and actual behavior.

Core claim

Through a real-world data donation study (N=24), the authors found that choice framing impacts donation participation. The social comparison design (87.5%) outperformed the self-focused view (62.5%) while a collective-only frame (37.5%) backfired, causing perspective confusion and privacy concerns. This study demonstrates how strategic data framing addresses data donation as a behavioral challenge, revealing design's critical yet underexplored role in data donation for participatory public sector innovation.

What carries the argument

Pre-donation data exploration interventions framed as self-focused, social comparison, or collective-only views that shape user decision-making and participation.

If this is right

  • Designers of public data donation platforms should prioritize social comparison elements during exploration to raise participation.
  • Collective-only framing risks lowering donation rates and triggering privacy concerns, so it should be tested carefully or avoided.
  • Pre-donation exploration serves as a key leverage point for closing the gap between stated willingness and actual data donation.
  • Behavioral framing techniques can be applied to improve informed consent and participation in user-centric data collection.

Where Pith is reading between the lines

These are editorial extensions of the paper, not claims the author makes directly.

  • These framing patterns could extend to other voluntary data-sharing contexts such as health records or citizen science projects.
  • Future work with larger samples might identify which specific elements of social comparison drive the effect.
  • Interfaces could combine social comparison with safeguards against privacy backlash to maintain high donation rates.
  • The perspective confusion observed suggests testing hybrid frames that balance individual and group views.

Load-bearing premise

Observed differences in donation rates stem directly from the framing interventions rather than from individual participant differences or the specific study context.

What would settle it

A larger randomized trial with statistical controls that shows no reliable difference in donation rates across the three framing conditions would falsify the central claim.

read the original abstract

Data donation, an emerging user-centric data collection method for public sector research, faces a gap between participant willingness and actual donation. This suggests a design absence in practice: while promoted as "donor-centered" with technical and regulational advances, a design perspective on how data choices are presented and intervene on individual behaviors remain underexplored. In this paper, we focus on pre-donation data exploration, a key stage for adequately and meaningful informed participation. Through a real-world data donation study (N=24), we evaluated three data exploration interventions (self-focused, social comparison, collective-only). Findings show choice framing impacts donation participation. The "social comparison" design (87.5%) outperformed the "self-focused view" (62.5%) while a "collective-only" frame (37.5%) backfired, causing "perspective confusion" and privacy concerns. This study demonstrates how strategic data framing addresses data donation as a behavioral challenge, revealing design's critical yet underexplored role in data donation for participatory public sector innovation.

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it; the pith above is the substance, this is the friction.

Referee Report

2 major / 1 minor

Summary. The manuscript reports results from a real-world data donation study (N=24) that tested three pre-donation exploration designs: self-focused view, social comparison, and collective-only. It claims that framing choices in the exploration interface causally affect actual donation rates, with social comparison producing the highest participation (87.5%), self-focused intermediate (62.5%), and collective-only the lowest (37.5%), the latter also linked to reported perspective confusion and privacy concerns.

Significance. If the differences prove robust, the work would provide concrete evidence that interface framing can close the gap between stated willingness and actual data donation, offering actionable design guidance for public-sector participatory data collection. The study is notable for being conducted in a realistic setting rather than a lab vignette.

major comments (2)
  1. The central claim attributes the observed donation rates (87.5%, 62.5%, 37.5%) directly to the three framing interventions. With only eight participants per arm and no reported statistical tests, confidence intervals, randomization checks, or power analysis, these percentages remain consistent with binomial sampling variation around a common underlying rate; the manuscript therefore lacks evidence that rules out chance or pre-existing participant differences as the source of the differences.
  2. The methods description provides no details on how participants were assigned to conditions, whether assignment was randomized, or what (if any) covariates or individual-difference measures were collected and controlled for. This omission is load-bearing because the small N makes the design vulnerable to selection confounds.
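The first major comment's sampling-variation concern can be made concrete with a quick simulation. This is an editorial sketch, not an analysis from the paper: it assumes the reported counts (7/8, 5/8, 3/8, so a pooled rate of 15/24) and asks how often three arms drawn from one common donation rate would show a spread at least as wide as the observed 7-vs-3:

```python
import numpy as np

rng = np.random.default_rng(0)

# Reported donations per arm: B=7/8, A=5/8, C=3/8 → pooled rate 15/24.
n_per_arm, pooled_rate = 8, 15 / 24
observed_spread = 7 - 3  # max minus min donations across the three arms

# Simulate three arms drawn from a single common donation rate.
sims = rng.binomial(n_per_arm, pooled_rate, size=(100_000, 3))
spread = sims.max(axis=1) - sims.min(axis=1)

# Fraction of null-model runs whose spread matches or exceeds the observed one.
p_spread = (spread >= observed_spread).mean()
print(f"P(spread >= {observed_spread} under a common rate) ~ {p_spread:.3f}")
```

Under these assumptions, roughly one in six null-model runs reproduces a spread as large as the one reported, which is the quantitative core of the referee's objection.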
minor comments (1)
  1. The abstract states the percentages but does not mention the per-condition sample size or the absence of inferential statistics; adding these would improve transparency.

Simulated Author's Rebuttal

2 responses · 0 unresolved

We thank the referee for highlighting the need for greater statistical rigor and methodological transparency. We agree these elements are essential for supporting the causal claims and will revise the manuscript accordingly to include formal tests, confidence intervals, and expanded methods details.

read point-by-point responses
  1. Referee: The central claim attributes the observed donation rates (87.5%, 62.5%, 37.5%) directly to the three framing interventions. With only eight participants per arm and no reported statistical tests, confidence intervals, randomization checks, or power analysis, these percentages remain consistent with binomial sampling variation around a common underlying rate; the manuscript therefore lacks evidence that rules out chance or pre-existing participant differences as the source of the differences.

    Authors: We acknowledge that the small per-condition sample (n=8) means the observed differences (7/8, 5/8, 3/8) could plausibly arise from sampling variation, and the original manuscript did not include statistical tests or power analysis. In the revision we will add a Fisher's exact test comparing donation proportions across the three conditions, report exact 95% binomial confidence intervals for each rate, and include a post-hoc power calculation. We will also frame the study explicitly as exploratory and discuss the need for larger-scale replication to confirm robustness. revision: yes

  2. Referee: The methods description provides no details on how participants were assigned to conditions, whether assignment was randomized, or what (if any) covariates or individual-difference measures were collected and controlled for. This omission is load-bearing because the small N makes the design vulnerable to selection confounds.

    Authors: We will expand the Methods section to state that participants were randomly assigned to conditions using a simple randomization procedure executed by the recruitment platform at the time of enrollment. No individual-difference covariates or psychological measures were collected beyond basic demographics (age, gender, education), which we will report and note as a limitation. We will also add a table or text confirming demographic balance across arms to allow readers to evaluate potential confounds. revision: yes
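The statistics promised in the rebuttal can be sketched from the reported counts alone. This is an illustrative reconstruction, not the authors' analysis: `scipy.stats.fisher_exact` handles 2x2 tables, so the sharpest pairwise contrast (social comparison vs collective-only) is shown, alongside exact Clopper-Pearson 95% intervals for each arm:

```python
from scipy.stats import beta, fisher_exact

counts = {"social comparison": 7, "self-focused": 5, "collective-only": 3}
n = 8  # participants per arm

# Pairwise Fisher's exact test on the sharpest contrast: 7/8 vs 3/8.
table = [[7, 1], [3, 5]]  # rows: condition; cols: donated, did not donate
_, p = fisher_exact(table, alternative="two-sided")
print(f"social comparison vs collective-only, Fisher's exact p = {p:.3f}")

# Exact (Clopper-Pearson) 95% confidence interval for k successes out of n.
def clopper_pearson(k, n, alpha=0.05):
    lo = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
    hi = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
    return lo, hi

for name, k in counts.items():
    lo, hi = clopper_pearson(k, n)
    print(f"{name}: {k}/{n} = {k/n:.1%}, 95% CI [{lo:.2f}, {hi:.2f}]")
```

With eight participants per arm, the two-sided p-value for even the widest contrast lands above 0.05 and the exact intervals of the extreme arms overlap, consistent with framing the study as exploratory.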

Circularity Check

0 steps flagged

No circularity: empirical observations from N=24 study

full rationale

The paper reports direct behavioral results from a small real-world data donation experiment (N=24, 8 per arm) comparing three framing interventions. Donation rates (87.5%, 62.5%, 37.5%) and qualitative notes on confusion/privacy are presented as observed outcomes without any equations, fitted parameters, predictive models, or self-citations that reduce claims to inputs by construction. No derivation chain exists that could be circular; the central claim rests on the reported participation differences and participant feedback rather than any self-referential logic or renamed known result.

Axiom & Free-Parameter Ledger

0 free parameters · 1 axiom · 0 invented entities

This is a small-scale empirical behavioral study. It relies on standard assumptions about participant responses in controlled settings but introduces no new mathematical parameters, axioms, or postulated entities.

axioms (1)
  • domain assumption Participant donation decisions in the study reflect genuine responses to the framing rather than demand characteristics or external influences
    The interpretation of the percentage differences depends on this assumption about the validity of the behavioral measure.

pith-pipeline@v0.9.0 · 5484 in / 1296 out tokens · 37591 ms · 2026-05-15T01:06:16.659426+00:00 · methodology

discussion (0)

Sign in with ORCID, Apple, or X to comment. Anyone can read Pith papers without signing in.

Reference graph

Works this paper leans on

15 extracted references · 15 canonical work pages

  1. [1]

    Introduction Data donation, the voluntary transfer of personal data for research or public benefit, has emerged as a promising method for participatory public sector innovation. By granting individuals control over what they share, data donation enables access to granular, private, and retrospective personal data while respecting user autonomy (Ohme et al...

  2. [2]

    adequately

    Background and Related Work 2.1 Data Donation as an Emerging Public Sector Method Public sector practitioners increasingly use design to shape policies and services, moving beyond bureaucratic traditions toward human-centered administration (Bason & Austin, 2022; Mortati et al., 2022). As governments face “wicked” and complex challenges, there is a growin...

  3. [3]

    default design

    Method To answer this research question, we conducted a user study comparing three data exploration interventions in a real data donation practice. Following a Research through Design (RtD) approach (Zimmerman et al., 2007), we designed and developed a data exploration-donation platform as the primary vehicle for knowledge generation; guided by the establ...

  4. [4]

    They were explicitly informed no data would be collected by default

    Consent to Exploration: Participants consented to participate in the data exploration study and allow temporary, local processing of their calendar data. They were explicitly informed no data would be collected by default. Introduction materials (IRB overview, donation context, platform information) were presented through printed documents and researcher’...

  5. [5]

    Pre-Study Survey: Participants completed a survey evaluating their prior understanding of both data and donation projects based on traditional informed consent materials (from Step 1), before any data exploration

  6. [6]

    Data was processed locally in their browser to generate their assigned intervention (A, B, or C)

    Data Extraction: Participants logged into our platform with their institutional Google Calendar credentials. Data was processed locally in their browser to generate their assigned intervention (A, B, or C)

  7. [7]

    Researchers could not view participants' screens to ensure privacy

    Data Exploration: Participants explored their data through four interactive visualization archetypes while providing think-aloud commentary. Researchers could not view participants' screens to ensure privacy

  8. [8]

    Donate My Data

    Donation Decision: Participants scrolled to the bottom of the platform after exploration to make their donation choice:"Donate My Data" or "Do Not Donate."

  9. [9]

    Those who donated verified their donated data package was properly anonymized

    Verification: Participants logged out and verified that the platform retained no data. Those who donated verified their donated data package was properly anonymized

  10. [10]

    Figure 3 Overview of in-person data donation procedure and environmental setting of the user study

    Post-Study Survey & Follow-up Interview: Participants completed a post-study survey (mirroring pre-study questions) and a semi-structured interview discussing their experience. Figure 3 Overview of in-person data donation procedure and environmental setting of the user study. 3.5 Participants We recruited 30 student participants from the Institute of Desi...

  11. [11]

    I do not think any visualization was particularly impactful as much as the exercise as a whole

    Results and Findings 4.1 Behavior: Donation Results Group B (individual-collective exploration) showed the highest donation rate at 87.5% (7/8 participants), followed by Group A (individual-only exploration) at 62.5% (5/8), and Group C (collective-only exploration) at 37.5% (3/8), seen in Figure 4. This 50-percentage-point difference between Group B and C...

  12. [12]

    Adequately

    Discussion The results of our study reveal that the design of the pre-donation data exploration is not a neutral feature but a critical, high-impact intervention. Our findings present a clear puzzle: while the modality of presentation (visualization vs. text) and the specific visualization archetype (heatmap, dot plot, network diagram) had no discernible ...

  13. [13]

    social comparison

    Limitations and Future Work Our findings are exploratory, limited by a small (N=24), single-institution sample, and a low-sensitivity data type (Google Calendar). The "social comparison" frame's success (Group B) may not generalize and requires re-testing with larger, diverse populations and high-stakes data, where these framing approaches could plausibly...

  14. [14]

    social framing

    Conclusion This study makes three primary contributions to the fields of public sector design and data donation: First, we provide empirical evidence that pre-donation data exploration is a critical, high-impact intervention. Behavioral effects are already embedded in commonly used "default" designs (Group A), while alternative designs meeting the same in...

  15. [15]

    positive friction

    References Alhadad, S. (2018). Visualizing data to support judgement, inference, and decision making in learning analytics: Insights from cognitive psychology and visualization science. Journal of Learning Analytics, 5, 60–85. https://doi.org/10.18608/jla.2018.52.5 Allcott, H. (2011). Social norms and energy conservation. Journal of Public Economics, Spec...