pith. machine review for the scientific record. sign in

arxiv: 2605.11825 · v1 · submitted 2026-05-12 · 💻 cs.RO

Recognition: no theorem link

Mapping Embodied Affective Touch Strategies on a Humanoid Robot

Alessandra Sciutti, Francesca Cocchella, Omar Eldardeer, Qiaoqiao Ren, Rea Francesco, Tony Belpaeme

Authors on Pith no claims yet

Pith reviewed 2026-05-13 05:27 UTC · model grok-4.3

classification 💻 cs.RO
keywords touchaffectivebodyparticipantsrobotconstraintsfreeshaped
0
0 comments X

The pith

Affective touch expression on humanoid robots depends on body region and embodiment constraints.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

The study investigates how humans express eight different emotions through touch on a humanoid robot equipped with full-body tactile sensors. Participants performed touches under free, arm-only, and torso-only conditions to reveal how physical access and constraints influence where and how they touch. This approach addresses the limitation of prior work that examined only isolated body parts, providing a more complete picture of affective touch across the robot's body. Findings indicate that preferred touch areas, motion, and pressure patterns vary by region and condition, with strategies not transferring directly between free and restricted access, and many users reporting increased closeness afterward.

Core claim

Body region and spatial constraints jointly shaped both touch location and dynamics when participants expressed emotions on the iCub. In free touch, participants preferred socially accessible upper-body regions, while less frequently touched areas showed stronger emotion-specific selectivity. Emotion-related variation was more evident in motion features for arm-only touch and pressure features for torso-only touch. Touch strategies also did not transfer directly between free and constrained conditions, even within the same coarse body region. Participants reported increased closeness to the robot after interaction, with around 30 percent reporting a change in perceived social relationship.

What carries the argument

The within-subjects experimental design comparing free full-body touch against arm-only and torso-only constrained conditions on a robot with distributed tactile sensors to map how embodiment affects affective touch expression.

Load-bearing premise

The differences in touch location and dynamics between free and constrained conditions are caused by embodiment effects and not by the order of conditions, fatigue, or participants guessing the study's purpose.

What would settle it

If a follow-up experiment using separate groups for each touch condition (between-subjects) finds no significant differences in touch strategies or emotion selectivity by region, this would challenge the embodiment constraint explanation.

Figures

Figures reproduced from arXiv: 2605.11825 by Alessandra Sciutti, Francesca Cocchella, Omar Eldardeer, Qiaoqiao Ren, Rea Francesco, Tony Belpaeme.

Figure 1
Figure 1. Figure 1: The iCub humanoid robot used in the study, equipped [PITH_FULL_IMAGE:figures/full_fig_p004_1.png] view at source ↗
Figure 2
Figure 2. Figure 2: Experiment setup showing the front and side views of the participant, camera placements, and the overall configuration [PITH_FULL_IMAGE:figures/full_fig_p005_2.png] view at source ↗
Figure 3
Figure 3. Figure 3: The participant expressing “love” to the iCub robot. [PITH_FULL_IMAGE:figures/full_fig_p005_3.png] view at source ↗
Figure 4
Figure 4. Figure 4: Body acceptance of affective touch TABLE I: Chi-square test results for emotion effects across body regions. Body Region χ2(7) p-value Right hand 85.68 < .001 Left hand 67.14 < .001 Torso 53.38 < .001 Left upper arm 42.86 < .001 Face & head 42.86 < .001 Right upper arm 24.22 = .001 Left exposed joint 16.54 = .021 Back 62.90 < .001 Left upper leg 40.78 < .001 Right upper leg 34.93 < .001 Right lower leg 25.… view at source ↗
Figure 5
Figure 5. Figure 5: Body part distributions and normalized emotion body part associations in the free touch condition. [PITH_FULL_IMAGE:figures/full_fig_p008_5.png] view at source ↗
Figure 6
Figure 6. Figure 6: Condition × emotion interaction profiles for the six pressure features, all of which remained significant after correction. The lines show condition-specific mean feature values across emotions for arm-only and torso-only trials [PITH_FULL_IMAGE:figures/full_fig_p010_6.png] view at source ↗
Figure 7
Figure 7. Figure 7: Feature distributions across the three experimental conditions. These plots illustrate how several pressure and motion [PITH_FULL_IMAGE:figures/full_fig_p010_7.png] view at source ↗
Figure 8
Figure 8. Figure 8: Raincloud-style plots for before and after interaction self report rating changes. Each panel combines a half violin [PITH_FULL_IMAGE:figures/full_fig_p012_8.png] view at source ↗
Figure 9
Figure 9. Figure 9: Post-interaction comfort scores split by social-role attribution. Each panel shows a half violin, a narrow boxplot, and [PITH_FULL_IMAGE:figures/full_fig_p012_9.png] view at source ↗
read the original abstract

Affective touch in human-robot interaction is shaped not only by emotional intent, but also by robot embodiment, including touch location, physical constraints, and perceived agency or social role. Existing HRI studies typically focus on one or two isolated body parts, limiting understanding of how affective touch generalises across the full humanoid body. We present a study with 32 participants interacting with the iCub robot, which is equipped with full-body distributed tactile sensors. Participants expressed eight emotions under three conditions: free touch, arm-only touch, and torso-only touch. Results show that body region and spatial constraints jointly shaped both touch location and dynamics. In free touch, participants preferred socially accessible upper-body regions, while less frequently touched areas showed stronger emotion-specific selectivity. Emotion-related variation was more evident in motion features for arm-only touch and pressure features for torso-only touch. Touch strategies also did not transfer directly between free and constrained conditions, even within the same coarse body region. Participants reported increased closeness to the robot after interaction, with around 30 percent reporting a change in perceived social relationship. Together, these findings show that affective touch expression is strongly body-region dependent and shaped by embodiment constraints.

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it; the pith above is the substance, this is the friction.

Referee Report

3 major / 1 minor

Summary. The manuscript reports an empirical study with 32 participants who expressed eight emotions via touch on the iCub humanoid robot (equipped with full-body tactile sensors) under three within-subjects conditions: free touch, arm-only touch, and torso-only touch. Key findings include body-region preferences (upper-body in free touch), stronger emotion-specific selectivity in less-accessed areas, differential reliance on motion features (arm-only) versus pressure features (torso-only), non-transfer of strategies across conditions, and increased perceived closeness to the robot post-interaction (with ~30% reporting changed social relationship). The central claim is that affective touch expression is strongly body-region dependent and shaped by embodiment constraints.

Significance. If the results hold after addressing design and analysis issues, the work offers a valuable full-body empirical mapping of affective touch strategies in HRI, extending prior studies limited to isolated body parts. The distributed tactile sensing setup is a clear strength, enabling fine-grained analysis of location and dynamics. This could inform robot design for more naturalistic emotional touch interactions, emphasizing embodiment's role in social touch.

major comments (3)
  1. [Methods] Methods section (experimental procedure): The within-subjects design has each of the 32 participants perform all three conditions (free, arm-only, torso-only) while expressing the same eight emotions, yet the description provides no indication that condition order was randomized or counterbalanced, nor any modeling of order, session position, or carry-over effects (e.g., via mixed-effects terms). This directly threatens the central claim, as differences in touch location, dynamics, and feature reliance could reflect sequence, fatigue, or demand characteristics rather than embodiment constraints.
  2. [Results] Results section: Directional claims about body-region dependence, emotion selectivity, and differential motion/pressure feature use are presented without statistical tests, effect sizes, participant demographics, exact feature definitions, or correction for multiple comparisons. Post-hoc region selectivity assertions appear to rest on unshown data partitions, undermining support for the abstract's conclusion that touch is 'strongly body-region dependent.'
  3. [Discussion] Discussion section: The interpretation that non-transfer of strategies and feature differences primarily reflect embodiment effects (rather than the within-subjects confounds noted above) lacks supporting analyses or explicit discussion of alternative explanations, making the load-bearing claim about embodiment constraints vulnerable.
minor comments (1)
  1. [Abstract] Abstract: The phrasing 'around 30 percent' for participants reporting changed social relationship would be more precise with the exact value and any associated statistical support.

Simulated Author's Rebuttal

3 responses · 0 unresolved

We thank the referee for their constructive and detailed comments. We address each major comment below and have revised the manuscript to incorporate additional methodological details, statistical analyses, and expanded discussion as appropriate.

read point-by-point responses
  1. Referee: [Methods] Methods section (experimental procedure): The within-subjects design has each of the 32 participants perform all three conditions (free, arm-only, torso-only) while expressing the same eight emotions, yet the description provides no indication that condition order was randomized or counterbalanced, nor any modeling of order, session position, or carry-over effects (e.g., via mixed-effects terms). This directly threatens the central claim, as differences in touch location, dynamics, and feature reliance could reflect sequence, fatigue, or demand characteristics rather than embodiment constraints.

    Authors: We appreciate the referee highlighting this omission. The condition order was counterbalanced across participants using a balanced Latin square design, but this detail was not stated in the original manuscript. We have revised the Methods section to describe the counterbalancing procedure explicitly. We have also conducted additional mixed-effects modeling with order and session position as factors (participant as random effect) and found no significant effects on touch location preferences or feature reliance (all p > 0.05). These results will be reported in the revised Results section and support that the differences arise from embodiment constraints. revision: yes

  2. Referee: [Results] Results section: Directional claims about body-region dependence, emotion selectivity, and differential motion/pressure feature use are presented without statistical tests, effect sizes, participant demographics, exact feature definitions, or correction for multiple comparisons. Post-hoc region selectivity assertions appear to rest on unshown data partitions, undermining support for the abstract's conclusion that touch is 'strongly body-region dependent.'

    Authors: We agree that the Results section requires more rigorous statistical reporting to substantiate the claims. Participant demographics are already provided in the Methods section; we will add explicit cross-references. Feature definitions (motion velocity, pressure variance, etc.) are in the supplementary materials and will now be summarized in the main text. We have performed repeated-measures ANOVAs for region preferences and feature differences, with post-hoc Tukey tests and Bonferroni correction, and report effect sizes (partial eta-squared). The region selectivity analyses use the full dataset partitioned by condition and emotion; we will include the statistical tables and supporting figures in the revision. revision: yes

  3. Referee: [Discussion] Discussion section: The interpretation that non-transfer of strategies and feature differences primarily reflect embodiment effects (rather than the within-subjects confounds noted above) lacks supporting analyses or explicit discussion of alternative explanations, making the load-bearing claim about embodiment constraints vulnerable.

    Authors: We have expanded the Discussion to address alternative explanations directly. We now discuss potential confounds including order effects, fatigue, and demand characteristics, while referencing the new mixed-effects analyses showing no significant order effects. We also highlight that non-transfer of strategies was observed even in within-region comparisons (e.g., arm in free touch vs. arm-only condition), which helps isolate embodiment effects. The revised text presents a more balanced view while maintaining that the within-subjects design and region-specific patterns support the role of embodiment constraints. revision: yes

Circularity Check

0 steps flagged

No circularity: purely empirical behavioral study

full rationale

The paper reports results from a within-subjects human-robot interaction experiment with 32 participants expressing emotions via touch on an iCub robot under free, arm-only, and torso-only conditions. All claims rest on direct observation of touch locations, motion/pressure features, emotion selectivity, and post-interaction questionnaires. No equations, models, fitted parameters, predictions, or derivation chains appear in the abstract or described content. Central findings (body-region dependence, non-transfer of strategies) are presented as empirical patterns without reduction to self-defined inputs or self-citation load-bearing steps. The study is self-contained against external benchmarks of behavioral data collection and analysis.

Axiom & Free-Parameter Ledger

0 free parameters · 1 axioms · 0 invented entities

Empirical user study; no mathematical derivations or new physical entities. Relies on standard domain assumptions about human ability to convey affect via touch.

axioms (1)
  • domain assumption Human participants can express distinct emotions through intentional touch on a robot's body
    Core premise of the eight-emotion task design; invoked implicitly in the study protocol.

pith-pipeline@v0.9.0 · 5518 in / 1199 out tokens · 111309 ms · 2026-05-13T05:27:33.488561+00:00 · methodology

discussion (0)

Sign in with ORCID, Apple, or X to comment. Anyone can read and Pith papers without signing in.

Reference graph

Works this paper leans on

63 extracted references · 63 canonical work pages

  1. [1]

    The science of interpersonal touch: an overview,

    A. Gallace and C. Spence, “The science of interpersonal touch: an overview,”Neuroscience & Biobehavioral Reviews, vol. 34, no. 2, pp. 246–259, 2010

  2. [2]

    Touch technology in affective human–, robot–, and virtual–human interactions: A survey,

    T. Olugbade, L. He, P. Maiolino, D. Heylen, and N. Bianchi-Berthouze, “Touch technology in affective human–, robot–, and virtual–human interactions: A survey,”Proceedings of the IEEE, vol. 111, no. 10, pp. 1333–1354, 2023

  3. [3]

    The why, who and how of social touch,

    J. T. Suvilehto, A. Cekaite, and I. Morrison, “The why, who and how of social touch,”Nature Reviews Psychology, vol. 2, no. 10, pp. 606–621, 2023

  4. [4]

    Affective touch and regulation of stress responses,

    T. Kidd, S. L. Devine, and S. C. Walker, “Affective touch and regulation of stress responses,”Health psychology review, vol. 17, no. 1, pp. 60–77, 2023

  5. [5]

    Touch and the body,

    A. Serino and P. Haggard, “Touch and the body,”Neuroscience & Biobehavioral Reviews, vol. 34, no. 2, pp. 224–236, 2010

  6. [6]

    Embodied simulation and touch: The sense of touch in social cognition,

    V . Gallese and S. Ebisch, “Embodied simulation and touch: The sense of touch in social cognition,”Phenomenology and Mind, no. 4, pp. 196– 210, 2013

  7. [7]

    A survey of tactile human–robot inter- actions,

    B. D. Argall and A. G. Billard, “A survey of tactile human–robot inter- actions,”Robotics and autonomous systems, vol. 58, no. 10, pp. 1159– 1176, 2010

  8. [8]

    Social robots for education: A review,

    T. Belpaeme, J. Kennedy, A. Ramachandran, B. Scassellati, and F. Tanaka, “Social robots for education: A review,”Science robotics, vol. 3, no. 21, p. eaat5954, 2018

  9. [9]

    Integrating socially assistive robotics into mental healthcare interventions: Applications and recommendations for expanded use,

    S. M. Rabbitt, A. E. Kazdin, and B. Scassellati, “Integrating socially assistive robotics into mental healthcare interventions: Applications and recommendations for expanded use,”Clinical psychology review, vol. 35, pp. 35–46, 2015

  10. [10]

    Artificial emotional intelligence in socially assistive robots for older adults: a pilot study,

    H. Abdollahi, M. H. Mahoor, R. Zandie, J. Siewierski, and S. H. Qualls, “Artificial emotional intelligence in socially assistive robots for older adults: a pilot study,”IEEE Transactions on Affective Computing, vol. 14, no. 3, pp. 2020–2032, 2022

  11. [11]

    Children conform, adults resist: A robot group induced peer pressure on normative social conformity,

    A.-L. V ollmer, R. Read, D. Trippas, and T. Belpaeme, “Children conform, adults resist: A robot group induced peer pressure on normative social conformity,”Science robotics, vol. 3, no. 21, p. eaat7111, 2018

  12. [12]

    Human-robot interaction in rehabilitation and assistance: a review,

    A. Mohebbi, “Human-robot interaction in rehabilitation and assistance: a review,”Current Robotics Reports, vol. 1, no. 3, pp. 131–144, 2020

  13. [13]

    Soft, wearable robotics and haptics: Technologies, trends, and emerging applications,

    M. Zhu, S. Biswas, S. I. Dinulescu, N. Kastor, E. W. Hawkes, and Y . Visell, “Soft, wearable robotics and haptics: Technologies, trends, and emerging applications,”Proceedings of the IEEE, vol. 110, no. 2, pp. 246–272, 2022

  14. [14]

    Wearable soft technologies for haptic sensing and feedback,

    J. Yin, R. Hinchet, H. Shea, and C. Majidi, “Wearable soft technologies for haptic sensing and feedback,”Advanced Functional Materials, vol. 31, no. 39, p. 2007428, 2021

  15. [15]

    Conveying emotions to robots through touch and sound,

    Q. Ren, R. Proesmans, F. Bossuyt, J. Vanfleteren, F. Wyffels, and T. Belpaeme, “Conveying emotions to robots through touch and sound,” inInternational Conference on Social Robotics, pp. 329–339, Springer, 2024

  16. [16]

    Touch speaks, sound feels: A multimodal approach to affective and social touch from robots to humans,

    Q. Ren and T. Belpaeme, “Touch speaks, sound feels: A multimodal approach to affective and social touch from robots to humans,”arXiv preprint arXiv:2508.07839, 2025

  17. [17]

    What can a robot’s skin be? designing texture- changing skin for human–robot social interaction,

    Y . Hu and G. Hoffman, “What can a robot’s skin be? designing texture- changing skin for human–robot social interaction,”ACM Transactions on Human-Robot Interaction, vol. 12, no. 2, pp. 1–19, 2023

  18. [18]

    A 2d vibration array as an assistive device for visually impaired,

    D. Dakopoulos, S. K. Boddhu, and N. Bourbakis, “A 2d vibration array as an assistive device for visually impaired,” in2007 IEEE 7th International Symposium on BioInformatics and BioEngineering, pp. 930–937, IEEE, 2007

  19. [19]

    Force-sensitive interface engineering in flexible pressure sensors: A review,

    G. Tai, D. Wei, M. Su, P. Li, L. Xie, and J. Yang, “Force-sensitive interface engineering in flexible pressure sensors: A review,”Sensors, vol. 22, no. 7, p. 2652, 2022

  20. [20]

    Conveying emo- tions through device-initiated touch,

    M. Teyssier, G. Bailly, C. Pelachaud, and E. Lecolinet, “Conveying emo- tions through device-initiated touch,”IEEE Transactions on Affective Computing, vol. 13, no. 3, pp. 1477–1488, 2020

  21. [21]

    What kinds of robot’s touch will match expressed emotions?,

    X. Zheng, M. Shiomi, T. Minato, and H. Ishiguro, “What kinds of robot’s touch will match expressed emotions?,”IEEE Robotics and Automation Letters, vol. 5, no. 1, pp. 127–134, 2019

  22. [22]

    Tactile interaction with social robots influ- ences attitudes and behaviour,

    Q. Ren and T. Belpaeme, “Tactile interaction with social robots influ- ences attitudes and behaviour,”International Journal of Social Robotics, vol. 16, no. 11, pp. 2297–2317, 2024

  23. [23]

    Design and evaluation of a touch-centered calming interaction with a social robot,

    Y . S. Sefidgar, K. E. MacLean, S. Yohanan, H. M. Van der Loos, E. A. Croft, and E. J. Garland, “Design and evaluation of a touch-centered calming interaction with a social robot,”IEEE Transactions on Affective Computing, vol. 7, no. 2, pp. 108–121, 2015

  24. [24]

    Touching a mechanical body: tactile contact with body parts of a humanoid robot is physiologically arousing,

    J. J. Li, W. Ju, and B. Reeves, “Touching a mechanical body: tactile contact with body parts of a humanoid robot is physiologically arousing,” Journal of Human-Robot Interaction, vol. 6, no. 3, pp. 118–130, 2017

  25. [25]

    How people with dementia perceive a therapeutic robot called paro in relation to their pain and mood: A qualitative study,

    L. Pu, W. Moyle, and C. Jones, “How people with dementia perceive a therapeutic robot called paro in relation to their pain and mood: A qualitative study,”Journal of clinical nursing, vol. 29, no. 3-4, pp. 437– 446, 2020

  26. [26]

    Robots as intentional agents: using neuroscientific methods to make robots appear more social,

    E. Wiese, G. Metta, and A. Wykowska, “Robots as intentional agents: using neuroscientific methods to make robots appear more social,” Frontiers in psychology, vol. 8, p. 1663, 2017

  27. [27]

    What makes people empathize with an emotional robot?: The impact of agency and physical embodiment on human empathy for a robot,

    S. S. Kwak, Y . Kim, E. Kim, C. Shin, and K. Cho, “What makes people empathize with an emotional robot?: The impact of agency and physical embodiment on human empathy for a robot,” in2013 IEEE Ro-man, pp. 180–185, IEEE, 2013

  28. [28]

    How to touch humans: Guidelines for social agents and robots that can touch,

    J. B. Van Erp and A. Toet, “How to touch humans: Guidelines for social agents and robots that can touch,” in2013 humaine association conference on affective computing and intelligent interaction, pp. 780– 785, IEEE, 2013

  29. [29]

    Responses to robot social roles and social role framing,

    V . Groom, V . Srinivasan, C. L. Bethel, R. Murphy, L. Dole, and C. Nass, “Responses to robot social roles and social role framing,” in2011 International Conference on Collaboration Technologies and Systems (CTS), pp. 194–203, IEEE, 2011

  30. [30]

    Social robots on a global stage: establishing a role for culture during human–robot interaction,

    V . Lim, M. Rooksby, and E. S. Cross, “Social robots on a global stage: establishing a role for culture during human–robot interaction,” International Journal of Social Robotics, vol. 13, no. 6, pp. 1307–1333, 2021

  31. [31]

    Motions with emotions?: A phenomenological ap- proach to understanding the simulated aliveness of a robot body,

    J. Parviainen, L. Van Aerschot, T. Särkikoski, S. Pekkarinen, H. Melkas, and L. Hennala, “Motions with emotions?: A phenomenological ap- proach to understanding the simulated aliveness of a robot body,” Techné: Research in Philosophy and Technology, no. 3, 2019

  32. [32]

    Social robots: Things or agents?,

    M. Ala ˇc, “Social robots: Things or agents?,”AI & society, vol. 31, no. 4, pp. 519–535, 2016

  33. [33]

    Touch challenge’15: Recognizing social touch gestures,

    M. M. Jung, X. L. Cang, M. Poel, and K. E. MacLean, “Touch challenge’15: Recognizing social touch gestures,” inProceedings of the 2015 ACM on International Conference on Multimodal Interaction, pp. 387–390, 2015

  34. [34]

    Embodiment theory,

    A. Borghi, F. Caruana,et al., “Embodiment theory,” inInternational encyclopedia of the social & behavioral sciences, vol. 7, pp. 420–426, Elsevier, 2015

  35. [35]

    Touch communicates distinct emotions.,

    M. J. Hertenstein, D. Keltner, B. App, B. A. Bulleit, and A. R. Jaskolka, “Touch communicates distinct emotions.,”Emotion, vol. 6, no. 3, p. 528, 2006

  36. [36]

    More than one kind: Different sensory signatures and functions divide affectionate touch.,

    A. Schirmer, M. H. Chiu, and I. Croy, “More than one kind: Different sensory signatures and functions divide affectionate touch.,”Emotion, vol. 21, no. 6, p. 1268, 2021

  37. [37]

    Affective touch in human–robot interaction: conveying emotion to the nao robot,

    R. Andreasson, B. Alenljung, E. Billing, and R. Lowe, “Affective touch in human–robot interaction: conveying emotion to the nao robot,” International Journal of Social Robotics, vol. 10, no. 4, pp. 473–491, 2018. 15

  38. [38]

    Interpersonal distance, body orientation, and touch: Effects of culture, gender, and age,

    M. S. Remland, T. S. Jones, and H. Brinkman, “Interpersonal distance, body orientation, and touch: Effects of culture, gender, and age,”The Journal of social psychology, vol. 135, no. 3, pp. 281–297, 1995

  39. [39]

    Affective interpersonal touch in close relationships: A cross-cultural perspective,

    A. Sorokowska, S. Saluja, P. Sorokowski, T. Fr ˛ ackowiak, M. Karwowski, T. Aavik, G. Akello, C. Alm, N. Amjad, A. Anjum,et al., “Affective interpersonal touch in close relationships: A cross-cultural perspective,” Personality and Social Psychology Bulletin, vol. 47, no. 12, pp. 1705– 1721, 2021

  40. [40]

    Affective and behav- ioral responses to robot-initiated social touch: toward understanding the opportunities and limitations of physical contact in human–robot interaction,

    C. J. Willemse, A. Toet, and J. B. Van Erp, “Affective and behav- ioral responses to robot-initiated social touch: toward understanding the opportunities and limitations of physical contact in human–robot interaction,”Frontiers in ICT, vol. 4, p. 12, 2017

  41. [41]

    Affective interaction and affective computing-past, present and future,

    N. Ahmadpour, D. Lottridge, J. Fritsch, C. Sas, M. E. Cecchinato, D. Harrison, K. Höök, P. S. Foong, K. Ijaz, P. Gough,et al., “Affective interaction and affective computing-past, present and future,” inPro- ceedings of the Extended Abstracts of the CHI Conference on Human Factors in Computing Systems, pp. 1–6, 2025

  42. [42]

    Evaluating the effects of active social touch and robot expressiveness on user attitudes and behaviour in human–robot interaction,

    J. J. Gamboa-Montero, S. Carrasco-Martinez, E. Fernandez-Rodicio, F. Alonso-Martin, and J. C. Castillo, “Evaluating the effects of active social touch and robot expressiveness on user attitudes and behaviour in human–robot interaction,”Scientific Reports, vol. 15, no. 1, p. 18483, 2025

  43. [43]

    Emotion recognition using affective touch: A survey,

    E. Y . Zhang, Z. Pan, and A. D. Cheok, “Emotion recognition using affective touch: A survey,”IEEE Transactions on Affective Computing, 2025

  44. [44]

    Skin-inspired textile-based tactile sensors enable multifunc- tional sensing of wearables and soft robots,

    Y . Pang, X. Xu, S. Chen, Y . Fang, X. Shi, Y . Deng, Z.-L. Wang, and C. Cao, “Skin-inspired textile-based tactile sensors enable multifunc- tional sensing of wearables and soft robots,”Nano Energy, vol. 96, p. 107137, 2022

  45. [45]

    Touching the sound: audible features enable haptics for robot control,

    H. Shi, M. Russo, J. de la Torre, A. Mohammad, X. Dong, and D. Axinte, “Touching the sound: audible features enable haptics for robot control,” IEEE Robotics & Automation Magazine, vol. 30, no. 3, pp. 56–68, 2022

  46. [46]

    Topography of social touching depends on emotional bonds between humans,

    J. T. Suvilehto, E. Glerean, R. I. Dunbar, R. Hari, and L. Nummenmaa, “Topography of social touching depends on emotional bonds between humans,”Proceedings of the National Academy of Sciences, vol. 112, no. 45, pp. 13811–13816, 2015

  47. [47]

    Touching a mechanical body: The role of anthropomorphic framing in physiological arousal when touching a robot,

    K. Maj, P. Grzybowicz, W. L. Drela, and M. Olszanowski, “Touching a mechanical body: The role of anthropomorphic framing in physiological arousal when touching a robot,”Sensors, vol. 23, no. 13, p. 5954, 2023

  48. [48]

    Feel-good robotics: requirements on touch for embodiment in assistive robotics,

    P. Beckerle, R. Kõiva, E. A. Kirchner, R. Bekrater-Bodmann, S. Dosen, O. Christ, D. A. Abbink, C. Castellini, and B. Lenggenhager, “Feel-good robotics: requirements on touch for embodiment in assistive robotics,” Frontiers in neurorobotics, vol. 12, p. 84, 2018

  49. [49]

    Automatic kinematic chain calibration using artificial skin: self-touch in the icub humanoid robot,

    A. Roncone, M. Hoffmann, U. Pattacini, and G. Metta, “Automatic kinematic chain calibration using artificial skin: self-touch in the icub humanoid robot,” in2014 IEEE International Conference on Robotics and Automation (ICRA), pp. 2305–2312, IEEE, 2014

  50. [50]

    Peripersonal space and margin of safety around the body: learning visuo-tactile associations in a humanoid robot with artificial skin,

    A. Roncone, M. Hoffmann, U. Pattacini, L. Fadiga, and G. Metta, “Peripersonal space and margin of safety around the body: learning visuo-tactile associations in a humanoid robot with artificial skin,”PloS one, vol. 11, no. 10, p. e0163713, 2016

  51. [51]

    The role of social norms in human–robot interaction: A systematic review,

    S. Lawrence, M. Jouaiti, J. Hoey, C. L. Nehaniv, and K. Dautenhahn, “The role of social norms in human–robot interaction: A systematic review,”ACM Transactions on Human-Robot Interaction, vol. 14, no. 3, pp. 1–44, 2025

  52. [52]

    Human response to humanoid robot that responds to social touch,

    M. Okuda, Y . Takahashi, and S. Tsuichihara, “Human response to humanoid robot that responds to social touch,”Applied Sciences, vol. 12, no. 18, p. 9193, 2022

  53. [53]

    Touch and tell: Multimodal decoding of human emotions and social gestures for robots,

    Q. Ren, R. Proesmans, Y . Hou, T. Belpaeme,et al., “Touch and tell: Multimodal decoding of human emotions and social gestures for robots,” arXiv preprint arXiv:2412.03300, 2024

  54. [54]

    What are emotions? and how can they be measured?,

    K. R. Scherer, “What are emotions? and how can they be measured?,” Social science information, vol. 44, no. 4, pp. 695–729, 2005

  55. [55]

    Uncovering human-to-human physical interactions that underlie emo- tional and affective touch communication,

    S. C. Hauser, S. McIntyre, A. Israr, H. Olausson, and G. J. Gerling, “Uncovering human-to-human physical interactions that underlie emo- tional and affective touch communication,” in2019 IEEE world haptics conference (WHC), pp. 407–412, IEEE, 2019

  56. [56]

    Dimensions of mind perception,

    H. M. Gray, K. Gray, and D. M. Wegner, “Dimensions of mind perception,”science, vol. 315, no. 5812, pp. 619–619, 2007

  57. [57]

    Universal dimensions of social cognition: Warmth and competence,

    S. T. Fiske, A. J. Cuddy, and P. Glick, “Universal dimensions of social cognition: Warmth and competence,”Trends in cognitive sciences, vol. 11, no. 2, pp. 77–83, 2007

  58. [58]

    Social categorization of social robots: Anthropomorphism as a function of robot group membership,

    F. Eyssel and D. Kuchenbrandt, “Social categorization of social robots: Anthropomorphism as a function of robot group membership,”British Journal of Social Psychology, vol. 51, no. 4, pp. 724–731, 2012

  59. [59]

    Inclusion of other in the self scale and the structure of interpersonal closeness.,

    A. Aron, E. N. Aron, and D. Smollan, “Inclusion of other in the self scale and the structure of interpersonal closeness.,”Journal of personality and social psychology, vol. 63, no. 4, p. 596, 1992

  60. [60]

    D. F. Alwin,Margins of error: A study of reliability in survey measure- ment. John Wiley & Sons, 2007

  61. [61]

    Coefficient alpha and the internal structure of tests,

    L. J. Cronbach, “Coefficient alpha and the internal structure of tests,” psychometrika, vol. 16, no. 3, pp. 297–334, 1951

  62. [62]

    The sense of agency in perception, behaviour and human–machine interactions,

    W. Wen and H. Imamizu, “The sense of agency in perception, behaviour and human–machine interactions,”Nature Reviews Psychology, vol. 1, no. 4, pp. 211–222, 2022

  63. [63]

    Social touch gesture recognition using convolutional neural network,

    S. Albawi, O. Bayat, S. Al-Azawi, and O. N. Ucan, “Social touch gesture recognition using convolutional neural network,”Computational Intelligence and Neuroscience, vol. 2018, no. 1, p. 6973103, 2018