pith. machine review for the scientific record.

arxiv: 2604.28156 · v1 · submitted 2026-04-30 · 💻 cs.RO · cs.AI · cs.LG

Recognition: unknown

FlexiTac: A Low-Cost, Open-Source, Scalable Tactile Sensing Solution for Robotic Systems

Authors on Pith: no claims yet

Pith reviewed 2026-05-07 06:51 UTC · model grok-4.3

classification 💻 cs.RO · cs.AI · cs.LG
keywords tactile sensing · robotic grippers · flexible sensors · piezoresistive sensing · open-source hardware · visuo-tactile fusion · sensor fabrication · robot learning

The pith

A sealed three-layer laminate delivers affordable, repeatable tactile sensing to robotic grippers.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

The paper presents FlexiTac, a low-cost open-source tactile sensing solution for robotic end-effectors that uses thin flexible pads and a compact readout board. It establishes that the FPC-Velostat-FPC three-layer laminate with integrated electrodes substantially improves fabrication throughput and repeatability while keeping the sensors mechanically compliant for use on rigid and soft grippers. The system streams synchronized tactile measurements at 100 Hz and demonstrates utility in modern learning setups such as visuo-tactile fusion for decision making and cross-embodiment skill transfer. A sympathetic reader would care because high costs and fabrication difficulties have limited dense tactile sensing in robotics, and a practical scalable alternative could change that. If the approach works as described, it would allow easier integration of touch sensing into robotic systems for improved control and data collection.

Core claim

FlexiTac consists of flexible tactile sensor pads using a sealed three-layer laminate stack of FPC-Velostat-FPC with electrode patterns integrated into the flexible printed circuits, paired with a compact multi-channel readout board that streams synchronized measurements at 100 Hz via serial communication. This design improves fabrication and repeatability, maintains compliance for deployment on diverse grippers, and supports tactile learning pipelines including 3D visuo-tactile fusion, cross-embodiment skill transfer, and real-to-sim-to-real fine-tuning with GPU-parallel simulation.
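The claimed 100 Hz synchronized serial stream invites a concrete picture of the host side. The sketch below parses tactile frames from a raw byte buffer under an assumed framing (a 2-byte sync header, a millisecond timestamp, then a flattened uint16 taxel grid); the abstract does not specify FlexiTac's actual wire format, so every constant here is illustrative.

```python
import struct

SYNC = b"\xAA\x55"   # hypothetical 2-byte frame header, not from the paper
ROWS, COLS = 16, 16  # example geometry matching the 16x16 pad form factor

def parse_frames(buf: bytes):
    """Yield (timestamp_ms, 2D taxel grid) for each complete frame in buf.

    Assumed frame layout (NOT the published protocol):
      2-byte sync | uint32 little-endian timestamp (ms) | ROWS*COLS uint16 readings.
    """
    frame_len = 2 + 4 + ROWS * COLS * 2
    i = 0
    while i + frame_len <= len(buf):
        if buf[i:i + 2] != SYNC:
            i += 1  # resynchronize byte-by-byte after a partial read
            continue
        ts, = struct.unpack_from("<I", buf, i + 2)
        raw = struct.unpack_from("<%dH" % (ROWS * COLS), buf, i + 6)
        grid = [list(raw[r * COLS:(r + 1) * COLS]) for r in range(ROWS)]
        yield ts, grid
        i += frame_len
```

With pyserial, `buf` would accumulate from `serial.Serial(port, baud).read(...)`; the resync loop lets the parser recover if it joins the stream mid-frame.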

What carries the argument

The sealed three-layer laminate stack (FPC-Velostat-FPC) with electrode patterns directly integrated into flexible printed circuits, which enables improved fabrication throughput, repeatability, and mechanical compliance for tactile sensing on robotic systems.

Load-bearing premise

That the three-layer laminate and low-cost readout electronics will deliver dense, synchronized, and repeatable tactile signals across varied mounting configurations and real-world use without requiring extensive per-unit calibration or suffering from mechanical or electrical drift.

What would settle it

Finding that fabricated sensor units show large variations in sensitivity or that signals drift significantly after repeated mounting and flexing on a gripper, or that synchronization cannot be maintained at 100 Hz with multiple connected pads.
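The falsification criteria above reduce to two numbers one could compute from bench data: unit-to-unit spread of sensitivity, and fractional drift under repeated flexing. A minimal sketch; the metric definitions are conventional and the thresholds one would apply are not drawn from the paper.

```python
import statistics

def sensitivity_cv(unit_gains):
    """Coefficient of variation of per-unit sensitivity (e.g. counts/N).

    A large CV across fabricated pads would undercut the repeatability
    claim; what counts as "large" is a judgment call, not a paper value.
    """
    mean = statistics.fmean(unit_gains)
    return statistics.stdev(unit_gains) / mean

def drift_fraction(first_response, later_response):
    """Fractional change in response to the same load after repeated
    mounting and flexing cycles."""
    return abs(later_response - first_response) / first_response
```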

Figures

Figures reproduced from arXiv: 2604.28156 by Binghao Huang, Yunzhu Li.

Figure 1: FlexiTac deployments across diverse platforms. We showcase FlexiTac integrated on multiple robot end-effectors and a wearable collector, spanning tabletop manipulation, bimanual coordination, mobile manipulation, and in-the-wild data acquisition. The breadth of deployments highlights the sensor's conformability, modularity, and plug-and-play integration.
Figure 2: FlexiTac System Configurations. (a) Example assembly of a 32×12 tactile pad integrated on a soft fin-shaped gripper, connected via a flexible flat cable (FFC) to a multi-channel readout board (with an Arduino Nano). (b) Close-up of the 32×12 pad geometry. (c) A larger 32×32 tactile mat configuration with its corresponding readout board. (d-e) Additional compact pad form factors (8×16 and 16×16) illustratin…
Figure 3: FlexiTac construction and electrode layout. (a) Exploded view of the sealed three-layer stack-up: top FPC electrodes, piezoresistive film (Velostat), and bottom FPC electrodes, encapsulated by laminating sheets, with stiffeners or golden fingers for reliable electrical interfacing. (b-c) Top and bottom FPC electrode designs forming an orthogonal sensing matrix; each electrode intersection defines one taxel…
Figure 4: Multi-channel readout electronics for scalable tactile acquisition. (a) 32×32 readout board assembly. (b) 32×16 readout board assembly. (c) Layout and key components of the 32×16 board, including shift registers and a multiplexer for efficient addressing of high-dimensional tactile matrices with minimal wiring. The board streams synchronized tactile measurements to a host computer (100 Hz) via serial commu…
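The shift-register-plus-multiplexer addressing in Figure 4 amounts to a standard row-column scan: drive one row at a time and sample every column, so an R×C pad needs roughly R + C wires instead of R×C. A toy model of one scan, plus the per-taxel time budget implied by 100 Hz streaming; `read_adc` is a stand-in for the real ADC call, not an interface from the paper.

```python
def scan_matrix(read_adc, rows, cols):
    """One full scan of a row-column piezoresistive matrix.

    Models the Figure 4 addressing scheme: for each driven row, every
    column is sampled once. `read_adc(r, c)` abstracts the hardware read.
    """
    return [[read_adc(r, c) for c in range(cols)] for r in range(rows)]

def per_taxel_budget_us(rows, cols, rate_hz=100):
    """Time budget per taxel (microseconds) to sustain the frame rate:
    a 32x32 pad at 100 Hz leaves under 10 us per taxel."""
    return 1e6 / (rate_hz * rows * cols)
```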
Figure 5: 3D visuo-tactile fusion for contact-aware policies (3D-ViTac pipeline). (a) Real-world environment with multimodal observations. (b) Processing pipeline: multi-view RGB-D is reconstructed into a 3D visual point cloud, while tactile signals are transformed into 3D space using robot proprioception, attaching tactile signals as a feature. (c) Unified 3D visuo-tactile point representation by merging visual and…
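The fusion step described in Figure 5 is geometrically simple: proprioception gives a gripper pose, taxel positions are mapped through it into the world frame, and tactile readings ride along as point features. A numpy sketch under assumed shapes and a scalar-per-taxel feature encoding; the paper's actual feature design may differ.

```python
import numpy as np

def fuse_visuo_tactile(visual_pts, taxel_pts_local, taxel_vals, T_world_gripper):
    """Merge visual and tactile points into one feature-augmented cloud.

    visual_pts:       (Nv, 3) world-frame visual point cloud
    taxel_pts_local:  (Nt, 3) taxel positions in the gripper frame
    taxel_vals:       (Nt,)   tactile readings attached as a feature
    T_world_gripper:  (4, 4)  proprioceptive gripper pose

    Visual points get feature 0; taxels carry their reading. The single
    scalar feature channel is an illustrative choice.
    """
    n = taxel_pts_local.shape[0]
    homog = np.hstack([taxel_pts_local, np.ones((n, 1))])
    taxel_world = (T_world_gripper @ homog.T).T[:, :3]
    vis = np.hstack([visual_pts, np.zeros((visual_pts.shape[0], 1))])
    tac = np.hstack([taxel_world, np.asarray(taxel_vals).reshape(-1, 1)])
    return np.vstack([vis, tac])  # (Nv + Nt, 4) rows of x, y, z, feature
```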
Figure 6: Cross-embodiment tactile sensing with a portable device and robot. (a) In-the-wild data collection using a portable visuo-tactile gripper. (b) Portable device details showing a fisheye camera, FlexiTac pads, and a compact readout board for synchronized logging. (c) Robot deployment on an xArm 850 equipped with FlexiTac. The shared sensing module supports consistent tactile signals across embodiments, enabl…
Figure 7: Real-to-sim-to-real learning pipeline with tactile simulation. Overview of a pipeline that combines real-world visuo-tactile demonstrations with GPU-parallel tactile simulation for scalable RL fine-tuning. Tactile signals are simulated at the taxel level to match the real sensor layout, enabling pre-training (e.g., diffusion policy), simulation-based fine-tuning on diverse assembly tasks (e.g., nut-and-bol…
Original abstract

We present FlexiTac, a low-cost, open-source, and scalable piezoresistive tactile sensing solution designed for robotic end-effectors. FlexiTac is a practical "plug-in" module consisting of (i) thin, flexible tactile sensor pads that provide dense tactile signals and (ii) a compact multi-channel readout board that streams synchronized measurements for real-time control and large-scale data collection. FlexiTac pads adopt a sealed three-layer laminate stack (FPC-Velostat-FPC) with electrode patterns directly integrated into flexible printed circuits, substantially improving fabrication throughput and repeatability while maintaining mechanical compliance for deployment on both rigid and soft grippers. The readout electronics use widely available, low-cost components and stream tactile signals to a host computer at 100 Hz via serial communication. Across multiple configurations, including fingertip pads and larger tactile mats, FlexiTac can be mounted on diverse platforms without major mechanical redesign. We further show that FlexiTac supports modern tactile learning pipelines, including 3D visuo-tactile fusion for contact-aware decision making, cross-embodiment skill transfer, and real-to-sim-to-real fine-tuning with GPU-parallel tactile simulation. Our project page is available at https://flexitac.github.io/.

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it; the pith above is the substance, this is the friction.

Referee Report

2 major / 2 minor

Summary. The paper presents FlexiTac, a low-cost, open-source tactile sensing system for robotic end-effectors consisting of thin flexible sensor pads based on a sealed three-layer FPC-Velostat-FPC laminate with integrated electrode patterns and a compact multi-channel readout board that streams synchronized tactile signals at 100 Hz via serial communication. The design is claimed to improve fabrication throughput and repeatability while preserving mechanical compliance for mounting on both rigid and soft grippers. The work further asserts compatibility with modern tactile learning pipelines, including 3D visuo-tactile fusion for contact-aware decisions, cross-embodiment skill transfer, and real-to-sim-to-real fine-tuning.

Significance. If the performance and integration claims are substantiated, FlexiTac could lower barriers to tactile sensing in robotics by providing an accessible, scalable hardware platform with open-source components and simulation support. The emphasis on mechanical compliance across gripper types and real-time streaming at 100 Hz addresses practical needs in manipulation research. The open-source release and explicit support for visuo-tactile and sim-to-real pipelines are clear strengths that could accelerate community adoption.

major comments (2)
  1. [§4 (Sensor Design)] The central claim that the sealed three-layer FPC-Velostat-FPC laminate "substantially improv[es] fabrication throughput and repeatability" while mitigating known Velostat issues (hysteresis, temperature sensitivity, fatigue) is not supported by any quantitative characterization data, such as multi-cycle resistance-pressure curves, drift measurements over time, or temperature-variation tests. This directly bears on the "plug-and-play" and scalability assertions.
  2. [§6 (Experiments and Integration)] The demonstrations of support for tactile learning pipelines (visuo-tactile fusion, cross-embodiment transfer, real-to-sim-to-real) are described at a high level, with no reported metrics, number of trials, success rates, error bars, or comparisons to baseline sensors. Without such data, it is not possible to evaluate whether the 100 Hz synchronized streaming delivers the dense, repeatable signals required for the claimed applications.
minor comments (2)
  1. The manuscript references a project page but does not include an explicit statement on the availability of CAD files, PCB layouts, bill of materials, or firmware code within the text itself.
  2. Figure captions for the laminate stack and readout board could more explicitly label the three-layer construction and electrode patterns to aid readers in understanding the fabrication improvements.
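For the characterization requested in major comment 1, one conventional summary of multi-cycle resistance-pressure curves is hysteresis error: the worst loading/unloading gap expressed as a percentage of full-scale output. A sketch of that metric; the definition is a common sensor-characterization convention, not taken from the paper.

```python
def hysteresis_pct(loading, unloading):
    """Max loading/unloading gap as a percentage of full-scale output.

    loading[i] and unloading[i] are sensor readings at the same applied
    pressure on the up-sweep and down-sweep of one cycle. Lower is better;
    Velostat-based sensors are known to show non-trivial hysteresis.
    """
    full_scale = max(max(loading), max(unloading)) - min(min(loading), min(unloading))
    worst_gap = max(abs(a - b) for a, b in zip(loading, unloading))
    return 100.0 * worst_gap / full_scale
```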

Simulated Author's Rebuttal

2 responses · 0 unresolved

We thank the referee for their thorough review and constructive comments. We address each major comment below and indicate the revisions we will make to strengthen the manuscript.

Point-by-point responses
  1. Referee: [§4 (Sensor Design)] The central claim that the sealed three-layer FPC-Velostat-FPC laminate "substantially improv[es] fabrication throughput and repeatability" while mitigating known Velostat issues (hysteresis, temperature sensitivity, fatigue) is not supported by any quantitative characterization data, such as multi-cycle resistance-pressure curves, drift measurements over time, or temperature-variation tests. This directly bears on the "plug-and-play" and scalability assertions.

    Authors: We acknowledge that the manuscript lacks quantitative characterization of the sensor's performance metrics such as hysteresis, drift, and temperature sensitivity. While the design choices are intended to address these issues through the sealed laminate structure, we agree that empirical data is necessary to substantiate the claims of improved fabrication throughput, repeatability, and mitigation of Velostat drawbacks. In the revised manuscript, we will include results from multi-cycle resistance-pressure tests, long-term drift measurements, and temperature variation experiments to provide the required quantitative support for the scalability and plug-and-play assertions. revision: yes

  2. Referee: §6 (Experiments and Integration): The demonstrations of support for tactile learning pipelines (visuo-tactile fusion, cross-embodiment transfer, real-to-sim-to-real) are described at a high level with no reported metrics, number of trials, success rates, error bars, or comparisons to baseline sensors. Without such data, it is not possible to evaluate whether the 100 Hz synchronized streaming delivers the dense, repeatable signals required for the claimed applications.

    Authors: We recognize that the current description of the integration experiments is high-level and lacks specific quantitative metrics. The manuscript focuses on demonstrating the feasibility of using FlexiTac in various learning pipelines, but to allow proper evaluation of the sensor's performance in these contexts, we will expand this section in the revision. This will include details on the number of trials, success rates, error bars, and where possible, comparisons to other sensors or baselines, along with confirmation that the 100 Hz streaming provides sufficient data density and repeatability for the applications. revision: yes

Circularity Check

0 steps flagged

No circularity: hardware design paper with no derivations or fitted predictions

Full rationale

The paper describes a physical tactile sensor (FPC-Velostat-FPC laminate, readout board, 100 Hz streaming) and its fabrication and usage for robotic applications. No equations, parameter fittings, uniqueness theorems, or self-referential derivations appear in the abstract or described content. Performance claims rest on engineering choices and empirical use cases (visuo-tactile fusion, sim-to-real), not on any step that reduces to its own inputs by construction. This matches the automated circularity score of 0.0; the skeptic concerns address empirical robustness rather than logical circularity.

Axiom & Free-Parameter Ledger

0 free parameters · 0 axioms · 0 invented entities

No mathematical model, derivations, or new physical entities are introduced; the work is an engineering description of a sensor stack and electronics. No free parameters, axioms, or invented entities appear in the abstract.

pith-pipeline@v0.9.0 · 5532 in / 1256 out tokens · 75844 ms · 2026-05-07T06:51:14.027809+00:00 · methodology

discussion (0)

