Beyond Silicon: Materials, Mechanisms, and Methods for Physical Neural Computing
Pith reviewed 2026-05-10 16:08 UTC · model grok-4.3
The pith
Physical neural computing substrates occupy distinct, complementary performance regimes; no single substrate dominates across all tasks.
A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.
Core claim
By analyzing architectural paradigms and constraints across substrates, the paper establishes that no single physical neural system excels universally; instead, they fill complementary niches in speed, efficiency, and integration, as evidenced by evaluations on standardized static and dynamic tasks.
What carries the argument
A first-order benchmarking scheme using standardized static and dynamic tasks evaluated along physically interpretable dimensions such as scalability, precision, and interfacing overhead.
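The scheme can be made concrete with a small sketch. Everything below is illustrative: the substrate names echo the paper's categories, but the chosen dimensions and all numeric scores are invented for the example, not taken from the survey.

```python
# Illustrative first-order benchmarking sketch: score substrates on
# physically interpretable dimensions, then min-max normalize each dimension
# across substrates so very different platforms become comparable.
# All substrate names and scores here are hypothetical examples.

DIMENSIONS = ["speed", "energy_efficiency", "precision", "scalability"]

raw_scores = {  # higher is better on every dimension
    "memristive": {"speed": 6, "energy_efficiency": 8, "precision": 4, "scalability": 7},
    "photonic":   {"speed": 9, "energy_efficiency": 6, "precision": 5, "scalability": 5},
    "mechanical": {"speed": 3, "energy_efficiency": 7, "precision": 6, "scalability": 3},
    "chemical":   {"speed": 1, "energy_efficiency": 9, "precision": 3, "scalability": 2},
}

def normalize(scores):
    """Min-max normalize each dimension across substrates to [0, 1]."""
    out = {name: {} for name in scores}
    for dim in DIMENSIONS:
        vals = [s[dim] for s in scores.values()]
        lo, hi = min(vals), max(vals)
        for name, s in scores.items():
            out[name][dim] = (s[dim] - lo) / (hi - lo)
    return out

profiles = normalize(raw_scores)
# Each substrate now carries a [0, 1] profile; no row is 1.0 everywhere,
# which is the 'complementary regimes' picture in miniature.
```

Normalization is what makes cross-substrate comparison meaningful here: raw units (photons per second versus reaction turnover) are incommensurable, while normalized profiles can at least be compared dimension by dimension.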
If this is right
- Physical neural systems enable co-location of sensing, memory, and computation to minimize data movement in edge AI applications.
- Applications in ultrafast signal processing can leverage photonic or mechanical substrates.
- In-memory inference benefits from memristive devices.
- Embodied control suits mechanical metamaterials, while biochemical decision making suits chemical reaction systems.
- Engineering efforts should focus on addressing scalability, precision, and programmability for each substrate type.
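The niche pairings above can be read as a simple selection table. A minimal sketch follows; the keys and values are labels introduced for this sketch, paraphrasing the bullets, not identifiers from the paper.

```python
# Hypothetical selection table pairing application niches with substrate
# classes, following the bullets above. Labels are illustrative only.

NICHES = {
    "ultrafast_signal_processing": ["photonic", "mechanical"],
    "in_memory_inference": ["memristive"],
    "embodied_control": ["mechanical_metamaterial"],
    "biochemical_decision_making": ["chemical_reaction_system"],
}

def candidate_substrates(application: str) -> list[str]:
    """Return substrate classes suited to an application niche ([] if unknown)."""
    return NICHES.get(application, [])
```

A real selector would weigh quantitative requirements (latency, energy, precision) rather than do a categorical lookup; the table only encodes the niches listed above.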
Where Pith is reading between the lines
- Hybrid systems combining multiple substrates might achieve broader performance coverage than any single one.
- The benchmarking approach could be extended to include more dynamic real-world tasks to refine the comparison.
- Developers of edge AI devices may select substrates based on specific application requirements identified in the regimes.
Load-bearing premise
The first-order benchmarking scheme using standardized tasks and performance dimensions provides a fair comparison that does not miss critical platform-specific factors or selection biases.
What would settle it
An experimental result showing that one substrate (for example, memristive devices) outperforms all others on every standardized task and every considered performance dimension would refute the claim of complementary operating regimes.
Original abstract
Physical implementations of neural computation now extend far beyond silicon hardware, encompassing substrates such as memristive devices, photonic circuits, mechanical metamaterials, microfluidic networks, chemical reaction systems, and living neural tissue. By exploiting intrinsic physical processes such as charge transport, wave interference, elastic deformation, mass transport, and biochemical regulation, these substrates can realize neural inference and adaptation directly in matter. As silicon GPU-centered AI faces growing energy and data-movement constraints, physical neural computation is becoming increasingly relevant as a complementary path beyond conventional digital accelerators. This trend is driven in particular by pervasive intelligence, i.e., the deployment of on-device and edge AI across large numbers of resource-constrained systems. In such settings, co-locating computation with sensing and memory can reduce data shuttling and improve efficiency. Meanwhile, physical neural approaches have emerged across disparate disciplines, yet progress remains fragmented, with limited shared terminology and few principled ways to compare platforms. This survey unifies the field by mapping neural primitives to substrate-specific mechanisms, analyzing architectural and training paradigms, and identifying key engineering constraints including scalability, precision, programmability, and I/O interfacing overhead. To enable cross-domain comparison, we introduce a first-order benchmarking scheme based on standardized static and dynamic tasks and physically interpretable performance dimensions. We show that no single substrate dominates across the considered dimensions; instead, physical neural systems occupy complementary operating regimes, enabling applications ranging from ultrafast signal processing and in-memory inference to embodied control and in-sample biochemical decision making.
Editorial analysis
A structured set of objections, weighed in public.
Referee Report
Summary. This survey unifies physical neural computing across substrates including memristive devices, photonic circuits, mechanical metamaterials, microfluidic networks, chemical reaction systems, and living neural tissue. It maps neural primitives to intrinsic physical mechanisms (charge transport, wave interference, elastic deformation, mass transport, biochemical regulation), reviews architectural and training paradigms, catalogs engineering constraints (scalability, precision, programmability, I/O overhead), and introduces a first-order benchmarking scheme using standardized static and dynamic tasks plus physically interpretable performance dimensions. The central claim is that no single substrate dominates; physical neural systems instead occupy complementary operating regimes suited to applications ranging from ultrafast signal processing and in-memory inference to embodied control and in-sample biochemical decision making.
Significance. If the benchmarking scheme is shown to be robust and representative, the work provides a valuable unifying framework for a currently fragmented field. It offers practical guidance for selecting substrates in pervasive edge-AI settings where co-located sensing, memory, and computation can mitigate data-movement and energy costs, while highlighting application-specific strengths rather than a universal winner.
Major comments (1)
- [Benchmarking scheme] Benchmarking scheme (as described in the abstract and used to reach the headline claim): the first-order scheme of standardized static/dynamic tasks and physically interpretable dimensions is load-bearing for the conclusion that substrates occupy complementary regimes. The abstract provides no quantitative evidence that the chosen tasks are representative across all listed substrates, nor any sensitivity analysis showing that omitted factors (long-term drift, fabrication yield, I/O energy, precision stability) do not alter the 'no single substrate dominates' verdict. Without such validation, task-selection or dimension-weighting bias could artificially support the complementary-regimes result.
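The dimension-weighting concern can be probed mechanically: sweep the weights and see whether the top-ranked substrate changes. A toy sketch with hypothetical two-substrate, two-dimension scores (not values from the paper):

```python
# Toy sensitivity check for dimension weighting: if the top-ranked substrate
# flips as the weights vary, a 'single winner' verdict would be an artifact
# of the chosen weighting rather than a robust result. Scores are hypothetical.

profiles = {
    "memristive": {"speed": 0.6, "energy": 0.9},
    "photonic":   {"speed": 1.0, "energy": 0.5},
}

def ranked_first(profiles, weights):
    """Return the substrate with the highest weighted-sum score."""
    score = lambda p: sum(weights[d] * p[d] for d in weights)
    return max(profiles, key=lambda name: score(profiles[name]))

# Sweep speed-vs-energy trade-offs and collect the per-weighting winner.
winners = {ranked_first(profiles, {"speed": w, "energy": 1 - w})
           for w in (0.1, 0.5, 0.9)}
weight_sensitive = len(winners) > 1  # True: the verdict depends on weighting
```

In this toy case the winner flips between weightings, which is exactly the kind of sensitivity analysis the referee asks the survey to report for its chosen dimensions.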
Minor comments (1)
- [Abstract] The abstract is information-dense; splitting the description of the benchmarking scheme and its results into separate sentences would improve readability.
Simulated Author's Rebuttal
We thank the referee for the positive assessment of our survey and for the detailed, constructive feedback on the benchmarking scheme. We address the major comment point by point below and will make targeted revisions to clarify the scope and limitations of our first-order framework.
Point-by-point responses
Referee: [Benchmarking scheme] Benchmarking scheme (as described in the abstract and used to reach the headline claim): the first-order scheme of standardized static/dynamic tasks and physically interpretable dimensions is load-bearing for the conclusion that substrates occupy complementary regimes. The abstract provides no quantitative evidence that the chosen tasks are representative across all listed substrates, nor any sensitivity analysis showing that omitted factors (long-term drift, fabrication yield, I/O energy, precision stability) do not alter the 'no single substrate dominates' verdict. Without such validation, task-selection or dimension-weighting bias could artificially support the complementary-regimes result.
Authors: We agree that the benchmarking scheme is central to our headline claim and that the abstract does not supply quantitative evidence or sensitivity analysis. As this is a survey unifying a fragmented literature rather than a primary empirical study, the scheme is explicitly positioned as first-order and conceptual: tasks (static classification and dynamic sequence processing) and dimensions (scalability, precision, programmability, I/O overhead) were selected because they recur across published work on memristive, photonic, mechanical, microfluidic, chemical, and biological substrates. No new cross-substrate experiments were performed. We will revise the abstract, introduction, and benchmarking section to state these qualifications more prominently and to add a dedicated limitations subsection. That subsection will discuss how omitted factors such as long-term drift, fabrication yield, I/O energy, and precision stability could affect the complementary-regimes conclusion under alternative weightings, citing relevant substrate-specific studies. These changes will make the evidential basis and potential biases explicit without altering the survey's scope.
Revision: partial
Circularity Check
No circularity: survey applies externally sourced literature via introduced scheme
Full rationale
This is a survey paper reviewing physical neural substrates drawn from external literature across disciplines. It introduces a first-order benchmarking scheme of standardized static/dynamic tasks and interpretable dimensions, then applies the scheme to the reviewed body of work to reach the conclusion that no single substrate dominates and systems occupy complementary regimes. The scheme is not defined in terms of the conclusion, nor does any derivation reduce by construction to fitted parameters, self-citations, or renamed inputs; the central mapping and comparison rest on external references rather than internal self-reference. No load-bearing self-citation chains, ansatz smuggling, or uniqueness theorems from the authors' prior work appear in the derivation. The paper is therefore self-contained against external benchmarks with independent content.
Forward citations
Cited by 2 Pith papers
- phys-MCP: A Control Plane for Heterogeneous Physical Neural Networks
  phys-MCP is a substrate-aware orchestration layer that exposes heterogeneous physical neural networks as invocable resources with standardized capability, lifecycle, telemetry, and digital-twin interfaces.
- Embedded DNA Inference in In-Body Nanonetworks: Detection, Delay, and Communication Trade-Offs
  Simulations identify a bounded regime where embedded DNA inference reporting improves detection of weak-to-moderate anomalies while remaining competitive in communication cost, though it adds delay and does not outper...