pith. machine review for the scientific record.

physics.hist-ph

History and Philosophy of Physics

History and philosophy of all branches of physics, astrophysics, and cosmology, including appreciations of physicists.

physics.hist-ph 2026-05-13

1923 Newtonian estimate gave 49 hours to the Moon

On the Anticipation of Lunar Travel in the Early 20th Century: A Pedagogical Exercise

Basic gravity arguments produced a travel time within the same order as Apollo's 72-hour missions and outlined the main trajectory phases.

This article examines, from historical and pedagogical perspectives, Alphonse Berget's anticipation of Earth-Moon travel in Le Ciel (Larousse, 1923), decades before the beginning of the space age. The discussion is triggered by Le Ciel, a richly illustrated French popular science work, which devotes a chapter to lunar and interplanetary travel within a Newtonian framework. Although Berget's treatment was not developed in isolation and reflects a broader early-20th-century context that included pioneers such as the French aero-engineer Robert Esnault-Pelterie, the book provides a striking pedagogical synthesis of elementary celestial mechanics and scientific popularization. Unlike earlier fictional treatments such as Jules Verne's De la Terre à la Lune, Berget approached space travel using physical reasoning grounded in Newtonian gravitation. Using qualitative and semi-quantitative arguments based on the inverse-square law, he identified the principal phases of an Earth-Moon trajectory: escape from Earth, inertial translunar motion, transition through competing Earth-Moon gravitational fields, and final lunar capture and deceleration. His estimated Earth-Moon travel time of approximately 49 hours is of the same order of magnitude as Apollo mission transit times (approx. 72 h). We compare these early ideas with modern elementary concepts of astrodynamics, including restricted three-body trajectories, Lagrange-point dynamics, and distant retrograde orbits associated with the Artemis program. We also examine Berget's discussion of interplanetary travel, lunar landscapes, and human factors associated with prolonged voyages, including confinement, food supply, and travel duration. The analysis highlights the pedagogical value of historically grounded scientific reasoning underpinning spaceflight mechanics.
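The ~49-hour figure is easy to reproduce with a back-of-the-envelope Newtonian calculation. The sketch below is our reconstruction, not necessarily Berget's own method: a probe launched radially at exactly Earth's escape speed coasts with v(r) = sqrt(2GM/r), so the transit time to the Moon's mean distance has a closed form, ignoring the Moon's gravity and orbital motion.

```python
import math

# Standard modern values (assumptions, not taken from Berget's text).
GM = 3.986004418e14        # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6          # Earth's mean radius, m
R_MOON = 3.844e8           # mean Earth-Moon distance, m

def coast_time_hours(r0: float, r1: float) -> float:
    """Time to coast radially from r0 to r1 while moving at the local
    escape speed v(r) = sqrt(2*GM/r) (zero total energy).
    Integrating dt = dr / v(r) gives the closed form
    t = (2/3) * (r1**1.5 - r0**1.5) / sqrt(2*GM)."""
    return (2.0 / 3.0) * (r1**1.5 - r0**1.5) / math.sqrt(2.0 * GM) / 3600.0

print(f"{coast_time_hours(R_EARTH, R_MOON):.1f} h")  # ≈ 49 h
```

The result, about 49 h, matches Berget's estimate; Apollo's ~72 h transits were longer because they followed slower, capture-friendly trajectories rather than an escape-speed coast.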
physics.hist-ph 2026-05-13 2 theorems

Causality split sharpens choice of math and stats methods

Causality and Scientific Inquiry: Lessons from Space Physics and Medical Sciences

Space physics and medicine examples show why distinguishing mechanistic from difference-making views matters for reliable results.

Over the past two decades, the rapid surge in data-intensive computational techniques for statistical modeling may have had the effect of diminishing the use of applied mathematics in causal scientific inquiry. In this paper, co-authored by an astrophysicist, a mathematician, and philosophers, we assess the hazards of neglecting the branch of mathematics that constructs models to address causal questions in favor of statistical modeling alone. Causality is relevant in all branches of science and is often elucidated through applied mathematics. Here, we illuminate the idea with examples drawn from space physics and medical sciences. We examine causal questions to demonstrate how applied mathematical and statistical methods may differentiate between two fundamental facets of causality, i.e., mechanistic and difference-making. Understanding such foundational differences in causality may, in some cases, help explain discrepant or erroneous research results. Most importantly, understanding the relationship between causality and analytical approaches used in science has the potential to strengthen the rigor and reliability of scientific inquiry through optimal selection of mathematical and/or statistical methods.
physics.hist-ph 2026-05-11 2 theorems

Physics rewards track transformations of viable theory space

Polydoxon Transformations and Scientific Reward in Physics

Major prizes and honors recognize expansions, contractions, reconfigurations or enabling moves in the set of empirically viable theories.

We develop a descriptive account of scientific reward in physics based on the concept of the time-dependent Polydoxon, defined as the structured set of empirically viable theories at a given time. We argue that highly rewarded contributions, such as those recognized by major prizes and professional honors, can be systematically understood as those that transform this space. These transformations take the form of expansion (adding viable theories), contraction (eliminating viable theories), reconfiguration (illuminating deeper structures and relations within and between theories), and enabling moves (methodological or technological advances that enable future transformations). The analysis is further refined by emphasizing that reward correlates with the transformation's magnitude, assessed along dimensions of scope, centrality, depth, and future leverage. This framework reframes the analysis of rewarded achievement away from isolated theoretical successes and toward the dynamics of a landscape of viable theories, providing a more unified descriptive interpretation of rewarded scientific activity in physics across its diverse set of theoretical and experimental discoveries.
physics.hist-ph 2026-05-06 1 theorem

Habicht 1914 notes tie relativity to Lorentz's moving-body problems

Conrad Habicht's 1914 Manuscript on Special Relativity and Einstein's 1907 Reframing of the 1905 Theory

The manuscript spends space on ether, Fizeau, Michelson-Morley and local time before reaching Einstein's postulates.

This note examines an apparently unpublished manuscript on special relativity written by Conrad Habicht in 1914 and made available online by the ETH-Bibliothek Zürich in December 2024. To the best of my knowledge, no study of its content has yet been published. Habicht was one of Einstein's closest companions during the Bern years. Between February 1902 and mid-1904 he shared with Einstein many occasions for discussion and companionship in Bern. After leaving the city, he remained in close contact with Einstein through visits, reciprocal stays, and a substantial correspondence extending from the years immediately following 1905 to the eve of the First World War. The manuscript offers a clear and pedagogical presentation of special relativity. Its historical interest lies in the structure of the exposition and in the memory of the theory that the text preserves. Habicht does not present special relativity as an isolated creation beginning from Einstein's 1905 paper alone. He devotes considerable space to the pre-Einsteinian problem situation: the classical principle of relativity, the ether, Fizeau's experiment, Michelson--Morley, Lorentz's theory, the contraction hypothesis, local time, and the privileged system of the stationary ether. Lorentz is treated as the central figure who brought the electrodynamics of moving bodies to its most acute form before Einstein's intervention. This note provides a qualitative description of the manuscript, with particular attention to its structure, its treatment of the relation between classical mechanics and electrodynamics, and the respective roles assigned to Lorentz, Michelson--Morley, Einstein, and Minkowski.
physics.hist-ph 2026-05-01

Lorentz contraction is the only shape that preserves cavity resonances

Lorentz-FitzGerald Contraction as the Unique Closure Condition for Moving Spherical-Harmonic Cavities

Phase must stay direction-independent inside a moving wave cavity, forcing the aspect ratio 1/γ and period stretch γ T₀.

We prove that the Lorentz--FitzGerald contraction is the unique deformation of a resonant cavity moving through a mechanical wave medium that preserves spherical-harmonic phase closure. For a cavity moving at speed $v = \beta c$ through a medium supporting nondispersive wave propagation at speed $c$, the round-trip phase of an internal ray at angle $\theta$ to the motion depends on the boundary radius $r(\theta)$ according to $\Phi(\theta) = 2k\,r(\theta)\sqrt{1-\beta^2\sin^2\theta}/(1-\beta^2)$. Requiring $\Phi(\theta)$ to be independent of $\theta$ -- the necessary condition for retaining a spherical-harmonic eigenstructure -- uniquely fixes the Lorentzian aspect ratio \[ \frac{a_\parallel}{a_\perp} = \frac{1}{\gamma} = \sqrt{1-\beta^2}. \] Substituting this unique boundary into the round-trip time yields the resonant period dilation $T = \gamma T_0$, without additional assumptions. Both results -- contraction and dilation -- follow from a single mechanical constraint: preservation of eigenstructure under motion. This is the missing uniqueness theorem of the constructive relativity program initiated by FitzGerald, Lorentz, and Heaviside: the proof that Lorentzian kinematics are not merely consistent with, but uniquely required by, phase closure in a mechanical wave medium.
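The forward direction of the claim can be checked numerically. The sketch below is a minimal verification under one assumption not stated in the abstract: that the boundary is the standard ellipse in polar form, with semi-axis ratio a∥/a⊥ = 1/γ. It confirms that the quoted phase Φ(θ) is then independent of θ; it does not, of course, check the uniqueness half of the theorem.

```python
import math

def phi(theta, beta, k=1.0, a_perp=1.0):
    """Round-trip phase from the abstract,
    Phi(theta) = 2*k*r(theta)*sqrt(1 - beta^2 sin^2 theta)/(1 - beta^2),
    where r(theta) is the boundary radius of an ellipse whose semi-axis
    along the motion is a_perp/gamma (the Lorentz aspect ratio)."""
    gamma = 1.0 / math.sqrt(1.0 - beta**2)
    a_par = a_perp / gamma
    # Polar radius of the ellipse x^2/a_par^2 + y^2/a_perp^2 = 1.
    r = (a_par * a_perp) / math.sqrt(
        (a_perp * math.cos(theta))**2 + (a_par * math.sin(theta))**2)
    return 2.0 * k * r * math.sqrt(1.0 - (beta * math.sin(theta))**2) / (1.0 - beta**2)

beta = 0.6
phases = [phi(t * math.pi / 24, beta) for t in range(25)]
print(max(phases) - min(phases))  # ~0: phase is direction-independent
```

Algebraically, the ellipse radius satisfies γ²cos²θ + sin²θ = γ²(1 − β²sin²θ), so the angular factors cancel and Φ reduces to the constant 2k a⊥/(γ(1 − β²)).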
physics.hist-ph 2026-04-30

Dingle's 1972 book claimed relativity is inconsistent

On Dingle's rebuttal of the special theory of relativity

The argument turned on symmetric clock slowing and still appears in critiques today, but rests on a misstep in comparing times across frames.

In his 1972 book Science at the Crossroads, Herbert Dingle attacked the consistency of special relativity through a fallacious argument championed by the crank community even to this day. The Dingle affair is a curious chapter in the history of physics and, more generally, of science. We briefly review Dingle's case from a historical and didactic perspective.
physics.hist-ph 2026-04-28

Fermi's Varenna lectures trace path to quantum computers

The Legacy of Enrico Fermi to Varenna

His advocacy for building computers instead of buying them prefigures today's quantum science and technologies.

The Varenna school is a hub where generations of physicists, including numerous Nobel laureates, have shaped the field, often through collaborative exchanges across political and cultural boundaries. We examine the scientific legacy of Enrico Fermi and its influence on modern atomic, molecular, and optical physics. Beginning with Fermi's 1954 lectures at the Varenna school, key developments are traced from high-energy physics to laser spectroscopy, precision metrology, and the control of ultracold atoms. Milestones such as Doppler-free spectroscopy, optical frequency combs, Bose-Einstein condensation, and degenerate Fermi gases are highlighted as turning points leading to quantum simulation and quantum computation. Fermi's early advocacy for building a computer, rather than buying it, can be viewed as a precursor to today's efforts in quantum science and technologies. This historical trajectory and legacy continues to inform current research in quantum matter and information science.
physics.hist-ph 2026-04-28

Neopositivism yields Clausius inequality from information definition

The principles of neopositivism and the laws of thermodynamics

A definition of information as non-redundant truth from observation produces the second law without empirical input or extra postulates.

The second law of thermodynamics, which deals with irreversibility and makes the theory so special, is usually considered empirical. The definition of equilibrium as an attractor, on the other hand, requires a postulate. This article shows that both are actually already contained, even if hidden, in the fundamental principles of neopositivism, which are widely accepted in all fields of science. In particular, from the definition of information as a truth that can only come from an observation but cannot be redundant, we obtain Clausius' inequality.
physics.hist-ph 2026-04-24

Reproducible experiments compute definite physical functions

Experiments, Computability, and the Existence of Physical Functions

Fixed protocols turn lab procedures into algorithmic maps from inputs to outputs, with finite precision handled by approximations.

Experimental science usually relies on laboratory procedures that, after finitely many steps, terminate with numerical reports on physical quantities. This paper argues that such procedures can be understood as algorithmic once the protocol, background conditions, and reporting rules are fixed. Assuming an explicit physical Church--Turing bridge principle, a reproducible experiment therefore computes a map from admissible inputs to outputs, and the corresponding function exists in the sense appropriate to those outputs. Furthermore, computable analysis allows us to explain why this conclusion is compatible with finite-precision measurement since in this case what matters is a systematic approximation to a requested accuracy, not the production of exact real numbers in a single step. Neither protocol dependence nor stochasticity undermines the existence claim. Rather, they specify which map is realized by a given protocol and what additional assumptions are required for stronger claims about a single protocol-independent quantity. The paper therefore separates three questions that are often conflated: whether the function exists, whether it is computable, and when results obtained under different protocols may be treated as measurements of the same quantity.
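The abstract's notion of "systematic approximation to a requested accuracy" is the standard definition of a computable real number in computable analysis. A minimal illustrative sketch (our example, not from the paper): a terminating procedure that, for any requested precision 2⁻ⁿ, reports a rational within that tolerance of √2, without ever producing the exact real in a single step.

```python
from fractions import Fraction

def sqrt2_approx(n: int) -> Fraction:
    """Return a rational q with |q - sqrt(2)| < 2**-n, by bisecting the
    interval [1, 2] on the sign of x^2 - 2. The invariant lo^2 <= 2 <= hi^2
    keeps sqrt(2) inside the interval; the loop halts once the interval is
    narrower than the requested tolerance, and the midpoint is returned.
    This is the sense in which a finite procedure 'computes' a real:
    approximation to any requested accuracy, never exact output."""
    lo, hi = Fraction(1), Fraction(2)
    while hi - lo > Fraction(1, 2**n):
        mid = (lo + hi) / 2
        if mid * mid <= 2:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

print(float(sqrt2_approx(20)))  # a rational within 2**-20 of sqrt(2)
```

The protocol dependence the paper discusses shows up even here: a different bisection bracket or rounding rule realizes a different map, yet all such maps approximate the same underlying quantity.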
physics.hist-ph 2026-04-21

Einstein Bose gas specific heat formula differs from 2004 version

Comment on "Specific heat of an ideal Bose gas above the Bose condensation temperature," [Am. J. Phys. 72(9), 1193--1194 (2004)]

Recalculation fixes translation errors in the 1925 work and reveals discrepancies with a later published expression.

We examine the English translation of Albert Einstein's groundbreaking 1925 paper on Bose-Einstein condensation. We guide readers through the calculations Einstein outlined for the specific heat above the condensation temperature, correct some numerical errors, and compare his formula with a different one published in the American Journal of Physics in 2004. We also summarize the history of the acceptance of Einstein's theory.
physics.hist-ph 2026-04-21

Bayesian statistics justifies naturalness without aleatoric uncertainty

It's all in your head -- fine-tuning arguments do not require aleatoric uncertainty

An automatic Occam's razor penalizes fine-tuned models in standard Bayesian calculations.

Prompted by misconceptions in the recent literature, we review the justifications for naturalness arguments and Occam's razor found in Bayesian statistics. We discuss the automatic Occam's razor that emerges in Bayesian formalism, bringing together points of view from diverse fields, including statistics, social sciences, physics and machine learning. In pedagogical calculations, we demonstrate that this automatic razor disfavors unnatural models in which predictions must be fine-tuned to agree with observation.
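The automatic Occam's razor can be seen in a toy evidence calculation. This is an illustrative sketch of the general mechanism, not a calculation from the paper: two models share a uniform prior on θ ∈ [0, 1] and predict an observable y(θ) = slope·(θ − 0.5); a large slope means θ must be fine-tuned for y to match the datum y_obs = 0, and the evidence integral shrinks accordingly with no ad hoc complexity penalty added by hand.

```python
import math

def evidence(slope: float, sigma: float = 0.05, n: int = 200001) -> float:
    """Bayesian evidence Z = ∫ p(y_obs=0 | θ) dθ for θ ~ Uniform(0, 1),
    where the model predicts y(θ) = slope * (θ - 0.5) and the datum has
    Gaussian noise sigma (trapezoid rule). A larger slope means the
    prediction is more sensitive to θ: only a window of prior width
    ~ sigma/slope fits the data, so Z falls off as 1/slope."""
    h = 1.0 / (n - 1)
    total = 0.0
    for i in range(n):
        theta = i * h
        y = slope * (theta - 0.5)
        like = math.exp(-0.5 * (y / sigma)**2) / (math.sqrt(2 * math.pi) * sigma)
        w = 0.5 if i in (0, n - 1) else 1.0   # trapezoid endpoint weights
        total += w * like * h
    return total

z_natural = evidence(slope=1.0)    # prediction varies gently with θ
z_tuned = evidence(slope=100.0)    # prediction must be fine-tuned
print(z_natural / z_tuned)         # ≈ 100: Bayes factor against tuning
```

The Bayes factor is roughly the slope ratio: the penalty equals the fraction of prior volume the fine-tuned model wastes, which is exactly the automatic razor the abstract describes.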
physics.hist-ph 2026-04-17

Spacetime emerges from asymmetric projection of non-orientable pre-geometry

The Metric Fossil: Emergent Spacetime from Asymmetric Projection

Time, matter, and gravity then appear as direct consequences of projection asymmetry and density rather than as independent fundamentals.

This paper develops a conditional framework for understanding the emergence of measurable physical structure from a pre-metric domain. Contemporary physics provides powerful and precise descriptions of relations among already-defined observables, yet offers comparatively little on the prior question of how observability, separability, and metric structure themselves arise. I propose that if three-dimensional spacetime is the result of an asymmetric projection from a non-orientable pre-geometric regime grounded in a minimal invariant, then a determinate and internally constrained set of consequences follows. These include: time reinterpreted as projection asymmetry rather than as a dimension or entropy gradient; matter as stabilised residue of projection rather than ontological primitive; quantum correlation as pre-separable unity dissolved by non-orientable topology; black holes as regimes of projection saturation rather than information sinks; dark matter as structured lag in the projection process rather than undetected particle species; and gravity as metric tension at sites of high projection density. The framework does not claim empirical confirmation. Its claim is that the proposal is internally coherent, structurally constrained, capable of generating non-trivial research directions, and that several phenomena currently treated as anomalous or paradoxical become expected consequences of the architecture rather than problems requiring additional postulates. An annex presents candidate formal objects and identifies research obligations for each consequence.
physics.hist-ph 2026-04-15

Calling it 'neutrinoless' hides matter creation physics

Defining Absence: The Origin of "Neutrinoless" and How it Obscures the Physics of Matter Creation

The term originated in 1953 and now defines experiments by absence, potentially distancing them from the search for new laws about matter.

The term 'neutrinoless' is a cornerstone of modern particle physics, yet it defines a fundamental process by what is missing rather than what is created. We trace the origins of this privative neologism to a 1953 experimental claim and show how a 'sociology of suspicion' transformed Ettore Majorana's affirmative ontology into an agnostic shorthand. By examining this linguistic shift, we argue that our current terminology may obscure the profound physical meaning of the search. Reclaiming the language of 'matter creation' is not merely a semantic choice, but a timely conceptual shift to bridge the gap between experimental caution and the radical character of the laws of nature we aim to uncover.
physics.hist-ph 2026-04-13

Vespucci's southern star sightings yield coherent identifications

Amerigo Vespucci and the discovery of the Southern Sky

Following his descriptions exactly produces probable matches and explains 16th-century map confusion.

During the voyages that led him to discover the new continent bearing his name, Amerigo Vespucci made interesting astronomical observations of the southern sky. In the past, his data have been interpreted with criteria that do not follow Vespucci's indications, resulting in identifications that are not credible or even leading to the assertion that the data themselves are incomprehensible. However, it is possible to construct a coherent picture of all the information, arriving at an identification that is in some cases very probable, in other cases almost certain, of the stars described by Vespucci. Analysis of documents shows that he made good-quality measurements, but his incomplete knowledge of ancient texts prevented him from distinguishing the new stars from the already known ones, giving rise to a period of confusion in 16th century celestial cartography.
physics.hist-ph 2026-04-07 Recognition

Physical causal closure lives only in the Causal Stance

Causal Stance

Separating it from physical determinism allows materialists to accept mental causation without contradiction.

What is the meaning of physical causal closure? Jaegwon Kim explicitly adopts a conception of causation according to which physical causation is effectively identified with deterministic physical lawfulness, and equates it with physical determinism. While this conception is internally coherent, it differs from currently dominant theories of causation. Physics and the theory of causation serve different descriptive purposes. In this study, we refer to them, respectively, as the Physical Stance and the Causal Stance. Within this framework, physical determinism belongs to the Physical Stance, and physical causal closure is defined only within the Causal Stance. Consequently, the two should not be equated. On this basis, this study reconstructs Davidson's anomalous monism as a materialist position that acknowledges mental causation without contradicting physical determinism. Furthermore, we propose a linguistic framework in which physical causal closure does not hold in the Causal Stance while physical determinism remains intact in the Physical Stance.
physics.hist-ph 2026-04-06 2 theorems

Life's fine-tuning overdetermined

Unreasonable Effectiveness of Physics in Biology

The joint a priori probability of all constraints is extremely low, especially in chemistry, extending Wigner's idea about the laws of physics.

We demonstrate that the system of fine-tuning constraints for life is, in a sense, overdetermined: the a priori probability of its feasibility is extremely low, especially in the chemical sector. This entails that the structure of the physical laws is even more "unreasonable" than Eugene Wigner envisaged.
physics.hist-ph 2026-03-25 Recognition

Poincaré's lecture reached print via three channels in 1904-1905

Henri Poincaré's Saint Louis Lecture of 1904: Early Publication and International Dissemination

First in a French review, then a math bulletin, and an English translation by January 1905.

Henri Poincaré's Saint Louis lecture, delivered on 24 September 1904 at the International Congress of Arts and Science, occupies a distinctive place in the prehistory of twentieth-century theoretical physics. In this text, Poincaré formulated the principle of relativity in explicit and general terms, not as a narrow empirical rule limited to electrodynamics, but as one of the major guiding principles of mathematical physics. The lecture also offered a principle-based conception of theory centered on invariance, least action, and general theoretical coherence. Although the conceptual importance of the Saint Louis lecture has long been recognized in the historiography of relativity, far less attention has been devoted to the material conditions under which it entered international circulation. This article examines the editorial, commercial, and institutional pathways through which the lecture was disseminated between late 1904 and early 1905. It reconstructs the three principal early publication channels of the text: its first printed appearance in La Revue des idées in November 1904, which inserted it into a commercially organized and interdisciplinary intellectual review; its republication in the Bulletin des sciences mathématiques in December 1904, which brought it into a widely distributed specialized mathematical network and later provided the standard reference most often used by historians; and its English translation in The Monist in January 1905, which extended its reach into a transatlantic forum devoted to philosophy, science, and the foundations of knowledge.
physics.hist-ph 2026-03-18 3 theorems

Projection turns omission into the route to scientific invariants

Projection and Invariance in Scientific Explanation

Mapping systems to equivalence classes reveals patterns that complete descriptions would hide and explains why approximations persist.

Any representational enterprise must omit variation in order to function. NASA still uses Newtonian mechanics, though Einstein superseded Newton, and the standard picture of scientific progress cannot explain how. A description that omitted nothing would be identical to its subject and would explain nothing. This paper argues that omission is not a defect but the central structural feature of any enterprise that builds representations from incomplete information. The key concept is projection: a principled mapping from underlying complexity to a descriptive space that partitions states into equivalence classes, omits within-class variation, and makes patterns visible that would otherwise be lost. Projection is simultaneously revelatory and constitutive: it makes genuine invariants tractably accessible while bringing into being the concepts through which they become expressible. The paper distinguishes vertical cases, in which earlier projections survive as limiting cases of more refined successors with recoverable omission, from horizontal cases, in which omission is constitutive, and invariants are accessible only at the level of the projection that defines them. The framework accounts for persistent pluralism in mature sciences, treats the renormalization group as a systematic implementation of the invariant-tracking criterion, and defends a level-relative realism on which higher-level projections reveal genuine structural features of the world. The deepest claim is an inversion of the standard picture: perspectival structure is not a concession to complexity but the condition for invariant detection. A world rich in invariants cannot be exhausted by a single projection.
physics.hist-ph 2026-03-05 2 theorems

Local data cannot reveal if the universe had a beginning

Observational Indistinguishability and the Beginning of the Universe

Extensions of Malament-Manchak theorems show that singular and beginningless models remain indistinguishable to observers in almost all spacetimes.

Can we infer whether all of physical reality began to exist? Several novel results are offered suggesting a negative verdict. First, a common strategy for defending a cosmic beginning involves showing that individual beginningless cosmological models are implausible. This strategy is shown to make an elementary error in confirmation theory. Second, two necessary (but not necessarily sufficient) conditions are offered for a cosmic beginning. Third, three extensions are offered to the Malament-Manchak theorems. The three extensions show that in almost all classical spacetimes, observers cannot collect sufficient data to determine whether the application conditions for the classic singularity theorems are satisfied or whether their spacetime satisfies the two necessary conditions for a cosmic beginning. Lastly, a reply is offered to the objection that the skeptical consequences of the three extensions can be overcome with induction. Importantly, all past singular dust FLRW spacetimes have observationally indistinguishable counterparts which, while sharing a number of important local properties, either do not include a singularity to the past of every point or else do not have the sort of time ordering intuitively required for a cosmic beginning.
physics.hist-ph 2026-03-02 Recognition

Kramers derived the Dirac equation at the same time as Dirac

The Birth of Quantum Mechanics and the Dirac Equation

Reconstruction of his unpublished 1928 work, plus two new modern derivations from Ehrenfest relations and hydrodynamics, shows multiple routes.

The year 2025 marked the centennial of quantum mechanics, inaugurated by Heisenberg's matrix formulation and the foundational contributions of Pauli, Schrödinger, and Dirac. Concurrently, 2026 marks the centennial of the Klein-Gordon equation, the second-order relativistic wave equation from which both the Schrödinger and Dirac equations were derived. This article supplements the recent review published in J. Phys. A: Math. Theor. 58 (2025) 053001 by providing a more detailed examination of the formative period 1925-1928, with particular attention to contributions that have received insufficient recognition in the standard narrative. We reconstruct Kramers' independent derivation of the Dirac equation, obtained essentially simultaneously with Dirac's own result yet unpublished for seven years, and discuss its relation to Van der Waerden's group-theoretical approach. The role of Charles Galton Darwin in elucidating the physical content of the Dirac equation is also highlighted. In addition, we present two modern derivations not catalogued in the earlier review: one based on Operational Dynamical Modeling, which deduces the Dirac equation from relativistic Ehrenfest relations and the canonical commutation algebra, and one rooted in the Madelung hydrodynamic formulation. Three broad periods of quantum theory development -- foundational, consolidation, and the modern era of quantum information -- are briefly surveyed.
physics.hist-ph 2026-02-20 2 theorems

Special sciences obey a causal second law of entropy

The Causal Second Law

Robust causal regularities come with entropy that cannot decrease from cause to effect under standard physicalist assumptions.

I argue that if a special science satisfies certain key assumptions that are familiar from physicalist accounts of the special sciences and from physics, then its causal regularities have an associated notion of entropy, and that this causal entropy cannot decrease from a robust cause to its effect. Due to its analogy with the second laws of thermodynamics and statistical physics, I call the latter conclusion the causal second law. In this paper, I clarify the key assumptions, prove the causal second law, give sufficient conditions for causal entropy increase, relate the causal second law to statistical mechanics and thermodynamics, and argue that the reversibility objection does not threaten it. In addition, I claim that the causal second law is compatible with a non-metaphysical understanding of supervenience and the open systems view, argue that it does not imply a causal time arrow, reflect on relaxing the robustness condition, question whether it is necessary to invoke thermodynamics to show that special sciences' time arrows exist, and discuss a transition-relative-frequency-based, special-science-internal characterization of causal regularities.
