&HPS4

Permanent link for this collection: https://hdl.handle.net/2022/26064

Integrated History and Philosophy of Science: Fourth Conference

15–18 March 2012

Department of History and Philosophy of Science, University of Athens, Greece


Recent Submissions

  • Item
    The Changing Relationship Between Simulation and Experiment: The Case of Quantum Chemistry
    (2012) Johannes Lenhard; johannes.lenhard@uni-bielefeld.de; Vassilios Karakostas
    Many disciplines and scientific fields have undergone a computational turn in the past several decades. This paper analyzes this sort of turn by investigating the case of computational quantum chemistry. The main claim is that the transformation from quantum chemistry to computational quantum chemistry involved changes in three dimensions. First, on the side of instrumentation, small computers and a networked infrastructure took over the lead from centralized mainframe architectures. Second, a new conception of computational modeling became feasible and assumed a crucial role. And third, the field of computational quantum chemistry became organized in a market-like fashion, a market far larger than the community of quantum theory experts. These claims will be substantiated by an investigation of density functional theory (DFT), arguably the pivotal theory in the turn to computational quantum chemistry around 1990.
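    As a gloss for readers outside the field (an addition to this listing, not part of the abstract): what made DFT so computationally tractable is, roughly, the Hohenberg–Kohn result that the ground-state energy is a functional of the electron density alone,
    \[
    E_0 = \min_{\rho} E[\rho(\mathbf{r})],
    \]
    so that an N-electron problem posed in terms of a wavefunction \(\Psi(\mathbf{r}_1, \dots, \mathbf{r}_N)\) of 3N coordinates is traded for one posed in terms of a density of just three coordinates, which is what brought such calculations within the reach of the small machines the abstract describes.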
  • Item
    Untitled
    Theodore Arabatzis; Don Howard; tarabatz@phs.uoa.gr; dhoward1@nd.edu
  • Item
    Galileo’s use of experimentation and the limits of nature
    (2012) Maarten Van Dyck; Maarten.VanDyck@UGent.be; Faidra Papanelopoulou
  • Item
    How Theories Begin: Max Planck and the Genesis of Quantum Theory
    (2012) Massimiliano Badino; massimiliano.badino@univr.it; Vassilios Karakostas
  • Item
    Where Positivism Went Right: The Positron and the Literal View of Theories
    (2012) Robert Rynasiewicz; ryno@jhu.edu; Michela Massimi
  • Item
    What does History Matter to Philosophy of Physics?
    (2012) Thomas Ryckman; tryckman@stanford.edu; Stathis Psillos
    Naturalized metaphysics remains the default presupposition of much contemporary philosophy of physics. As metaphysics is supposed to concern the general structure of reality, so scientific naturalism draws upon our best physical theories to attempt to answer the foundational question par excellence, viz., “how could the world possibly be the way this theory says it is?” A particular case study, Hilbert's attempt to analyze and explain a seeming “pre-established harmony” between mind and nature, is offered as a salutary reminder that naturalism's ready inference from physical theory to ontology may be too quick.
  • Item
    Vacuum Experiments in Cartesian Context
    (2012) Mihnea Dobre; mihnea.dobre@icub.unibuc.ro; Dionysios Anapolitanos
  • Item
    The Epistemic Structural Realist Program. Some interference
    (2012) Angelo Cei; angelo.cei@uniroma3.it; Michela Massimi
    In this paper, I present the Epistemic Structural Realist (ESR) program, illustrate the main arguments on its behalf, and discuss its implications for a realist understanding of scientific change. I then present and discuss a historical case study and its implications for ESR, and draw a general moral from that discussion. In particular, I show that: a) emphasis on the predictive power of purely formal aspects of electrodynamics and their partial/empirical interpretation would lead us completely astray in interpreting the Zeeman effect and its role in driving research into the fine structure of the atom through spectroscopic investigation; b) arguments based on the history of science are ineffective in this case, since any attempt to explain past success through the structural aspects of the old theory is unsuccessful. Both points reveal the need for an alternative historical methodology to support the realist cause in the face of theoretical change.
  • Item
    The role of the rotating frame thought experiment in the genesis of general relativity
    (2012) Jonathan Everett; jonathan.everett@uclmail.net; John Norton
  • Item
    Facing Giere’s challenges to the History and Philosophy of Science
    (2012) Samuel Schindler; samuel.schindler@css.au.dk; Jutta Schickore
  • Item
    The Objectivity of our Measures: How many Fundamental Units of Nature?
    (2012) Sally Riordan; sr206@cam.ac.uk; Hasok Chang
    The fundamental constants of nature, as presented by modern science, can be conceived as natural measures of the universe. In comparison, the standards of the International System of Units, including the kilogram and the meter, are mind-made and hand-crafted to meet the demands of human life. In this paper, the gap between the natural and the conventional is squeezed from two directions. In the first place, we come to understand why the metric measures were originally conceived, by the best of scientists, as being “taken from nature” and “in no way arbitrary”. The kilogram of yesteryear was anchored in yesteryear's science and is reasonably considered natural with respect to that science. We also review a contemporary debate amongst physicists that questions whether any quantity, being necessarily written with units, can be truly fundamental. Modern notions of a fundamental constant are put under the spotlight; the kilogram emerges as being as bound up with the science of today as it ever was. In the picture being painted here, our measures are drawn as dynamic entities, epistemic tools that develop hand-in-hand with the rest of science, and whose significance goes much further than a metal artefact dangled from an abstract number line.
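    To make the physicists' worry concrete (an illustrative gloss added here, not part of the abstract): a constant that carries units, such as c or h, changes its numerical value with the choice of units, whereas a dimensionless combination such as the fine-structure constant,
    \[
    \alpha = \frac{e^{2}}{4\pi\varepsilon_{0}\hbar c} \approx \frac{1}{137.04},
    \]
    has the same value in any system of units, which is why one side of the debate holds that only such unit-free ratios can count as truly fundamental.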
  • Item
    Epistemology of a Believer: Making Sense of Duhem’s Anti-Atomism
    (2012) Klodian Coko; kchoko@indiana.edu; Alan Chalmers
    Pierre Duhem's (1861–1916) lifelong opposition to 19th century atomic theories of matter has traditionally been attributed to his conventionalist and/or positivist philosophy of science. Relatively recently, this traditional position has been challenged by the claim that Duhem's opposition to atomism was due to the precarious state of atomic theories at the beginning of the 20th century. In this paper I present some of the difficulties with both the traditional and the new interpretation of Duhem's opposition to atomism and provide a new framework in which to understand his rejection of atomic hypotheses. I argue that although not positivist, instrumentalist, or conventionalist, Duhem's philosophy of physics was not compatible with belief in unobservable atoms and molecules. The key to understanding Duhem's resistance to atomism during the final phase of his career is the historicist arguments he presented in support of his ideal of physics.
  • Item
    The Historical Roots of 19th Century Antirealism
    (2012) Michael Liston; mnliston@uwm.edu; Alan Chalmers
  • Item
    The Early History of Chance in Evolution: Causal and Statistical in the 1890s
    (2012) Charles H. Pence; charles@charlespence.net; Jutta Schickore
    Work throughout the history and philosophy of biology frequently employs ‘chance’, ‘unpredictability’, ‘probability’, and many similar terms. One common way of understanding how these concepts were introduced in evolution focuses on two central issues: the first use of statistical methods in evolution (Galton), and the first use of the concept of “objective chance” in evolution (Wright). I argue that while this approach has merit, it fails to fully capture interesting philosophical reflections on the role of chance expounded by two of Galton's students, Karl Pearson and W.F.R. Weldon. Considering a question more familiar from contemporary philosophy of biology—the relationship between our statistical theories of evolution and the processes in the world those theories describe—is, I claim, a more fruitful way to approach both these historical actors and the broader development of chance in evolution.
  • Item
    Symmetries and conserved quantities in integrated historical-philosophical perspective
    (2012) Arianna Borrelli; orrelli@tu-berlin.de; Hasok Chang
    Mathematical invariances, usually referred to as “symmetries”, are today often regarded as providing a privileged heuristic guideline for understanding natural phenomena, especially those of micro-physics. The rise of symmetries in particle physics has often been portrayed by physicists and philosophers as the “application” of mathematical invariances to the ordering of particle phenomena, but no historical studies exist on whether and how mathematical invariances actually played a heuristic role in shaping microphysics. Moreover, speaking of an “application” of invariances conflates the formation of concepts of new intrinsic degrees of freedom of elementary particles with the formulation of models containing invariances with respect to those degrees of freedom. I shall present here a case study from early particle physics (ca. 1930–1954) focussed on the formation of one of the earliest concepts of a new degree of freedom, baryon number, and on the emergence of the invariance today associated with it. The results of the analysis show how concept formation and “application” of mathematical invariances were distinct components of a complex historical constellation in which, besides symmetries, two further elements were essential: the idea of physically conserved quantities and that of selection rules. I shall refer to the collection of different heuristic strategies involving selection rules, invariances and conserved quantities as the “SIC-triangle” and show how different authors made use of them to interpret the wealth of new experimental data. It was only a posteriori that the successes of this hybrid “symmetry heuristics” came to be attributed exclusively to mathematical invariances and group theory, forgetting the role of selection rules and of the notion of physically conserved quantity in the emergence of new degrees of freedom and new invariances. The results of the present investigation clearly indicate that opinions on the role of symmetries in fundamental physics need to be critically reviewed in the spirit of integrated history and philosophy of science.
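    As a concrete orientation for the three corners of the “SIC-triangle” (the example is supplied here and is not drawn from the abstract): for baryon number B, invariance, conserved quantity, and selection rule are three readings of one constraint,
    \[
    \psi \to e^{i\theta B}\psi \;\;(\text{global } U(1) \text{ invariance}) \;\Rightarrow\; \Delta B = 0 \;\;(\text{conserved quantity}) \;\Rightarrow\; p \not\to e^{+} + \pi^{0} \;\;(\text{selection rule}),
    \]
    that is, a phase invariance of the equations, a conservation law, and a forbidden decay each express the same physical content, even though, as the abstract argues, the three functioned historically as distinct heuristic strategies.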
  • Item
    The Norton Dome and the Nineteenth Century Foundations of Determinism
    (2012) Marij van Strien; marijvanstrien@gmail.com; John Norton
    The recent discovery of an indeterministic system in classical mechanics, the Norton dome, has shown that answering the question whether classical mechanics is deterministic can be a complicated matter. In this paper I show that indeterministic systems similar to the Norton dome were already known in the nineteenth century: I discuss four nineteenth century authors who wrote about such systems, namely Poisson, Duhamel, Boussinesq and Bertrand. However, I argue that their discussion of such systems was very different from the contemporary discussion about the Norton dome, because physicists in the nineteenth century conceived of determinism in essentially different ways: whereas in the contemporary literature on determinism in classical physics, determinism is usually taken to be a property of the equations of physics, in the nineteenth century determinism was primarily taken to be a presupposition of theories in physics, and as such it was not necessarily affected by the possible existence of systems such as the Norton dome.
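    For orientation, a minimal sketch of the Norton dome itself (standard material, added here as a gloss on the abstract): a unit point mass sits at the apex of a frictionless dome of height \(h(r) = \frac{2}{3g} r^{3/2}\), with r the radial distance along the surface. Newton's second law along the surface gives
    \[
    \frac{d^{2}r}{dt^{2}} = \sqrt{r},
    \]
    which admits, besides the solution \(r(t) = 0\) for all t, the one-parameter family
    \[
    r(t) = \begin{cases} 0, & t \le T, \\ \frac{1}{144}(t - T)^{4}, & t \ge T, \end{cases}
    \]
    for arbitrary \(T \ge 0\): the mass may rest at the apex forever or slide off spontaneously at any time T, with nothing in the initial conditions fixing which. This is the indeterminism at issue.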
  • Item
    Maxwell’s Method of Physical Analogy and the Unreasonable Effectiveness of Mathematics
    (2012) Alisa Bokulich; abokulic@bu.edu; John Norton
    The fact that the same equations or mathematical models reappear in the descriptions of what are otherwise disparate physical systems can be seen as yet another manifestation of Wigner's “unreasonable effectiveness of mathematics.” James Clerk Maxwell famously exploited such formal similarities in what he called the “method of physical analogy.” Both Maxwell and Hermann von Helmholtz appealed to the physical analogies between electromagnetism and hydrodynamics in their development of these theories. I argue that a closer historical examination of the different ways in which Maxwell and Helmholtz each deployed this analogy gives further insight into debates about the representational and explanatory power of mathematical models.
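    A minimal instance of the formal similarity at issue (a gloss in modern notation, not Maxwell's own): in a charge-free region the electrostatic potential, and in an irrotational incompressible flow the velocity potential, obey the same equation,
    \[
    \nabla^{2}\varphi = 0, \qquad \mathbf{E} = -\nabla\varphi \;\;(\text{electrostatics}), \qquad \mathbf{v} = \nabla\varphi \;\;(\text{hydrodynamics}),
    \]
    so techniques and intuitions developed for one system carry over to the other; this reappearance of one equation across disparate systems is the kind of formal similarity the “method of physical analogy” exploited.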
  • Item
    Pluribus Ergo Existentibus Centris: Explanations, Descriptions, and Copernicus
    (2012) David Marshall Miller; david.miller@iastate.edu; Theodore Arabatzis
  • Item
    Incommensurability and Evidence
    (2012) Jed Buchwald; buchwald@caltech.edu; Kostas Gavroglu
    Incommensurability between successive scientific theories—the impossibility of empirical evidence dictating the choice between them—was Thomas Kuhn's most controversial proposal. Toward defending it, he directed much effort over his last 30 years into formulating precise conditions under which two theories would be undeniably incommensurable with one another. His first step, in the late 1960s, was to argue that incommensurability must result when two theories involve incompatible taxonomies. The problem he then struggled with, never obtaining a solution that he found entirely satisfactory, was how to extend this initial line of thought to sciences like physics in which taxonomy is not so transparently dominant as it is, for example, in chemistry. This paper reconsiders incommensurability in the light of examples in which evidence historically did and did not carry over continuously from old laws and theories to new ones. The transition from ray to wave optics early in the nineteenth century, we argue, is especially informative in this regard. The evidence for the theory of polarization within ray optics did not carry over to wave optics, so that this transition can be regarded as a prototypical case of discontinuity of evidence, and hence of incommensurability in the way Kuhn wanted. Yet the evidence for classic geometric optics did carry over to wave optics, notwithstanding the fundamental conceptual readjustment that Fresnel's wave theory required.