Integrated History and Philosophy of Science: Fourth Conference

15–18 March 2012

Department of History and Philosophy of Science, University of Athens, Greece


Recent Submissions

  • Item
    Theodore Arabatzis; Don Howard; tarabatz@phs.uoa.gr and dhoward1@nd.edu
  • Item
    How Theories Begin: Max Planck and the Genesis of Quantum Theory
    (2012) Massimiliano Badino; massimiliano.badino@univr.it; Vassilios Karakostas
  • Item
    What does History Matter to Philosophy of Physics?
    (2012) Thomas Ryckman; tryckman@stanford.edu; Stathis Psillos
    Naturalized metaphysics remains the default presupposition of much contemporary philosophy of physics. As metaphysics is supposed to concern the general structure of reality, so scientific naturalism draws upon our best physical theories to attempt to answer the foundational question par excellence, viz., “how could the world possibly be the way this theory says it is?” A particular case study, Hilbert's attempt to analyze and explain a seeming “pre-established harmony” between mind and nature, is offered as a salutary reminder that naturalism's ready inference from physical theory to ontology may be too quick.
  • Item
    Where Positivism Went Right: The Positron and the Literal View of Theories
    (2012) Robert Rynasiewicz; ryno@jhu.edu; Michela Massimi
  • Item
    The Changing Relationship Between Simulation and Experiment: The Case of Quantum Chemistry
    (2012) Johannes Lenhard; johannes.lenhard@uni-bielefeld.de; Vassilios Karakostas
    Many disciplines and scientific fields have undergone a computational turn in the past several decades. This paper analyzes this sort of turn by investigating the case of computational quantum chemistry. The main claim is that the transformation from quantum to computational quantum chemistry involved changes in three dimensions. First, on the side of instrumentation, small computers and a networked infrastructure took over the lead from centralized mainframe architecture. Second, a new conception of computational modeling became feasible and assumed a crucial role. And third, the field of computational quantum chemistry became organized in a market-like fashion and this market is much bigger than the number of quantum theory experts. These claims will be substantiated by an investigation of the so-called density functional theory (DFT), the arguably pivotal theory in the turn to computational quantum chemistry around 1990.
  • Item
    Galileo’s use of experimentation and the limits of nature
    (2012) Maarten Van Dyck; Maarten.VanDyck@UGent.be; Faidra Papanelopoulou
  • Item
    Vacuum Experiments in Cartesian Context
    (2012) Mihnea Dobre; mihnea.dobre@icub.unibuc.ro; Dionysios Anapolitanos
  • Item
    The Epistemic Structural Realist Program. Some interference
    (2012) Angelo Cei; angelo.cei@uniroma3.it; Michela Massimi
    In this paper, I present the Epistemic Structural Realist Program (ESR), illustrate the main arguments on its behalf, and discuss its implications for a realist understanding of scientific change. I then present and discuss a historical case study and its implications for ESR; Section 4 draws a general moral from the discussion. In particular, I show that: a) emphasis on the predictive power of purely formal aspects of electrodynamics, and on their partial/empirical interpretation, would lead us completely astray in interpreting the Zeeman effect and its role in driving research into the fine structure of the atom through spectroscopic investigation; and b) arguments based on the history of science are ineffective in this case, since any attempt to explain past success through the structural aspects of the old theory is unsuccessful. Both points reveal the need for an alternative historical methodology to support the realist cause in the face of theoretical change.
  • Item
    The Norton Dome and the Nineteenth Century Foundations of Determinism
    (2012) Marij van Strien; marijvanstrien@gmail.com; John Norton
    The recent discovery of an indeterministic system in classical mechanics, the Norton dome, has shown that answering the question whether classical mechanics is deterministic can be a complicated matter. In this paper I show that indeterministic systems similar to the Norton dome were already known in the nineteenth century: I discuss four nineteenth century authors who wrote about such systems, namely Poisson, Duhamel, Boussinesq and Bertrand. However, I argue that their discussion of such systems was very different from the contemporary discussion about the Norton dome, because physicists in the nineteenth century conceived of determinism in essentially different ways: whereas in the contemporary literature on determinism in classical physics, determinism is usually taken to be a property of the equations of physics, in the nineteenth century determinism was primarily taken to be a presupposition of theories in physics, and as such it was not necessarily affected by the possible existence of systems such as the Norton dome.
  • Item
    Epistemology of a Believer: Making Sense of Duhem’s Anti-Atomism
    (2012) Klodian Coko; kchoko@indiana.edu; Alan Chalmers
    Pierre Duhem's (1861–1916) lifelong opposition to 19th century atomic theories of matter has been traditionally attributed to his conventionalist and/or positivist philosophy of science. Relatively recently, this traditional position has been challenged by the claim that Duhem's opposition to atomism was due to the precarious state of atomic theories at the beginning of the 20th century. In this paper I present some of the difficulties with both the traditional and the new interpretation of Duhem's opposition to atomism and provide a new framework in which to understand his rejection of atomic hypotheses. I argue that although not positivist, instrumentalist, or conventionalist, Duhem's philosophy of physics was not compatible with belief in unobservable atoms and molecules. The key to understanding Duhem's resistance to atomism during the final phase of his career lies in the historicist arguments he presented in support of his ideal of physics.
  • Item
    The Early History of Chance in Evolution: Causal and Statistical in the 1890s
    (2012) Charles H. Pence; charles@charlespence.net; Jutta Schickore
    Work throughout the history and philosophy of biology frequently employs ‘chance’, ‘unpredictability’, ‘probability’, and many similar terms. One common way of understanding how these concepts were introduced in evolution focuses on two central issues: the first use of statistical methods in evolution (Galton), and the first use of the concept of “objective chance” in evolution (Wright). I argue that while this approach has merit, it fails to fully capture interesting philosophical reflections on the role of chance expounded by two of Galton's students, Karl Pearson and W.F.R. Weldon. Considering a question more familiar from contemporary philosophy of biology—the relationship between our statistical theories of evolution and the processes in the world those theories describe—is, I claim, a more fruitful way to approach both these two historical actors and the broader development of chance in evolution.
  • Item
    The role of the rotating frame thought experiment in the genesis of general relativity
    (2012) Jonathan Everett; jonathan.everett@uclmail.net; John Norton
  • Item
    Symmetries and conserved quantities in integrated historical-philosophical perspective
    (2012) Arianna Borrelli; borrelli@tu-berlin.de; Hasok Chang
    Mathematical invariances, usually referred to as “symmetries”, are today often regarded as providing a privileged heuristic guideline for understanding natural phenomena, especially those of micro-physics. The rise of symmetries in particle physics has often been portrayed by physicists and philosophers as the “application” of mathematical invariances to the ordering of particle phenomena, but no historical studies exist on whether and how mathematical invariances actually played a heuristic role in shaping microphysics. Moreover, speaking of an “application” of invariances conflates the formation of concepts of new intrinsic degrees of freedom of elementary particles with the formulation of models containing invariances with respect to those degrees of freedom. I shall present here a case study from early particle physics (ca. 1930–1954) focussed on the formation of one of the earliest concepts of a new degree of freedom, baryon number, and on the emergence of the invariance today associated with it. The results of the analysis show how concept formation and “application” of mathematical invariances were distinct components of a complex historical constellation in which, besides symmetries, two further elements were essential: the idea of physically conserved quantities and that of selection rules. I shall refer to the collection of different heuristic strategies involving selection rules, invariances and conserved quantities as the “SIC-triangle” and show how different authors made use of them to interpret the wealth of new experimental data. It was only a posteriori that the successes of this hybrid “symmetry heuristics” came to be attributed exclusively to mathematical invariances and group theory, forgetting the role of selection rules and of the notion of physically conserved quantity in the emergence of new degrees of freedom and new invariances.
The results of the present investigation clearly indicate that opinions on the role of symmetries in fundamental physics need to be critically reviewed in the spirit of integrated history and philosophy of science.
  • Item
    The Objectivity of our Measures: How many Fundamental Units of Nature?
    (2012) Sally Riordan; sr206@cam.ac.uk; Hasok Chang
    The fundamental constants of nature, as presented by modern science, can be conceived as natural measures of the universe. In comparison, the standards of the International System of Units, including the kilogram and the meter, are mind-made and hand-crafted to meet the demands of human life. In this paper, the gap between the natural and the conventional is squeezed from two directions. In the first place, we come to understand why the metric measures were originally conceived, by the best of scientists, as being “taken from nature” and “in no way arbitrary”. The kilogram of yesteryear was anchored in yesteryear's science and is reasonably considered natural with respect to that science. We also review a contemporary debate amongst physicists that questions whether any quantity, being necessarily written with units, can be truly fundamental. Modern notions of a fundamental constant are put under the spotlight; the kilogram emerges as bound up with contemporary science today as ever it was. In the picture being painted here, our measures are drawn as dynamic entities, epistemic tools that develop hand-in-hand with the rest of science, and whose significance goes much further than a metal artefact dangled from an abstract number line.
  • Item
    Facing Giere’s challenges to the History and Philosophy of Science
    (2012) Samuel Schindler; samuel.schindler@css.au.dk; Jutta Schickore
  • Item
    Maxwell’s Method of Physical Analogy and the Unreasonable Effectiveness of Mathematics
    (2012) Alisa Bokulich; abokulic@bu.edu; John Norton
    The fact that the same equations or mathematical models reappear in the descriptions of what are otherwise disparate physical systems can be seen as yet another manifestation of Wigner's “unreasonable effectiveness of mathematics.” James Clerk Maxwell famously exploited such formal similarities in what he called the “method of physical analogy.” Both Maxwell and Hermann von Helmholtz appealed to the physical analogies between electromagnetism and hydrodynamics in their development of these theories. I argue that a closer historical examination of the different ways in which Maxwell and Helmholtz each deployed this analogy gives further insight into debates about the representational and explanatory power of mathematical models.
  • Item
    The Historical Roots of 19th Century Antirealism
    (2012) Michael Liston; mnliston@uwm.edu; Alan Chalmers
  • Item
    Underdetermination and Decomposition in Kepler’s Astronomia Nova
    (2012) Teru Miyake; tmiyake@ntu.edu.sg; Theodore Arabatzis
    This paper examines the underdetermination between the Ptolemaic, Copernican, and the Tychonic theories of planetary motions and its attempted resolution by Kepler. I argue that past philosophical analyses of the problem of the planetary motions have not adequately grasped a method through which the underdetermination might have been resolved. This method involves a procedure of what I characterize as decomposition and identification. I show that this procedure is used by Kepler in the first half of the Astronomia Nova, where he ultimately claims to have refuted the Ptolemaic theory, thus partially overcoming the underdetermination. Finally, I compare this method with other views of scientific inference such as bootstrapping.
  • Item
    ‘New wine in old bottles’: replicating alchemical experiments
    (2012) Jennifer Rampling; rampling@princeton.edu; Friedrich Steinle
    An influential strand of English alchemy was the pursuit of the “vegetable stone,” a medicinal elixir popularized by George Ripley (d. ca. 1490), made from a metallic substance, “sericon.” Yet the identity of sericon was not fixed, undergoing radical reinterpretation between the fifteenth and seventeenth centuries as Ripley’s lead-based practice was eclipsed by new methods, notably the antimonial approach of George Starkey (1628–65). Tracing “sericonian” alchemy over 250 years, I show how alchemists fed their practical findings back into textual accounts, creating a “feedback loop” in which the authority of past adepts was maintained by exegetical manipulations—a process that I term “practical exegesis.”