Abstracts for Thursday Philosophy of Physics seminars

HILARY TERM 2020

Week 1 (23 January): Anders Sandberg (Oxford): Physical eschatology: how much can we say about the far future of the universe, and how much does it matter?

Abstract: Historically, science has been reluctant to make long-term predictions about the future. One interesting exception is astronomy, where the combination of relatively low complexity, low-noise environments, large timespans and strong theories has allowed the field of physical eschatology to emerge. This talk will outline the development of physical eschatology, discuss the current main models, and try to analyse the methodological challenges of such extreme long-range predictions, especially in light of the growing interest from longtermist ethics in some of these results as potentially relevant for deciding near-term strategies.

Week 2 (30 January). Mauro Dorato (University of Rome 3): Overcoming dynamical explanations with structural explanations

Week 3 (6–7 February) The first Oxford Philosophy of Physics Graduate Conference.
https://philphysgradconference.com

Week 4 (13 February). John Dougherty (Munich): TBC

Week 5 (20 February) J. Brian Pitts (Cambridge): Constraints, Gauge, Change and Observables in Hamiltonian General Relativity

Week 6 (27 February) Simon Saunders (Oxford): TBC

Week 7 (5 March) – no seminar

Week 8 (12 March). Emily Adlam, BLOC Seminar at King’s College, London: TBC.

MICHAELMAS TERM 2019

Week 1 (17th October): Patricia Palacios (Philosophy, University of Salzburg).

Title: Re-defining equilibrium for long-range interacting systems

Abstract: Long-range interacting systems are systems in which the interaction potential decays slowly with inter-particle distance. Typical examples of long-range interactions are the gravitational and Coulomb forces. The philosophical interest in studying these kinds of systems lies in the fact that they exhibit properties that escape traditional definitions of equilibrium based on ensemble averages, such as ensemble inequivalence, negative specific heat, negative susceptibility and ergodicity breaking. Focusing on long-range interacting systems thus has the potential to lead one to an entirely different conception of equilibrium or, at least, to a revision of traditional definitions of it. But how should we define equilibrium for long-range interacting systems?
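
A standard criterion, offered here only as an illustrative gloss and not part of the abstract: in $d$ spatial dimensions a pair potential

\[ V(r) \sim \frac{1}{r^{\alpha}}, \qquad \alpha \leq d, \]

counts as long-range; gravity and the Coulomb force have $\alpha = 1 \leq d = 3$, so the energy per particle grows with system size and the usual arguments for the equivalence of statistical ensembles can fail.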

In this talk, I address this question and argue that the problem of defining equilibrium in terms of ensemble averages is due to the lack of a time-scale in the statistical mechanical treatment. In consequence, I argue that adding a specific time-scale to the statistical treatment can give us a satisfactory definition of equilibrium in terms of metastable states. I point out that such a time-scale depends on the number of particles in the system, as also happens when phase transitions occur in the more familiar context of short-range interacting systems, such as condensed-matter systems. I then discuss the analogies and dissimilarities between the case of long-range systems and that of phase transitions, and argue that these analogies, which should be interpreted as liberal formal analogies, can have an important heuristic role in the development of statistical mechanics for long-range interacting systems.

Week 2 (24th October): Francesca Chadha-Day (Physics, University of Cambridge).

Title: Dark Matter: Understanding the gravity of the situation

Abstract: The existence of Dark Matter – matter that is unaccounted for by the Standard Model of particle physics – is supported by a staggering quantity and variety of astrophysical observations. A plethora of Dark Matter candidates have been proposed. Dark matter may be cold, warm or fuzzy. It may be composed of right-handed neutrinos, supersymmetric particles, axions or primordial black holes. I will give an overview of Dark Matter candidates and how we can understand the phenomenological differences between them in the framework of quantum theory. I will discuss the difficulties faced by modified gravity theories in explaining our observations, and their relation to Dark Matter.

Week 3 (31st October): NO SEMINAR

Week 4 (7th November): Jamee Elder (Philosophy, University of Notre Dame/University of Bonn).

Title: The epistemology of LIGO

Abstract: In this talk, I examine the methodology and epistemology of LIGO, with a focus on the role of models and simulations in the experimental process. This includes post-Newtonian approximations, models generated through the effective one-body formalism, and numerical relativity simulations, as well as hybrid models that incorporate aspects of all three approaches. I then present an apparent puzzle concerning the validation of these models: how can we successfully validate these models and simulations through our observations of black holes, given that our observations rely on our having valid models of the systems being observed? I argue that there is a problematic circularity here in how we make inferences about the properties of compact binaries. The problem is particularly acute when we consider these experiments as empirical tests of general relativity. I then consider strategies for responding to this challenge.

Week 5 (14th November): Adam Caulton (Philosophy, University of Oxford).

Title: Is a particle an irreducible representation of the Poincaré group?

Abstract: Ever since investigations into the group representation theory of spacetime symmetries, chiefly due to Wigner and Bargmann in the 1930s and ‘40s, it has become something of a mantra in particle physics that a particle is an irreducible representation of the Poincaré group (the symmetry group of Minkowski spacetime). Call this ‘Wigner’s identification’. One may ask, in a philosophical spirit, whether Wigner’s identification could serve as something like a real definition (as opposed to a nominal definition) of ‘particle’—at least for the purposes of relativistic quantum field theory. In this talk, I aim to show that, while Wigner’s identification is materially adequate for many purposes—principally scattering theory—it does not provide a serviceable definition. The main problem, or so I shall argue, is that the regime of legitimate particle talk surpasses the constraints put on it by Wigner’s identification. I aim further to show that, at least in the case of particles with mass, a promising rival definition is available. This promising rival emerges from investigations due to Foldy in the 1950s, which I will outline. The broad upshot is that the definition of ‘particle’ may well be the same in both the relativistic and non-relativistic contexts, and draws upon not the Poincaré group (or any other spacetime symmetry group) but rather the familiar Heisenberg relations.
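
As an orienting gloss (standard textbook material, not part of the abstract): Wigner's identification labels an irreducible unitary representation of the Poincaré group by its two Casimir invariants, which for a massive particle fix its mass and spin,

\[ P^{\mu}P_{\mu} = m^{2}c^{2}, \qquad W^{\mu}W_{\mu} = -\,m^{2}c^{2}\,\hbar^{2}\,s(s+1), \]

where $W^{\mu}$ is the Pauli–Lubanski vector; the Heisenberg relations invoked at the end of the abstract are instead the familiar $[\hat{x}_{j},\hat{p}_{k}] = i\hbar\,\delta_{jk}$.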

Week 6 (21st November): Radin Dardashti (Philosophy, University of Wuppertal).

Title: Understanding Problems in Physics

Abstract: In current fundamental physics, empirical data are scarce, and it may take several decades before the hypothesised solution to a scientific problem can be tested. Scientists therefore need to be careful in assessing what constitutes a scientific problem in the first place, since there is a danger of providing a solution to a non-existent problem. Relying on and extending previous work by Larry Laudan and Thomas Nickles, I apply the philosophical discussion of scientific problems to modern particle physics.

Week 7 (28th November): NO SEMINAR

Week 8 (5th December): Karim Thébault (Philosophy, University of Bristol).

Title: Time and Background Independence

Abstract: We showcase a new framework for the analysis of the symmetries and spatiotemporal structure of a physical theory by applying it to the problem of differentiating intuitively background-dependent theories from intuitively background-independent theories. This problem has been rendered particularly pressing by the magisterial analysis of Pooley (2017), who convincingly demonstrates, via the comparison between diffeomorphism-invariant special relativity and general relativity, that diffeomorphism invariance cannot be equated with background independence.

Our framework is built upon the analysis of the transformation behaviour of nomic and temporal structures under kinematical transformations (defined as endomorphisms on the space of kinematically possible models). We define the sub-regions within the space of kinematical transformations on which a structure is absolute (does not vary) and relative (does vary), and then classify temporal structures via the intersection of their absolute and relative regions with those of nomic structures. Of particular relevance is the case where there is non-trivial overlap between the relative region of some temporal structure and both the absolute and relative regions of the nomic structure. We classify such structure as dynamical surplus structure.

Finally, based upon the analysis of temporal foliation structure, we provide a new account of background independence. On our account background independence manifestly fails for diffeomorphism invariant special relativity (since the temporal foliation structure is non-dynamical surplus structure) and obtains for general relativity (since the temporal foliation structure is dynamical surplus structure). This formalises the intuitive idea of the contingent independence of the dynamical models of general relativity from a spatiotemporal background.

Week 9 (12th December): Barry Loewer (Philosophy, Rutgers) Title: The package deal account of fundamental laws

Abstract: In my talk I will describe an account of the metaphysics of fundamental laws I call “the Package Deal Account (PDA)”, which is a descendant of Lewis’s BSA but differs in a number of ways. First, it does not require the truth of the thesis Lewis calls “Humean Supervenience” (HS), and so can accommodate relations and structures found in contemporary physics that conflict with HS. Second, it is not committed to Humeanism, since it is compatible with there being fundamental necessary connections in nature. Third, it greatly develops the criteria for what counts in favor of a candidate system to determine laws. Fourth and most significantly, unlike the BSA, the PDA does not presuppose metaphysically primitive elite properties/quantities that Lewis calls “perfectly natural properties/quantities”.

TRINITY TERM 2019

Week 1 (Thursday May 2) Olivier Darrigol (Paris): Ludwig Boltzmann: Atoms, mechanics, and probability

Statistical mechanics owes much more to Ludwig Boltzmann than is usually believed. In his attempts to derive thermodynamic and transport phenomena from deeper microphysical assumptions, he explored at least five different approaches: one based on mechanical analogies (with periodic mechanical systems or with statistical ensembles), one based on Maxwell’s collision formula, one based on the ergodic hypothesis, one based on combinatorial probabilities, and one based on the existence of thermodynamic equilibrium. I will sketch these various approaches and show how Boltzmann judged and interconnected them. I will also argue that in general Boltzmann was more concerned with constructive efficiency than with precise conceptual foundations. Basic questions on the reality of atoms or on the nature of probabilities played only a secondary role in his theoretical enterprise.

Week 2 (Thursday May 9) No seminar

Week 3 (Thursday May 16) Jeremy Butterfield (Cambridge): On realism and functionalism about space and time

(Joint work with Henrique Gomes.) In this talk I will set the recent literature on spacetime functionalism in context, by discussing two traditions that form its background. First: functionalism in general, as a species of inter-theoretic reduction. Second: relationism about space and time.

Week 4 (Thursday May 23): No seminar

Week 5 (Thursday May 30) Henrique Gomes (Perimeter and Cambridge): Gauge, boundaries, and the connection form

Forces such as electromagnetism and gravity reach across the Universe; they are the long-ranged forces in current physics. And yet, in many applications—theoretical and otherwise—we only have access to finite domains of the world. For instance, in computations of entanglement entropy, e.g. for black holes or cosmic horizons, we raise boundaries to separate the known from the unknown. In this talk, I will argue we do not understand gauge theory as well as we think we do, when boundaries are present.

For example: It is agreed by all that we should aim to construct variables that have a one-to-one relationship to the theory’s physical content within bounded regions. But puzzles arise if we try to combine definitions of strictly physical variables in different parts of the world. This is most clearly gleaned by first employing the simplest tool for unique physical representation—gauge fixing—and then proceeding to stumble on its shortcomings. Whereas fixing the gauge can often shave off unwanted redundancies, the coupling of different bounded regions requires the use of gauge-variant elements. Therefore, the coupling of regional observables is inimical to gauge-fixing, as usually understood. This resistance to gauge-fixing has led some to declare the coupling of subsystems to be the raison d’être of gauge [Rov14].

Here I will explicate the problems mentioned above and illustrate a possible resolution. The resolution was introduced in a recent series of papers [Gomes & Riello JHEP ’17, Gomes & Riello PRD ’18, Gomes, Hopfmüller & Riello NPB ’19]. It requires the notion of a connection-form in the field-space of gauge theories. Using this tool, a modified version of symplectic geometry—here called ‘horizontal’—is possible. Independently of boundary conditions, this formalism bestows on each region a physically salient, relational notion of charge: the horizontal Noether charge. It is relational in the sense that it only uses the different fields already at play and relationships between them; no new “edge-mode” degrees of freedom are required.

The guiding requirement for the construction of the relational connection-form is simply a harmonious melding of regional and global observables. I show that the ensuing notions of regional relationalism are different from other attempts at resolving the problem posed by gauge symmetries for bounded regions. The distinguishing criterion is what I consider to be the ‘acid test’ of local gauge theories in bounded regions: does the theory license only those regional charges which depend solely on the original field content? In a satisfactory theory, the answer should be “yes”. Lastly, I will introduce explicit examples of relational connection-forms, and show that the ensuing horizontal symplectic geometry passes this ‘acid test’.

Week 7 (Thursday June 13) Harvey Brown (Oxford): Aspects of probabilistic reasoning in physics

Week 8 (Thursday June 20) Martin Lesourd (Oxford): The epistemic constraints that observers face in General Relativistic spacetimes

What can observers know about their future and their own spacetime on the basis of their past lightcones? Important contributions to this question were made by Earman, Geroch, Glymour, Malament and, more recently, Manchak. Building on the work of Malament, Manchak (2009/10/11/14) has been able to prove what seem to be general and far-reaching results to the effect that observers in general relativistic spacetimes face severe epistemic constraints. Here, after reviewing these results, I shall present a number of new results which grant observers a more positive epistemic status. In short: if Malament’s and Manchak’s results were cause for a form of epistemic pessimism, the results presented here support a more optimistic outlook.

HILARY TERM 2019

Week 1 (17 Jan) Laurenz Hudetz (LSE): The conceptual-schemas account of interpretation

This talk addresses the question of what it is to interpret a formalism. It aims to provide a general framework for talking about interpretations (of various kinds) in a rigorous way. First, I clarify what I mean by a formalism. Second, I give an account of what it is to establish a link between a formalism and data. For this purpose, I draw on the theory of relational databases to explicate what data schemas and collections of data are. I introduce the notion of an interpretive link using tools from mathematical logic. Third, I address the question of how a formalism can be interpreted in a way that goes beyond a connection to data. The basic idea is that one extends a data schema to an ontological conceptual schema and links the formalism to this extended schema. I illustrate this account of interpretation by means of simple examples and highlight how it can be fruitfully applied to address conceptual problems in philosophy of science.

Week 2 (24 Jan)  Patrick Dürr (Oxford): Philosophy of the Dead: Nordström Gravity

The talk revisits Nordström Gravity (NoG) – arguably the most plausible relativistic scalar theory of gravity before the advent of GR. In Nordström’s original formulation (1913), NoG1, it appears to describe a scalar gravitational field on Minkowski spacetime. In 1914, Fokker and Einstein showed that NoG is mathematically equivalent to a purely metric theory, NoG2 – strikingly similar to the Einstein Equations. Like GR, NoG2 is plausibly construed as a geometrised theory of gravity: In NoG2, gravitational effects are reduced to manifestations of non-Minkowskian spacetime structure. Both variants of NoG, and their claimed physical equivalence, give rise to three conundrums that we will explore.

(P1) The (Weak) Equivalence Principle appears to be violated in NoG1 – but holds in NoG2. (P2) In NoG1, it appears unproblematic to ascribe the gravitational scalar an energy-momentum tensor. In trying to define gravitational energy in NoG2, by contrast, one faces problems akin to those in GR. (P3) In NoG1, total (i.e. gravitational plus non-gravitational) energy-momentum appears to be conserved, whereas in NoG2, no obvious candidate for gravitational energy is available, and furthermore it seems unclear whether non-gravitational energy is conserved.

Insofar as NoG1 and NoG2 are equivalent formulations of the same theory, (P1)-(P3) appear paradoxical. For a resolution, I will proffer a metaphysically perspicuous articulation of NoG’s ontology that explicates the equivalence, and propose an instructive reformulation.

Week 3 (31 Jan)  Yang-Hui He (City, London): Exceptional and Sporadic

I give an overview of a host of seemingly unrelated classification problems in mathematics which turn out to be intimately connected through some deep Correspondences.

Some of these relations were uncovered by focusing on so-called exceptional structures which abound: in geometry, there are the Platonic solids; in algebra, there are the exceptional Lie algebras; in group theory, there are the sporadic groups, to name but a few. A champion for such Correspondences is Prof. John McKay.

I also present how these correspondences have subsequently been harnessed by theoretical physicists. My goal is to take a casual promenade in this land of ‘exceptionology’, reviewing some classic results and presenting some new ones based on joint work with Prof. McKay.

Week 4 (7 Feb)  Casey McCoy (Stockholm): Why is h-bar a universal constant?

Some constants are relevant for all physical phenomena: the speed of light pertains to the causal structure of spacetime and hence to all physical processes. Others are relevant only to particular interactions, for example the fine structure constant. Why is Planck’s constant one of the former? I motivate the possibility that there could have been multiple, interaction-specific ‘Planck constants’. Although there are indeed good reasons to eschew this possibility, it suggests a further question: what is the actual conceptual significance of Planck’s constant in quantum physics? I argue that it lies principally in relating classical and quantum physics, and I draw out two main perspectives on this relation, represented by the views of Landsman and Lévy-Leblond.

Week 5 (14 Feb)  Joanna Luc (Cambridge): Generalised manifolds as basic objects of General Relativity

In the definition of a differential manifold in General Relativity (GR) the Hausdorff condition is typically assumed. In my talk I will investigate the consequences of dropping this condition. I will argue that there are good reasons to regard non-Hausdorff manifolds as basic objects of GR, together with Hausdorff manifolds. However, it is not clear whether they can be regarded as physically reasonable basic objects of GR. I will argue that some of the objections to their physical reasonability can be refuted if we understand them as representing a bundle of alternative spacetimes. This interpretation is supported by a theorem stating that every non-Hausdorff manifold can be seen as a result of gluing together some Hausdorff manifolds.

Week 6 (21 Feb)  Katie Robertson (Birmingham): Reducing the second law of thermodynamics: the demons and difficulties.

In this talk I consider how to reduce the second law of thermodynamics. I first discuss what I mean by ‘reduction’, and emphasise how functionalism can be helpful in securing reductions. Then I articulate the second law, and discuss the ramifications of Maxwell’s demon for the status of the second law. Should we take Maxwell’s means-relative approach? I argue no: the second law is not a relic of our inability to manipulate individual molecules in the manner of the nimble-fingered demon. When articulating the second law, I take care to distinguish it from the minus first law (Brown and Uffink 2001); the latter concerns the spontaneous approach to equilibrium, whereas the former concerns the thermodynamic entropy change between equilibrium states, especially in quasi-static processes. Distinguishing these laws alters the reductive project (Luczak 2018): locating what Callender (1999) calls the Holy Grail – a non-decreasing statistical mechanical quantity to call entropy – is neither necessary nor sufficient. Instead, we must find a quantity that plays the right role, viz. being constant in adiabatic quasi-static processes and increasing in non-quasi-static processes, and I argue that the Gibbs entropy plays this role.
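
For reference, the Gibbs entropy mentioned at the end of the abstract is standardly defined (the formula is a gloss, not part of the abstract) as

\[ S_{G}[\rho] = -\,k_{B}\int \rho(x)\,\ln\rho(x)\,\mathrm{d}\Gamma, \]

the integral being taken over phase space with probability distribution $\rho$; the claim above is that this quantity, rather than a monotonically non-decreasing ‘Holy Grail’ quantity, plays the required role of being constant in adiabatic quasi-static processes and increasing in non-quasi-static ones.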

Week 7 (28 Feb)  Alex Franklin (KCL): On the Effectiveness of Effective Field Theories

Effective Quantum Field Theories (EFTs) are effective insofar as they apply within a prescribed range of length-scales, but within that range they predict and describe with extremely high accuracy and precision. I will argue that the effectiveness of EFTs is best explained in terms of the scaling behaviour of the parameters. The explanation relies on distinguishing autonomy with respect to changes in microstates (autonomy_ms), from autonomy with respect to changes in microlaws (autonomy_ml), and relating these, respectively, to renormalisability and naturalness. It is claimed, pace Williams (2016), that the effectiveness of EFTs is a consequence of each theory’s renormalisability rather than its naturalness. This serves to undermine an important argument in favour of the view that only natural theories are kosher. It has been claimed in a number of recent papers that low-energy EFTs are emergent from their high-energy counterparts, see e.g. Bain (2013) and Butterfield (2014). Building on the foregoing analysis, I will argue that the emergence of EFTs may be understood in terms of the framework developed in Franklin and Knox (2018).

Week 8 (7 Mar)  Karen Crowther (Geneva): As Below, So Before: Synchronic and Diachronic Conceptions of Emergence in Quantum Gravity

The emergence of spacetime from quantum gravity appears to be a striking case-study of emergent phenomena in physics (albeit one that is speculative at present). There are, in fact, two different cases of emergent spacetime in quantum gravity: a “synchronic” conception, applying between different levels of description, and a “diachronic” conception, from the universe “before” and after the “Big Bang” in quantum cosmology. The purpose of this paper is to explore these two different senses of spacetime emergence; and to see whether, and how, they can be understood in the context of specific extant accounts of emergence in physics.

MICHAELMAS TERM 2018

Week 1 (11 Oct): David Wallace (USC): Spontaneous symmetry breaking in finite quantum systems: a decoherent-histories approach.

Abstract: Spontaneous symmetry breaking (SSB) in quantum systems, such as ferromagnets, is normally described as (or as arising from) degeneracy of the ground state; however, it is well established that this degeneracy only occurs in spatially infinite systems, and even better established that ferromagnets are not spatially infinite. I review this well-known paradox, and consider a popular solution where the symmetry is explicitly broken by some external field which goes to zero in the infinite-volume limit; although this is formally satisfactory, I argue that it must be rejected as a physical explanation of SSB since it fails to reproduce some important features of the phenomenology. Motivated by considerations from the analogous classical system, I argue that SSB in finite systems should be understood in terms of the approximate decoupling of the system’s state space into dynamically-isolated sectors, related by a symmetry transformation; I use the formalism of decoherent histories to make this more precise and to quantify the effect, showing that it is more than sufficient to explain SSB in realistic systems and that it goes over in a smooth and natural way to the infinite limit.

Week 2 (18 Oct): Simon Saunders (Oxford): Understanding indistinguishability.

Abstract: Indistinguishable entities are usually thought to be exactly alike, but not so in quantum mechanics — nor need the concept be restricted to the quantum domain. The concept, properly understood, can be applied in any context in which the only dynamically-salient state-independent properties are the same (so a fortiori in classical statistical mechanics).

The connection with the Gibbs paradox, and the reasons why the concept of classical indistinguishable particles has been so long resisted, are also discussed. The latter involves some background in the early history of quantum mechanics. This work builds on a recent publication, ‘The Gibbs Paradox’, Entropy (2018) 20(8), 552.

Week 3 (25 Oct): NO SEMINAR

Week 4 (1 Nov): NO SEMINAR

Week 5 (8 Nov): Tushar Menon (Oxford): Rotating spacetimes and the relativistic null hypothesis

Abstract: Recent work in the physics literature demonstrates that, in particular classes of rotating spacetimes, physical light rays do not, in general, traverse null geodesics. In this talk, I discuss its philosophical significance, both for the clock hypothesis (in particular, for Sam Fletcher’s recent purported proof thereof for light clocks), and for the operational meaning of the metric field in GR. (This talk is based on joint work with James Read and Niels Linnemann)

Week 6 (15 Nov): Jonathan Barrett (Oxford): Quantum causal models

Abstract: From a discussion of how to generalise Reichenbach’s Principle of the Common Cause to the case of quantum systems, I will develop a formalism to describe any set of quantum systems that have specified causal relationships between them. This formalism is the nearest quantum analogue to the classical causal models of Judea Pearl and others. I will illustrate the formalism with some simple examples, and if time, describe the quantum analogue of a well known classical theorem that relates the causal relationships between random variables to conditional independences in their joint probability distribution. I will end with some more speculative remarks concerning the significance of the work for the foundations of quantum theory.

Week 7 (22 Nov): James Nguyen (IoP/UCL): Interpreting Models: A Suggestion and its Payoffs

Abstract: I suggest that the representational content of a scientific model is determined by a ‘key’ associated with it. A key allows the model’s users to draw inferences about its target system. Crucially, these inferences need not be a matter of proposed similarity (structural or otherwise) to its target but can allow for much more conventional associations between model features and features to be exported. Although this is a simple suggestion, it has broad ramifications. I point out that it allows us to re-conceptualise what we mean by ‘idealisation’: just because a model is a distortion of its target (in the relevant respects, and even essentially so), this does not entail that it is a misrepresentation. I show how, once we think about idealisation in this way, various puzzles in the philosophy of science dissolve (the role of fictional models in science; the non-factivity of understanding; the problem of inconsistent models; and others).

TRINITY TERM 2018

Week 1 (26 Apr) Doreen Fraser (University of Waterloo): Renormalization and scaling transformations in quantum field theory

Abstract: Renormalization is a mathematical operation that needs to be carried out to make empirical sense of quantum field theories (QFTs). How should renormalized QFTs be physically interpreted? A prominent non-perturbative strategy for renormalizing QFTs is to draw on formal analogies with classical statistical mechanical models for critical phenomena. This strategy is implemented in both the Wilsonian renormalization group approach and the Euclidean approach to constructing models of the Wightman axioms. Each approach features a scaling transformation, but the scaling transformations are given different interpretations. I will analyze the two interpretations and argue that the approaches offer compatible and complementary perspectives on renormalization.

Week 2 (3 May) Jeremy Butterfield (Cambridge): On Dualities and Equivalences Between Physical Theories

Abstract: The main aim of this paper is to make a remark about the relation between (i) dualities between theories, as ‘duality’ is understood in physics, and (ii) equivalence of theories, as ‘equivalence’ is understood in logic and philosophy. The remark is that in physics, two theories can be dual, and accordingly get called ‘the same theory’, though we interpret them as disagreeing—so that they are certainly not equivalent, as ‘equivalent’ is normally understood. So the remark is simple: but, I shall argue, worth stressing—since often neglected. My argument for this is based on the account of duality by De Haro, which is illustrated here with several examples, from both elementary physics and string theory. Thus I argue that in some examples, including in string theory, two dual theories disagree in their claims about the world. I also spell out how this remark implies a limitation of proposals (both traditional and recent) to understand theoretical equivalence as either logical equivalence or a weakening of it.

Week 3 (10 May) Matt Farr (Cambridge): The C Theory of Time

Abstract: Does time have a direction? Intuitively, it does. After all, our experiences, our thoughts, even our scientific explanations of phenomena are time-directed; things evolve from earlier to later, and it would seem unnecessary and indeed odd to try to expunge such talk from our philosophical lexicon. Nevertheless, in this talk I will make the case for what I call the C theory of time: in short, the thesis that time does not have a direction. I will do so by making the theory as palatable as possible, and this will involve giving an account of why it is permissible and indeed useful to talk in time-directed terms, what role time-directed explanations play in science, and why neither of these should commit us to the claim that reality is fundamentally directed in time. On the positive side, I will make the case that the C theory’s deflationism about the direction of time offers a superior account of time asymmetries in physics than rival time-direction-realist accounts.

Week 4 (17 May) Seth Lloyd (MIT): The future of Quantum Computing.

Abstract: Technologies for performing quantum computation have progressed rapidly over the past few years. This talk reviews recent advances in constructing quantum computers and discusses applications for the kinds of quantum computers that are likely to be available in the near future. While full-blown error-corrected quantum computers capable of factoring large numbers are some way away, quantum computers with 100–1000 qubits should be available soon. Such devices should be able to solve problems in quantum simulation and quantum machine learning that are beyond the reach of the most powerful classical computers. The talk will also discuss social aspects of quantum information, including the proliferation of start-ups and the integration of quantum technologies in industry.

Week 5 (24 May) Emily Thomas (Durham): John Locke: Newtonian Absolutist about Time?

Abstract: John Locke’s metaphysics of time is relatively neglected, but he discussed time throughout his career, from his unpublished 1670s writings to his 1690 Essay Concerning Human Understanding, and beyond. The vast majority of scholars who have written on Locke’s metaphysics of time argue that Locke’s views underwent an evolution: from relationism, the view that time and space are relations holding between bodies; to Newtonian absolutism, on which time and space are real, substance-like entities that are associated with God’s eternal duration and infinite immensity. Against this majority reading, I argue that Locke remained a relationist in the Essay, and throughout his subsequent career.

Week 6 (31 May) Minhyong Kim (Oxford): Three Dualities

Abstract: This talk will present a few contemporary points of view on geometry, with particular emphasis on dualities. Most of the talk will be concerned with mathematical practice, but will be interspersed with brief and superficial allusions to physics.

Week 7 (7 Jun) Owen Maroney (Oxford): TBC

Abstract:TBC

Week 8 (14 Jun) Tushar Menon (Oxford): TBC

Abstract: TBC

HILARY TERM 2018

Week 2 (25 Jan) Giulio Chiribella (Oxford): The Purification Principle

Abstract: Over the past decades there has been intense work aimed at the reconstruction of quantum theory from principles that can be formulated without the mathematical framework of Hilbert spaces and operator algebras. The motivation was that these principles could provide a new angle from which to understand the counterintuitive quantum laws, that they could reveal connections between different quantum features, and that they could provide guidance for constructing new quantum algorithms and for extending quantum theory to new physical scenarios.

In this talk I will discuss one such principle, called the Purification Principle. Informally, the idea of the Purification Principle is that it is always possible to combine the incomplete information gathered by an observer with a maximally informative picture of the physical world. This idea resonates with Schrödinger’s famous quote that “[in quantum theory] the best possible knowledge of a whole does not necessarily imply the best possible knowledge of its parts”, a property that he called “not one, but rather the characteristic trait of quantum mechanics, the one that enforces its entire departure from classical lines of thought.”

References for this talk:
GC, GM D’Ariano, and P Perinotti, Probabilistic Theories with Purification, Phys. Rev. A 81, 062348 (2010)
GC, GM D’Ariano, and P Perinotti, Informational Derivation of Quantum Theory, Phys. Rev. A 84, 012311 (2011)
GM D’Ariano, GC, and P Perinotti, Quantum Theory From First Principles, Cambridge University Press (2017).

Week 3 (1 Feb) Christopher Timpson (Oxford): Concepts of fundamentality: the case of information

Abstract: A familiar – perhaps traditional – conception of fundamental physics is as follows: physics presents the world as being populated at the basic level (or the pro tem basic level) by various fields and/or particles, and it furnishes equations describing how these items evolve and interact with one another over time, equations couched primarily in terms of such properties as energy, mass, and various species of charge. This evolution may be conceived to take place against a fixed background spatiotemporal arena of some kind, or, in one alternative, it may well be conceived that the arena has a metrical structure which should also be treated as a particular kind of field, itself subject to dynamical equations (as in General Relativity). But in recent years, stemming primarily from developments in quantum information theory and related thinking in quantum theory itself, an alternative conception has been gaining momentum, one which sees the concept of information playing a much more fundamental role in physics than this traditional picture would allow. This alternative conception urges that information must be recognised as a fundamental physical quantity, a quantity which in some sense should be conceived as being on a par with energy, mass, or charge. Perhaps even, according to strong versions of the conception, information should be seen as the new subject-matter for physics, displacing the traditional conception of material particles and fields as being the fundamental subject matter.

These are bold and interesting claims on the part of information, and regarding what is said to follow from the successes of quantum information theory. Are they well-motivated? Are they true? I will explore these issues by attempting, first of all, to delineate various ways in which something (object, structure, property, or concept) might be thought to be physically fundamental. On at least one prima facie plausible carving of the notion of fundamentality, one should distinguish between the logically independent notions of ontological fundamentality, nomological fundamentality, and explanatory fundamentality. Something is ontologically fundamental if it is posited by the most fundamental description of the world; it is nomologically fundamental if reference to it is necessary to state the physical laws in some domain; and it is explanatorily fundamental if positing it is necessary in explanation and understanding. It is straightforward to show that the concept of information is not ontologically fundamental (or more cagily put: that there is nothing at all about the successes of quantum information theory that would warrant thinking it to be ontologically fundamental), but the questions of nomological and explanatory fundamentality of information are harder to settle so succinctly.

Week 4 (8 Feb) David Wallace (USC): Why black hole information loss is paradoxical

Abstract: I distinguish between two versions of the black hole information-loss paradox. The first arises from apparent failure of unitarity on the spacetime of a completely evaporating black hole, which appears to be non-globally-hyperbolic; this is the most commonly discussed version of the paradox in the foundational and semipopular literature, and the case for calling it “paradoxical” is less than compelling. But the second arises from a clash between a fully-statistical-mechanical interpretation of black hole evaporation and the quantum-field-theoretic description used in derivations of the Hawking effect. This version of the paradox arises long before a black hole completely evaporates, seems to be the version that has played a central role in quantum gravity, and is genuinely paradoxical. After explicating the paradox, I discuss the implications of more recent work on AdS/CFT duality and on the ‘Firewall paradox’, and conclude that the paradox is if anything now sharper.

Week 5 (15 Feb) Dennis Lehmkuhl (Caltech): The History and Interpretation of Black Hole Solutions

Abstract: The history and philosophy of physics community has spent decades grappling with the interpretation of the Einstein field equations and their central mathematical object, the metric tensor. However, the community has not undertaken a detailed study of the solutions to these equations. This is all the more surprising as this is where the meat is in terms of the physics: the confirmation of general relativity through the 1919 observation of light being bent by the sun, as well as the derivation of Mercury’s perihelion precession, depend much more on the use of the Schwarzschild solution than on the actual field equations. Indeed, Einstein had not yet found the final version of the field equations when he predicted the perihelion precession of Mercury. The same is true with respect to the recently discovered black holes and gravitational waves: they are, arguably, tests of particular solutions to the Einstein equations and of how these solutions are applied to certain observations. Indeed, what is particularly striking is that all the solutions just mentioned are solutions to the vacuum Einstein equations rather than to the full Einstein equations. This is surprising given that black holes are the most massive objects in the universe, and yet they are adequately represented by solutions to the vacuum field equations.

In this talk, I shall discuss the history and the diverse interpretations and applications of three of the most important (classes of) black hole solutions: I will address especially how the free parameters in these solutions were identified as representing the mass, charge and angular momentum of isolated objects, and what kind of coordinate conditions made it possible to apply the solutions in order to represent point particles, stars, and black holes.

Week 6 (22 Feb) No seminar

Week 7 (1 Mar) Carina Prunkl (Oxford): Black Hole Entropy is Entropy and not (necessarily) Information

Abstract: The comparison of geometrical properties of black holes with classical thermodynamic variables reveals surprising parallels between the laws of black hole mechanics and the laws of thermodynamics. Since Hawking’s discovery that black holes, when coupled to quantum matter fields, emit radiation at a temperature proportional to their surface gravity, the idea that black holes are genuine thermodynamic objects with a well-defined thermodynamic entropy has become more and more popular. Surprisingly, arguments that justify this assumption are both sparse and rarely convincing. Most of them rely on an information-theoretic interpretation of entropy, which is itself a highly debated topic in the philosophy of physics. Given the amount of disagreement about the nature of entropy and the second law on the one hand, and the growing importance of black hole thermodynamics for the foundations of physics on the other, it is desirable to achieve a deeper understanding of the notion of entropy in the context of black hole mechanics. I discuss some of the pertinent arguments that aim at establishing the identity of black hole surface area (times a constant) and thermodynamic entropy, and show why these arguments are not satisfactory. I then present a simple model of a black hole Carnot cycle to establish that black hole entropy is genuine thermodynamic entropy, which does not require an information-theoretic interpretation.
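
The “surface area times a constant” at issue is standardly the Bekenstein–Hawking expression (stated here as a gloss, not in the abstract itself),

\[ S_{\mathrm{BH}} = \frac{k_{B}c^{3}}{4 G \hbar}\,A, \qquad T_{\mathrm{H}} = \frac{\hbar\kappa}{2\pi c\,k_{B}}, \]

where $A$ is the horizon area and $\kappa$ the surface gravity, the latter formula being Hawking’s temperature proportional to surface gravity.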

Week 8 (8 Mar) Nicolas Teh (Notre Dame): TBC

MICHAELMAS TERM 2017

Week 2 (October 19) Henrique Gomes, Perimeter Institute, Waterloo.

“New vistas from the many-instant landscape”

Abstract: Quantum gravity has many conceptual problems. Amongst the most well-known is the “Problem of Time”: gravitational observables are global in time, while we would really like to obtain probabilities for processes taking us from an observable at one time to another, later one. Tackling these questions using relationalism will be the preferred strategy during this talk. The ‘relationalist’ approach leads us to shed much redundant information and enables us to identify a reduced configuration space as the arena on which physics unfolds, a goal still beyond our reach in general relativity. Moreover, basing our ontology on this space has far-reaching consequences. One is that it suggests a natural interpretation of quantum mechanics; it is a form of ‘Many-Worlds’ which I have called Many-Instant Bayesianism. Another is that the gravitational reduced configuration space has a rich, highly asymmetric structure which singles out preferred, non-singular and homogeneous initial conditions for a wave-function of the universe, which is yet to be explored.

Week 3 (October 26) Jonathan Halliwell, Imperial College, London

“Comparing conditions for macrorealism: Leggett-Garg inequalities vs no-signalling in time”

Abstract: Macrorealism is the view that a macroscopic system evolving in time possesses definite properties which can be determined without disturbing the future or past state.
I discuss two different types of conditions which were proposed to test macrorealism in the context of a system described by a single dichotomic variable Q.  The Leggett-Garg (LG) inequalities, the most commonly-studied test, are only necessary conditions for macrorealism, but I show that when the four three-time LG inequalities are augmented with a certain set of two-time inequalities also of the LG form, Fine’s theorem applies and these augmented conditions are then both necessary and sufficient. A comparison is carried out with a very different set of necessary and sufficient conditions for macrorealism, namely the no-signaling in time (NSIT) conditions proposed by Brukner, Clemente, Kofler and others, which ensure that all probabilities for Q at one and two times are independent of whether earlier or intermediate measurements are made in a given run, and do not involve (but imply) the LG inequalities. I argue that tests based on the LG inequalities have the form of very weak classicality conditions and can be satisfied, in quantum mechanics, in the face of moderate interference effects, but those based on NSIT conditions have the form of much stronger coherence witness conditions, satisfied only for zero interference. The two tests differ in their implementation of non-invasive measurability so are testing different notions of macrorealism. The augmented LG tests are indirect, entailing a combination of the results of different experiments with only compatible quantities measured in each experimental run, in close analogy with Bell tests, and are primarily tests for macrorealism per se. By contrast the NSIT tests entail sequential measurements of incompatible quantities and are primarily tests for non-invasiveness.
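
For concreteness (a standard form, not spelled out in the abstract): with a dichotomic variable $Q(t_{i}) = \pm 1$ and two-time correlators $C_{ij} = \langle Q(t_{i})Q(t_{j})\rangle$, a representative three-time LG inequality is

\[ C_{12} + C_{23} - C_{13} \leq 1, \]

the other three of the four being obtained by changing signs (any choice with an odd number of minus signs); macrorealism together with non-invasive measurability entails all four, whereas for a two-level quantum system the left-hand side can reach $3/2$.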

Based on the two papers: J. J. Halliwell, Phys. Rev. A 93, 022123 (2016) and Phys. Rev. A 96, 012121 (2017).

Week 4 (November 2) Sam Fletcher, Dept of Philosophy, University of Minnesota.

“Emergence and scale’s labyrinth”

I give precise formal definitions of a hierarchy of emergence concepts for properties described in models of physical theories, showing how some of these concepts are compatible with reductive (but not strictly deductive) relationships between these theories. Besides applying fruitfully to a variety of physical examples, these concepts do not in general track autonomy or novelty along a single simple dimensional scale such as energy, length, or time, but can instead involve labyrinthine balancing relationships between these scales. This complicates the usual view of emergence as relating linearly (or even partially) ordered levels.

Week 5 (November 9) James Ladyman, Dept of Philosophy, University of Bristol

“Why interpret quantum mechanics?”

Abstract: I discuss recent arguments that QM needs no interpretation, and that it should be understood as not representational. I consider how the interpretation of quantum mechanics relates to various kinds of realism, and the fact that the theory is known not to be a complete theory of the world. I tentatively suggest a position that is sceptical about the way the interpretation of quantum mechanics is often undertaken, in particular of the idea of the ontology of the wavefunction, but stops short of regarding quantum states as not representational.

Week 6 (November 16) Hasok Chang, Department of History and Philosophy of Science, University of Cambridge.

“Beyond truth-as-correspondence: Realism for realistic people”

Abstract: In this paper I present arguments against the epistemological ideal of “correspondence”, namely the deeply entrenched notion that empirical truth consists in the match between our theories and the world. The correspondence ideal of knowledge is not something we can actually pursue, for two reasons: it is difficult to discern a coherent sense in which statements correspond to language-independent facts, and we do not have the kind of independent access to the “external world” that would allow us to check the alleged statement–world correspondence. The widespread intuition that correspondence is a pursuable ideal is based on an indefensible kind of externalist referential semantics. The idea that a scientific theory “represents” or “corresponds to” the external world is a metaphor grounded in other human epistemic activities that are actually representational. This metaphor constitutes a serious and well-entrenched obstacle in our attempt to understand scientific practices, and overcoming it will require some disciplined thinking and hard work. On the one hand, we need to continue with real practices of representation in which correspondence can actually be judged; on the other hand, we should stop the illegitimate transfer of intuitions from those practices over to realms in which there are no representations being made and no correspondence to check.

Week 7 (November 23) Alison Fernandes, Department of Philosophy, University of Warwick.

“The temporal asymmetry of chance”

Abstract: The Second Law of Thermodynamics can be derived from the fact that an isolated system at non-maximal entropy is overwhelmingly likely to increase in entropy over time. Such derivations seem to make ineliminable use of objective worldly probabilities (chances). But some have argued that if the fundamental laws are deterministic, there can be no non-trivial chances (Popper, Lewis, Schaffer). Statistical-mechanical probabilities are merely epistemic, or otherwise less real than ‘dynamical’ chances. Many have also thought that chance is intrinsically temporally asymmetric. It is part of the nature of chance that the past is ‘fixed’, and that all non-trivial chances must concern future events. I’ll argue that it is no coincidence that many have held both views: the rejection of deterministic chance is driven by an asymmetric picture of chance in which the past produces the future. I’ll articulate a more deflationary view, according to which more limited temporal asymmetries of chance reflect contingent asymmetries of precisely the kind reflected in the Second Law. The past can be chancy after all.

Week 8 (November 30) Nancy Cartwright, Department of Philosophy, University of Durham and University of California, San Diego

“What are pragmatic trials in medicine good for?”

Abstract: There is widespread call for increasing use of pragmatic trials in both medicine and social science nowadays. These are randomised controlled trials (RCTs) that are administered in ‘more realistic’ circumstances than standard, i.e. with more realistic treatment/programme delivery (e.g. busier, less well-trained doctors/social workers) and a wider range of recipients (e.g. ones that self select into treatment or have ‘co-morbidities’ or are already subject to a number of other interventions that might interfere with the treatment). Pragmatic trial results are supposed to be more readily ‘generalisable’ than results from those with more rigid protocols.
We argue that this is a mistake. Trials, pragmatic or otherwise, can only provide results about those individuals enrolled in the trial. Anything else requires assumptions from elsewhere, and generally strong ones. Based on a common understanding of what causal principles look like in these domains, this talk explains what results can be well warranted by an RCT and warns against the common advice to take the criteria for admission to a trial to be indicative of where else its results may be expected to hold.
Joint work with Sarah Wieten.

TRINITY TERM 2017

15th June 2017 Harvey Brown (Oxford), “QBism: the ineffable reality behind ‘participatory realism’”

Abstract: The recent philosophy of Quantum Bayesianism, or QBism, represents an attempt to solve the traditional puzzles in the foundations of quantum theory by denying the objective reality of the quantum state. Einstein had hoped to remove the spectre of nonlocality in the theory by also assigning an epistemic status to the quantum state, but his version of this doctrine was recently proved to be inconsistent with the predictions of quantum mechanics. In this talk, I present plausibility arguments, old and new, for the reality of the quantum state, and expose what I think are weaknesses in QBism as a philosophy of science.

8th June 2017 David Jackson (Independent), “How to build a unified field theory from one dimension”

Abstract: Motivated in part by Kant’s work on the a priori nature of space and time, and in part by the conceptual basis of general relativity, a physical theory deriving from a single temporal dimension will be presented. We describe how the basic arithmetic composition of the real line, representing the one dimension of time, itself incorporates structures that can be interpreted as underpinning both the geometrical form of space and the physical form of matter. This unification scheme has a number of features in common with a range of physical theories based on ‘extra dimensions’ of space, while being heavily constrained in deriving from a single dimension of time. A proposal for combining general relativity with quantum theory in the context of this approach will be summarised, along with the connections made with empirical observations. In addition to extracts from Kant, further references to sources in the philosophical literature will be cited, in particular with regard to the relation between mathematical objects and physical structures.

1st June 2017 Jo E. Wolff (KCL), “Quantities – Metaphysical Choicepoints”

Abstract: Beginning from the assumption that quantities are (rich) relational structures, I ask what kind of ontology arises from attributing this sort of structure to physical attributes. There are three natural questions to ask about relational structures: what are the relations, what are the relata, and what is the relationship between relata and relations? I argue that for quantities, the choicepoints available in response to these questions are:
1) intrinsicalism vs. structuralism
2) substantivalism vs. anti-substantivalism
3) absolutism vs. comparativism
In the remainder of the talk I sketch which of these choices make for coherent candidate ontologies for quantities.

18th May 2017 Paul Tappenden (Independent), “Quantum fission”.

Abstract: Sixty years on, there is still deep division about Everett’s proposal. Some very well-informed critics take the whole idea to be unintelligible, whilst there are important disagreements amongst supporters. I argue that Everett’s fundamental and radical idea is to do with metaphysics rather than physics: it is to abolish the physically possible/actual dichotomy. I show that the idea is intelligible via a thought experiment involving a novel version of the mind-body relation which I have already used in the defence of semantic internalism.
The argument leads to a fission interpretation of branching rather than a “divergence” interpretation of the sort first suggested by David Deutsch in 1985 and more recently developed in different ways by Simon Saunders, David Wallace and Alastair Wilson. I discuss the two metaphysical problems which fission faces: transtemporal identity and the identification of probability with relative branch measure. And I claim that the Born rule applies transparently if the alternative mind-body relation is accepted. The upshot is that what Wallace calls the Radical View replaces his preferred Conservative View, with the result that there are some disturbing consequences such as inevitable personal survival in quantum Russian roulette scenarios and David Lewis’s suggestion that Everettians should “shake in their shoes”.

11th May 2017 Michela Massimi (Edinburgh), “Perspectival models in contemporary high-energy physics”.

Abstract: In recent times perspectivism has come under attack. Critics have argued that when it comes to modelling, perspectivism is either redundant, or, worse, it leads to a plurality of incompatible or even inconsistent models about the same target system. In this paper, I attend to two tasks. First, I try to get clear about the charge of metaphysical inconsistency that has been levelled against perspectivism and identify some key assumptions behind it. Second, I propose a more positive role for perspectivism in some modelling practices by identifying a class of models, which I call “perspectival models”. I illustrate this class of models with examples from contemporary LHC physics.

4th May 2017 Tushar Menon (Oxford), “Affine Balance: Algebraic functionalism and the ontology of spacetime”.

Abstract: Our two most empirically successful theories, quantum mechanics and general relativity, are at odds with each other when it comes to several foundational issues. The deepest of these issues is also, perhaps, the easiest to grasp intuitively: what is spacetime? Most attempts at theories of quantum gravity do not make it obvious which degrees of freedom are spatiotemporal. In non-general relativistic theories, the matter/spacetime distinction is adequately tracked by the dynamical/non-dynamical object distinction. General relativity is different, because spacetime, if taken to be jointly, but with some redundancy, represented by a smooth manifold and a metric tensor field, is not an immutable, inert, external spectator. Our dynamical/non-dynamical distinction appears no longer to do the work for us; we appear to need something else. In the first part of this talk, I push back against the idea that the dynamical/non-dynamical distinction is doomed. I motivate a more general algebraic characterisation of spacetime based on Eleanor Knox’s spacetime functionalism, and the Helmholtzian notion of free mobility. I argue that spacetime is most usefully characterised by its (local) affine structure.

In the second part of this talk, I consider the debate between Brown and Pooley on the one hand and Janssen and Balashov on the other, about the direction of the arrow of explanation in special relativity. Characterising spacetime using algebraic functionalism, I demonstrate that only Brown’s position is neutral on the substantivalism–relationalism debate. This neutrality may prove to be highly desirable in an interpretation of spacetime that one hopes will generalise to theories of quantum gravity—it seems like poor practice to impose restrictions on an acceptable quantum theory of spacetime based on metaphysical prejudices or approximately true effective field theories. The flexibility of Brown’s approach affords us a theory-dependent a posteriori identification of spacetime, and arguably counts in its favour. I conclude by gesturing towards how this construction might be useful in extending Brown’s view to theories of quantum gravity.

27th April 2017 Peter Hylton (UIC) “Analyticity, yet again”.

Abstract: Although Quine became famous for having rejected the analytic-synthetic distinction, he actually accepted it for the last quarter century of his philosophical career. Yet his doing so makes no difference to his other views. In this talk, I press the question ‘Why not?’, in the hope of gaining insight into Quine’s views, and especially his differences with Carnap. I contrast Quine’s position not only with Carnap’s but also with those of Putnam, as represented in his paper ‘The Analytic and the Synthetic’. Putnam there puts forward an answer to the ‘Why not?’ question which is, I think, fairly widely accepted, and perhaps taken to be Quine’s answer as well—wrongly so taken, I claim.

9th March 2017 Michael Hicks (Physics, Oxford), “Explanatory (a)symmetries and Humean laws”.

Abstract: Recently, Lange (2009) has argued that some physical principles are explanatorily prior to others. Lange’s main examples are symmetry principles, which he argues explain both conservation laws (through Noether’s Theorem) and features of dynamical laws (for example, the Lorentz invariance of QFT). Lange calls these principles “meta-laws” and claims that his account of laws, which is built around the counterfactual stability of groups of statements, can capture the fact that they govern or constrain first-order laws, whereas other views, principally Humean views, can’t. After reviewing the problem Lange presents, I’ll show how the explanatory asymmetry between laws he describes follows naturally from a Humean understanding of laws as particularly informative summaries. The Humean should agree with Lange that symmetry principles are explanatorily prior to both conservation laws and dynamical theories like QFT; however, I’ll argue that Lange is wrong to consider these principles “meta-laws” which in some way govern first-order laws, and I’ll show that on the Humean view, the explanation of these two sorts of laws from symmetry principles is importantly different.

2nd March 2017 Ronnie Hermens (Philosophy, Groningen), “How ψ-ontic are ψ-ontic models?”.

Abstract: Ψ-ontology theorems show that in any ontic model that is able to reproduce the predictions of quantum mechanics, the quantum state must be encoded by the ontic state. Since the ontic state determines what is real, and it determines the quantum state, the quantum state must be real. But how does this precisely work in detail, and what does the result imply for the status of the quantum state in ψ-ontic models? As a test case scenario I will look at the ontic models of Meyer, Kent and Clifton. Since these models are able to reproduce the predictions of quantum mechanics, they must be ψ-ontic. On the other hand, quantum states play no role whatsoever in the construction of these models. Thus finding out which ontic state belongs to which quantum state is a non-trivial task. But once that is done, we can ask: does the quantum state play any explanatory role in these models, or is the fact that they are ψ-ontic a mere mathematical nicety?
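
In the standard (Harrigan–Spekkens) terminology that ψ-ontology theorems rely on, an ontic model assigns to each pure quantum state ψ a probability distribution μ_ψ(λ) over ontic states λ. For orientation, the key definition can be stated in one line: the model is ψ-ontic if distinct pure states never overlap,
\[ \psi \neq \phi \;\Longrightarrow\; \int \min\big(\mu_\psi(\lambda),\,\mu_\phi(\lambda)\big)\,d\lambda = 0, \]
so that the ontic state λ fixes the quantum state; otherwise the model is ψ-epistemic.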

23rd February 2017 Simon Saunders (Philosophy, Oxford), “Quantum monads”.

Abstract: The notion of object (and with it ontology) in the foundations of quantum mechanics has been made both too easy and too hard: too easy, because particle distinguishability, and with it the use of proper names, is routinely assumed; too hard, because a number of metaphysical demands have been made of it (for example, in the notion of ‘primitive ontology’ in the writings of Shelly Goldstein and his collaborators). The measurement problem is also wrapped up with it. I shall first give an account of quantum objects adequate to the thin sense required of quantification theory (in the tradition of Frege and Quine); I then consider an alternative, much thicker notion that is strongly reminiscent of Leibniz’s monadology. Both apply to the Everett interpretation and to dynamical collapse theories (sans primitive ontology).

16th February 2017 Steven Balbus (Physics, Oxford), “An anthropic explanation for the nearly equal angular diameters of the Sun and Moon”.

Abstract: The very similar angular sizes of the Sun and Moon as subtended at the Earth are generally portrayed as coincidental. In fact, close angular size agreement is a direct and inevitable mathematical consequence of even roughly comparable lunar and solar tidal amplitudes. I will argue that the latter was a biological imperative for the evolution of land vertebrates and can be understood on the basis of anthropic arguments. Comparable tidal amplitudes from two astronomical sources, with close but distinct frequencies, lead to strongly modulated forcing: in essence spring and neap tides. The appearance of this surely very rare tidal pattern must be understood in the context of the paleogeography and biology of the Late Devonian period. Two great land masses were separated by a broad opening tapering to a very narrow, shallow-sea strait. The combination of this geography and modulated tidal forces would have been conducive to forming a rich inland network of shallow but transient (and therefore isolating) tidal pools at an epoch when fishy tetrapods were evolving and acquiring land navigational skills. I will discuss the recent fossil evidence showing that important transitional species lived in habitats strongly influenced by intermittent tides. It may be that any planet capable of harbouring a contemplative species displays a moon in its sky very close in angular diameter to that of its sun.
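
The core scaling behind the claimed consequence can be sketched in a few lines (a back-of-the-envelope version, not the speaker’s full argument). The tidal acceleration at Earth due to a body of mass M, radius R and distance d is of order
\[ a_{\mathrm{tide}} \sim \frac{2GM R_\oplus}{d^{3}}, \qquad M = \tfrac{4}{3}\pi\rho R^{3}, \qquad \theta \approx \frac{2R}{d}, \]
so that \( a_{\mathrm{tide}} \propto \rho\,\theta^{3} \), where θ is the body’s angular diameter. Hence
\[ \frac{a_{\mathrm{Moon}}}{a_{\mathrm{Sun}}} \approx \frac{\rho_{\mathrm{Moon}}}{\rho_{\mathrm{Sun}}}\left(\frac{\theta_{\mathrm{Moon}}}{\theta_{\mathrm{Sun}}}\right)^{3}, \]
and roughly equal tidal amplitudes force the angular diameters to agree to within the cube root of the density ratio.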

9th February 2017 Alastair Wilson (Philosophy, Birmingham), “How multiverses might undercut the fine-tuning argument”.

Abstract: In the context of the probabilistic fine-tuning argument that moves from the fragility of cosmological parameters with respect to life to the existence of a divine designer, appealing to the existence of a multiverse has in general seemed problematically ad hoc. The situation looks rather different, though, if there is independent evidence from physics for a multiverse. I will argue that independently-motivated multiverses can be undercutting defeaters for the fine-tuning argument; but whether the argument is indeed undercut still depends on open questions in fundamental physics and cosmology. I will also argue that Everettian quantum mechanics opens up new routes to undercutting the fine-tuning argument, although by itself it is insufficient to do so.

26th January 2017 Antony Eagle (Philosophy, Adelaide), “Quantum location”.

Abstract: Many metaphysicians are committed to the existence of a location relation between material objects and spacetime, useful in characterising debates in the metaphysics of persistence and time, particularly in the context of trying to map ordinary objects into models of relativity theory. Relatively little attention has been paid to location in quantum mechanics, despite the existence of a position observable in QM being one of the few things metaphysicians know about it. I want to explore how the location relation(s) postulated by metaphysicians might be mapped onto the framework of QM, with particular reference to the idea that there might be such a thing as being indeterminately located.

19th January 2017 Emily Adlam (DAMTP, Cambridge), “Quantum mechanics and global determinism”.

Abstract: We propose that the information-theoretic features of quantum mechanics are perspectival effects which arise because experiments on local variables can only uncover a certain subset of the correlations exhibited by an underlying deterministic theory. We show that the no-signalling principle, information causality, and strong subadditivity can be derived in this way; we then use our approach to propose a new resolution of the black hole information paradox.
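
For concreteness, the no-signalling principle mentioned here can be stated in the usual correlation-box form (a standard formulation, not specific to the speaker’s framework): for measurement settings x, y and outcomes a, b,
\[ \sum_{b} p(a,b\,|\,x,y) = p(a\,|\,x) \quad \text{for all } y, \]
and symmetrically for the other party, so that the marginal statistics on one side are independent of the distant setting.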

24 Nov 2016 David Glick (Philosophy, Oxford), “Swapping Something Real: Entanglement Swapping and Entanglement Realism”.

Abstract: Experiments demonstrating entanglement swapping have been alleged to challenge realism about entanglement. Seevinck (2006) claims that entanglement “cannot be considered ontologically robust” while Healey (2012) claims that entanglement swapping “undermines the idea that ascribing an entangled state to quantum systems is a way of representing some new, non-classical, physical relation between them.” My aim in this paper is to show that realism is not threatened by the possibility of entanglement swapping, but rather, should be informed by the phenomenon. I argue—expanding the argument of Timpson and Brown (2010)—that ordinary entanglement swapping cases present no new challenges for the realist. With respect to the delayed-choice variant discussed by Healey, I claim that there are two options available to the realist: (a) deny these are cases of genuine swapping (following Egg (2013)) or (b) allow for the existence of entanglement relations between timelike separated regions. This latter option, while radical, is not incoherent and has been suggested in quite different contexts. While I stop short of claiming that the realist must take this option, doing so allows one to avoid certain costs associated with Egg’s account. I conclude by noting several important implications of entanglement swapping for how one thinks of entanglement relations more generally.

17 Nov 2016 Jim Weatherall (UC Irvine), “On Stuff: The Field Concept in Classical Physics”.

Abstract: Discussions of physical ontology often come down to two basic options. Either the basic physical entities are particles, or else they are fields. I will argue that, in fact, it is not at all clear what it would mean to say that the world consists of fields. Speaking classically (i.e., non-quantum-ly), there are many different sorts of thing that go by the name “field”, each with different representational roles. Even among those that have some claim to being “fundamental” in the appropriate sense, it does not seem that a single interpretational strategy could apply in all cases. I will end by suggesting that standard strategies for constructing quantum theories of fields are not sensitive to the different roles that “fields” can play in classical physics, which adds a further difficulty to interpreting quantum field theory. Along the way, I will say something about an old debate in the foundations of relativity theory, concerning whether the spacetime metric is a “geometrical” or “physical” field. The view I will defend is that the metric is much like the electromagnetic field: geometrical!

10 Nov 2016 Lina Jansson (Nottingham), ‘Newton’s Methodology Meets Humean Supervenience about Laws of Nature’.

Abstract: Earman and Roberts [2005a,b] have argued for Humean supervenience about laws of nature based on an argument from epistemic access. In rough outline, their argument relies on the claim that if Humean supervenience is false, then we cannot have any empirical evidence in favour of taking a proposition to be a law of nature as opposed to merely accidentally true. I argue that Newton’s methodology in the Principia provides a counterexample to their claim. In particular, I argue that the success or failure of chains of subjunctive reasoning is empirically accessible, and that this provides a way of gaining empirical evidence for or against a proposition being a law of nature (even under the assumption that Humean supervenience fails).

27 Oct 2016 Ryan Samaroo (Bristol), “The Principle of Equivalence is a Criterion of Identity”.

Abstract: In 1907 Einstein had an insight into gravitation that he would later refer to as ‘the happiest thought of my life’. This is the hypothesis, roughly speaking, that bodies in free fall do not ‘feel’ their own weight. This is what is formalized in ‘the equivalence principle’. The principle motivated a critical analysis of the Newtonian and 1905 inertial frame concepts, and it was indispensable to Einstein’s argument for a new concept of inertial motion. A great deal has been written about the equivalence principle. Nearly all of this work has focused on the content of the principle, but its methodological role has been largely neglected. A methodological analysis asks the following questions: what kind of principle is the equivalence principle? What is its role in the conceptual framework of gravitation theory? I maintain that the existing answers are unsatisfactory and I offer new answers.

20 Oct 2016 Niels Martens (Oxford, Philosophy), “Comparativism about Mass in Newtonian Gravity”.

Abstract: Absolutism about mass asserts that facts about mass ratios are true in virtue of intrinsic masses. Comparativism about mass denies this. I present and dismiss Dasgupta’s (2013) analysis of his recent empirical adequacy argument in favour of comparativism—in the context of Newtonian Gravity. I develop and criticise two new versions of comparativism. Regularity Comparativism forms a liberalisation of Huggett’s Regularity Relationalism (2006), which uses the Mill-Ramsey-Lewis Best Systems Account to respond to Newton’s bucket argument in the analogous relationalism-substantivalism debate. To the extent that this approach works at all, I argue that it works too well: it throws away the massive baby with the bathwater. A Machian-flavoured version of comparativism is more promising: although it faces no knock-down objection, it is not without its own problems.

13 Oct 2016 David Wallace (USC, Philosophy), “Fundamental and emergent geometry in Newtonian gravity”.

Abstract: Using as a starting point recent and apparently incompatible conclusions by Simon Saunders (Philosophy of Science 80 (2013) pp.22-48) and Eleanor Knox (British Journal for the Philosophy of Science 65 (2014) pp.863-880), I revisit the question of the correct spacetime setting for Newtonian physics. I argue that understood correctly, these two theories make the same claims both about the background geometry required to define the theory, and about the inertial structure of the theory. In doing so I illustrate and explore in detail the view — espoused by Knox, and also by Harvey Brown (Physical Relativity, OUP 2005) — that inertial structure is defined by the dynamics governing subsystems of a larger system. This clarifies some interesting features of Newtonian physics, notably (i) the distinction between using the theory to model subsystems of a larger whole and using it to model complete Universes, and (ii) the scale-relativity of spacetime structure.

19 May 2016 Eleanor Knox (KCL, Philosophy), “Novel Explanation and the Emergence of Phonons”.

Abstract: Discussions of emergence in the philosophy of physics literature often emphasise the role of asymptotic limits in understanding the novelty of emergent phenomena while leaving the nature of the novelty in question unexplored. I’ll put forward an account of explanatory novelty that can accommodate examples involving asymptotic limits, but also applies in other cases. The emergence of phonons in a crystal lattice will provide an example of a description with novel explanatory power that does not depend on asymptotic limits for its novelty. The talk is based on joint work with Alex Franklin.

12th May 2016 Yvonne Geyer (Oxford, Maths), “Rethinking Quantum Field Theory: Traces of String Theory in Yang-Mills and Gravity”.

Abstract: A multitude of recent developments point towards the need for a different understanding of Quantum Field Theories. After a general introduction, I will focus on one specific example involving one of the most natural and fundamental observables: the scattering amplitude. In Yang-Mills theory and Einstein gravity, scattering amplitudes exhibit a simplicity that is completely obscured by the traditional approach to Quantum Field Theories, and that is remarkably reminiscent of the worldsheet models describing string theory. In particular, this implies that – without additional input – the theories describing our universe, Yang-Mills theory and gravity, exhibit traces of string theory.

28th April 2016 Roman Frigg (LSE, Philosophy), “Further Rethinking Equilibrium”.

Abstract: In a recent paper we proposed a new definition of Boltzmannian equilibrium and showed that in the case of deterministic dynamical systems the new definition implies the standard characterisation but without suffering from its well-known problems and limitations. We now generalise this result to stochastic systems and show that the same implication holds. We then discuss an existence theorem for equilibrium states and illustrate with a number of examples how the theorem works. Finally, first steps towards understanding the relation between Boltzmannian and Gibbsian equilibrium are made.

25 Feb Stephen J. Blundell (Oxford, Physics), ‘Emergence, causation and storytelling: condensed matter physics and the limitations of the human mind’

Abstract

The physics of matter in the condensed state is concerned with problems in which the number of constituent particles is vastly greater than can be comprehended by the human mind. The physical limitations of the human mind are fundamental and restrict the way in which we can interact with and learn about the universe. This presents challenges for developing scientific explanations that are met by emergent narratives, concepts and arguments that have a non-trivial relationship to the underlying microphysics. By examining examples within condensed matter physics, and also from cellular automata, I show how such emergent narratives efficiently describe elements of reality.

18 Feb Jean-Pierre Llored (University of Clermont-Ferrand), ‘From quantum physics to quantum chemistry’.

Abstract:

The first part, which is mainly anthropological, summarizes the results of a survey that we carried out in several research laboratories in 2010. Our aims were to understand what quantum chemists currently do, what kind of questions they ask, and what kind of problems they have to face when creating new theoretical tools both for understanding chemical reactivity and predicting chemical transformations.

The second part, which is mainly historical, highlights the philosophical underpinnings that structure the development of quantum chemistry from 1920 to the present day. In so doing, we will discuss chemical modeling in quantum chemistry, and the different strategies used to define molecular features in terms of atomic ones and the molecular surroundings at the same time. We will show how computers and new laboratories emerged simultaneously, and reshaped the culture of quantum chemistry. This part goes on to describe how the debate between ab initio and semi-empirical methods turned out to be highly controversial because of underlying scientific and metaphysical assumptions about, for instance, the nature of the relationship between science and the possibility of human knowledge reaching a complete description of the world.

The third and last part is about the philosophical implications for the study of quantum chemistry and that of ‘quantum sciences’ at large. It insists on the fact that the history of quantum chemistry is also a history of the attempts of chemists to establish the autonomy of their theories and methods with respect to physical, mathematical, and biological theories. According to this line of argument, chemists gradually proposed new concepts in order to circumvent the impossibility of performing full analytical calculations and to make the language of classical structural chemistry and that of quantum chemistry compatible. Among other topics, we will query the meaning of a chemical bond, the impossibility of deducing a molecular shape from the Schrödinger equation, the way quantum chemistry is invoked to explain the periodic table, and the possibility of going beyond the Born-Oppenheimer approximation. We would like to show that quantum chemistry is neither physics nor chemistry nor applied mathematics, and that philosophical debates which turned out to be relevant in quantum physics are not necessarily so in quantum chemistry, whereas other philosophical questions arise…

11th Feb David Wallace (Oxford, Philosophy), ‘Who’s afraid of coordinate systems?’.

Abstract:

Coordinate-based approaches to physical theories remain standard in mainstream physics but are largely eschewed in foundational discussion in favour of coordinate-free differential-geometric approaches. I defend the conceptual and mathematical legitimacy of the coordinate-based approach for foundational work. In doing so, I provide an account of the Kleinian conception of geometry as a theory of invariance under symmetry groups; I argue that this conception continues to play a very substantial role in contemporary mathematical physics and indeed that supposedly ‘coordinate-free’ differential geometry relies centrally on this conception of geometry. I discuss some foundational and pedagogical advantages of the coordinate-based formulation and briefly connect it to some remarks of Norton on the historical development of geometry in physics during the establishment of the general theory of relativity.

21 Jan Philipp Roser (Clemson), ‘Time and York time in quantum theory’.

Abstract:

Classical general relativity has no notion of a physically meaningful time parameter and one is free to choose one’s coordinates at will. However, when attempting to quantise the theory this freedom leads to difficulties, the notorious `problem of time’ of canonical quantum gravity. One way to overcome this obstacle is the identification of a physically fundamental time parameter. Interestingly, although purely aesthetic at the classical level, different choices of time parameter may in principle lead to different quantum phenomenologies, as I will illustrate with a simple model. This means that an underlying physically fundamental notion of time may (to some extent) be detectable via quantum effects.

For various theoretical reasons one promising candidate for a physical time parameter is `York time’, named after James York and his work on the initial-value problem of general relativity, where its importance first became apparent. I will derive the classical and quantum dynamics with respect to York time for certain cosmological models and discuss some of the unconventional structural features of the resulting quantum theory.

3 Dec Thomas Moller-Nielsen (Oxford), “Symmetry and the Interpretation of Physical Theories”

Abstract:

In this talk I examine two (putative) ways in which symmetries can be used as tools for physical theory interpretation. First, I examine the extent to which symmetries can be used as a guide to a theory’s ideology: that is, as a means of determining which quantities are real, according to the theory. Second, I examine the extent to which symmetries can be used as a guide to a theory’s ontology: that is, as a means of determining which objects are real, according to the theory. I argue that symmetries can only legitimately be used in the first, but not the second, sense.

26 Nov Ellen Clarke (All Souls), “Biological Ontology”.

Abstract:

All sciences invent kind concepts: names for categories that gather particulars together according to their possession of some scientifically interesting properties. But kind concepts must be well-motivated: they need to do some sort of work for us. I show how to define one sort of scientific concept – that of the biological individual, or organism – so that it does plenty of work for biology. My view understands biological individuals as defined by the process of evolution by natural selection. I will engage in some speculation about how the situation compares in regard to other items of scientific ontology.

19 Nov Dan Bedingham (Oxford), “Dynamical collapse of the wavefunction and relativity”

Abstract:

When a collapse of the wave function takes place it has an instantaneous effect over all space. One might then assume that a covariant description is not possible, since a collapse whose effects are simultaneous in one frame of reference would not have simultaneous effects in a boosted frame. I will show, however, that in fact a consistent covariant picture emerges in which the collapsing wave function depends on the choice of foliation of spacetime, but that suitably defined local properties are unaffected by this choice. The formulation of a covariant description is important for models attempting to describe the collapse of the wave function as a dynamical process. This is a very direct approach to solving the quantum measurement problem: it involves simply giving the wave function the stochastic dynamics that it has in practice. I will present some proposals for relativistic versions of dynamical collapse models.

5 November Joseph Melia (Oxford) “Haecceitism, Identity and Indiscernibility: (Mis-)Uses of Modality in the Philosophy of Physics”

Abstract:

I examine a number of arguments involving modality and identity in the Philosophy of Physics. In particular, (a) Wilson’s use of Leibniz’ law to argue for emergent entities; (b) the implications of anti-haecceitism for the Hole argument in GR and QM; (c) the proposal to “define”, “ground”, or “account for” identity via some version of the Principle of the Identity of Indiscernibles or the Hilbert-Bernays formula.

Against (a) I argue that familiar problems with applications of Leibniz’ law in modal contexts block the argument for the existence of emergent entities;

On (b), I argue that (i) there are multiple and incompatible definitions of haecceitism at play in the literature; (ii) that, properly understood, haecceitism *is* a plausible position; indeed, even supposedly mysterious haecceities do not warrant the criticism of obscurity they have received; (iii) we do better to solve the Hole argument by other means than a thesis about the range and variety of possibilities.

On (c), I argue that recent attempts to formulate a principle of PII fit to serve as a definition of identity are either trivially true, or must draw distinctions between different kinds of properties that are problematic: better to accept identity as primitive.

Some relevant papers/helpful reading (I will not, of course, assume familiarity with these papers)
J. Ladyman, ‘On the Identity and Diversity of Objects in a Structure’, Proceedings of the Aristotelian Society, Supplementary Volume (2007).
D. Lewis, On the Plurality of Worlds, ch. 4 (1986).
O. Pooley, ‘Points, Particles and Structural Realism’, in Rickles, French and Saatsi (eds.), The Structural Foundations of Quantum Gravity (2006).
S. Saunders, ‘Are Quantum Particles Objects?’, Analysis (2006).
J. Wilson, ‘Non-Reductive Physicalism and Degrees of Freedom’, BJPS (2010).

29 October Chiara Marletto (Oxford, Materials), “Constructor theory of information (and its implications for our understanding of quantum theory)”.

Abstract:

Constructor Theory is a radically new mode of explanation in fundamental physics. It demands a local, deterministic description of physical reality – expressed exclusively in terms of statements about which tasks are possible, which are impossible, and why. This mode of explanation has recently been applied to provide physical foundations for the theory of information – expressing, as conjectured physical principles, the regularities of the laws of physics necessary for there to be what has so far been informally called ‘information’. In constructor theory, one also expresses exactly the relation between classical information and so-called ‘quantum information’ – showing how properties of the latter arise from a single, constructor-theoretic constraint. This provides a unified conceptual basis for the quantum theory of information (which was previously lacking one qua theory of information). Moreover, the emergence of quantum-information-like properties in a deterministic, local framework also has implications for the understanding of quantum theory, and of its successors.

22 October Bryan Roberts (LSE) “The future of the weakly interacting arrow of time”.

Abstract:

This talk discusses the evidence for time asymmetry in fundamental physics. The main aim is to propose some general templates characterising how time asymmetry can be detected among weakly interacting particles. We will then step back and evaluate how this evidence bears on time asymmetry in future physical theories beyond the standard model.

15 October Oscar Dahlsten (Oxford Physics) “The role of information in work extraction”.

Abstract:

Since Maxwell’s daemon it has been known that extra information can give more work. I will discuss how this can be made concrete and quantified. I will focus on so-called single-shot statistical mechanics. There one can derive expressions for the maximum work one can extract from a system given one’s information. Only one property of the state one assigns to the system matters: the entropy. There are subtleties, including which entropy to use. I will also discuss the relation to fluctuation theorems, and our recent paper on realising a photonic Maxwell’s daemon.
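
As orientation for the quantitative claims, the textbook Szilard-engine result is that one bit of known information about a system in contact with a heat bath at temperature T allows extraction of at most
\[ W_{\max} = k_{B} T \ln 2 . \]
Single-shot analyses replace the average (von Neumann) entropy by one-shot entropies; schematically, results in this area take the form \( W \lesssim k_{B} T \ln 2\,\big(n - H_{\max}^{\varepsilon}(\rho)\big) \) for an n-qubit system in state ρ, where \(H_{\max}^{\varepsilon}\) is a smoothed max-entropy (this is an indicative form only; see the references below for the precise statements).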

Some references (I will certainly not assume you have looked at them):
arXiv:0908.0424 The work value of information, Dahlsten, Renner, Rieper and Vedral
arXiv:1009.1630 The thermodynamic meaning of negative entropy, del Rio, Aaberg, Renner, Dahlsten and Vedral
arXiv:1207.0434 A measure of majorisation emerging from single-shot statistical mechanics, Egloff, Dahlsten, Renner, Vedral
arXiv:1409.3878 Introducing one-shot work into fluctuation relations, Yunger Halpern, Garner, Dahlsten, Vedral
arXiv:1504.05152 Equality for worst-case work at any protocol speed, Dahlsten, Choi, Braun, Garner, Yunger Halpern, Vedral
arXiv:1510.02164 Photonic Maxwell’s demon, Vidrighin, Dahlsten, Barbieri, Kim, Vedral and Walmsley

Thursday 15 October, 16:30. Lecture Room, Radcliffe Humanities, Woodstock Rd.

11 June Tim Pashby (University of Southern California)
‘Schroedinger’s Cat: It’s About Time (Not Measurement)’

Abstract: I argue for a novel resolution of Schroedinger’s cat paradox by paying particular attention to the role of time and tense in setting up the problem. The quantum system at the heart of the paradoxical situation is an unstable atom, primed for indeterministic decay at some unknown time. The conventional account gives probabilities for the result of instantaneous measurements and leads to the unacceptable conclusion that the cat can neither be considered alive nor dead until the moment the box is opened (at a time of the experimenter’s choosing). To resolve the paradox I reject the status of the instantaneous quantum state as `truthmaker’ and show how a quantum description of the situation can be given instead in terms of time-dependent chance propositions concerning the time of decay, without reference to measurement.

The conclusions reached in the case of Schroedinger’s cat may be generalized throughout quantum mechanics with the means of event time observables (interpreted as conditional probabilities), which play the role of the time of decay for an arbitrary system. Conventional quantum logic restricts its attention to the lattice of projections, taken to represent possible properties of the system. I argue that event time observables provide a compelling reason to look beyond the lattice of projections to the algebra of effects, and suggest an interpretation in which propositions are made true by events rather than properties. This provides the means to resolve the Wigner’s friend paradox along similar lines.

4th June Neil Dewar (Oxford)
‘Symmetry and Interpretation: or, Translations and Translations’

Abstract: There has been much discussion of whether we should take (exact) symmetries of a physical theory to relate physically equivalent states of affairs, and – if so – what it is that justifies us in so doing. I argue that we can understand the propriety of this move in essentially semantic terms: namely, by thinking of a symmetry transformation as a means of translating a physical theory into itself. To explain why symmetry transformations have this character, I’ll first look at how notions of translation and definition are dealt with in model theory. Then, I’ll set up some analogies between the model-theoretic formalism and the formalism of differential equations, and show how the relevant analogue of self-translation is a symmetry transformation. I conclude with some remarks on how this argument bears on debates over theoretical equivalence.

28th May George Ellis (Cape Town)
‘On the crucial role of top-down causation in complex systems’

Abstract: It will be suggested that causal influences in the real world occurring on evolutionary, developmental, and functional timescales are characterized by a combination of bottom-up and top-down effects. Digital computers give very clear exemplars of how this happens. There are five distinct classes of top-down effects, the key one leading to the existence of complex systems being adaptive selection. The issue of how there can be causal openness at the bottom allowing this to occur will be discussed. The case will be made that while bottom-up self-assembly can attain a certain degree of complexity, truly complex systems such as life can only come into being if top-down processes come into play in addition to bottom-up processes. They allow genuine emergence to occur, based on the multiple realisability at lower levels of higher-level structures and functions.

21 May
Francesca Vidotto (Radboud University, Nijmegen)

‘Relational ontology from General Relativity and Quantum Mechanics’

Abstract: Our current most reliable physical theories, General Relativity and Quantum Mechanics, both point towards a relational description of reality. General Relativity builds up the spacetime structure from the notion of contiguity between dynamical objects. Quantum Mechanics describes how physical systems affect one another in the course of interactions. Only local interactions define what exists, and there is no meaning in talking about entities except in terms of local interactions.

14 May
Harvey Brown (Oxford) and Chris Timpson (Oxford)

‘Bell on Bell’s theorem: the changing face of nonlocality’

Between 1964 and 1990, the notion of nonlocality in Bell’s papers underwent a profound change as his nonlocality theorem gradually became detached from quantum mechanics, and referred to wider probabilistic theories involving correlations between separated beables. The proposition that standard quantum mechanics is itself nonlocal (more precisely, that it violates ‘local causality’) became divorced from the Bell theorem per se from 1976 on, although this important point is widely overlooked in the literature. In 1990, the year of his death, Bell would express serious misgivings about the mathematical form of the local causality condition, and leave ill-defined the issue of the consistency between special relativity and violation of the Bell-type inequality. In our view, the significance of the Bell theorem, both in its deterministic and stochastic forms, can only be fully understood by taking into account the fact that a fully Lorentz-covariant version of quantum theory, free of action-at-a-distance, can be articulated in the Everett interpretation.

7 May Mauro Dorato (Rome)
‘The passage of time between physics and psychology ‘

Abstract: The three main aims of my paper are:

1. To defend a minimalistic theory of objective becoming that takes STR and GTR at face value;
2. To bring to bear relevant neuro-psychological data in support of 1;
3. To combine 1 and 2 to try to explain, with as little metaphysics as possible, three key features of our experience of passage, namely:
(a) our untutored belief in a cosmic extension of the now (leading to the postulation of privileged frames and presentism);
(b) the becoming more past of the past (leading to Skow’s 2009 moving spotlight and to branching spacetimes);
(c) the fact that our actions clearly seem to bring new events into being (Broad 1923, Tooley 1997, Ellis 2014).

26 February James Ladyman (Bristol)

“Do local symmetries have ‘direct empirical consequences’?”

Abstract: Hilary Greaves and David Wallace argue that, contrary to the widespread view of philosophers of physics, local symmetries have direct empirical consequences. They do this by showing that there are `Galileo’s Ship Scenarios’ in theories with local symmetries. In this paper I will argue that the notion of `direct empirical consequences’ is ambiguous and admits of two kinds of precisification. Greaves and Wallace do not purport to show that local symmetries have empirical consequences in the stronger of the two senses, but I will argue that it is the salient one. I will then argue that they are right to focus on Galileo’s Ship Scenarios, and I will offer a characterisation of the form of such arguments from symmetries to empirical consequences. I will discuss how various examples relate to this template, and finally offer a new argument in defence of the orthodoxy that direct empirical consequences do not depend on local symmetries.

19 February David Wallace (Oxford):

“Fields as Bodies: a unified treatment of spacetime and gauge symmetry”

Abstract: Using the parametrised representation of field theory (in which the location in spacetime of a part of a field is itself represented by a map from the base manifold to Minkowski spacetime) I demonstrate that in both local and global cases, internal (Yang-Mills-type) and spacetime (Poincare) symmetries can be treated precisely on a par, so that gravitational theories may be regarded as gauge theories in a completely standard sense.

12 February Erik Curiel (Munich)

“Problems with the interpretation of energy conditions in general relativity”

An energy condition, in the context of a wide class of spacetime theories (including general relativity), is, crudely speaking, a relation one demands the stress-energy tensor of matter satisfy in order to try to capture the idea that “energy should be positive”. The remarkable fact I will discuss is that such simple, general, almost trivial-seeming propositions have profound and far-reaching import for our understanding of the structure of relativistic spacetimes. It is therefore especially surprising when one also learns that we have no clear understanding of the nature of these conditions, what theoretical status they have with respect to fundamental physics, what epistemic status they may have, when we should and should not expect them to be satisfied, and even in many cases how they and their consequences should be interpreted physically. Or so I shall argue, by a detailed analysis of the technical and conceptual character of all the standard conditions used in physics today, including examination of their consequences and the circumstances in which they are believed to be violated in the actual universe.
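
Two standard examples give the flavour of the conditions at issue (stated here just for orientation): the weak energy condition requires
\[ T_{ab}\,\xi^{a}\xi^{b} \geq 0 \quad \text{for all timelike } \xi^{a}, \]
i.e. every observer measures a non-negative energy density, while the null energy condition requires \( T_{ab}k^{a}k^{b} \geq 0 \) for all null vectors \( k^{a} \).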

22nd January Jonathan Halliwell (Imperial College London):

“Negative Probabilities, Fine’s Theorem and Quantum Histories”

Abstract: Many situations in quantum theory and other areas of physics lead to quasi-probabilities which seem to be physically useful but can be negative. The interpretation of such objects is not at all clear. I argue that quasi-probabilities naturally fall into two qualitatively different types, according to whether their non-negative marginals can or cannot be matched to a non-negative probability. The former type, which we call viable, are qualitatively similar to true probabilities, but the latter type, which we call non-viable, may not have a sensible interpretation. Determining the existence of a probability matching given marginals is a non-trivial question in general. In simple examples, Fine’s theorem indicates that inequalities of the Bell and CHSH type provide criteria for its existence. A simple proof of Fine’s theorem is given. The results have consequences for the linear positivity condition of Goldstein and Page in the context of the histories approach to quantum theory. Although it is a very weak condition for the assignment of probabilities it fails in some important cases where our results indicate that probabilities clearly exist. Some implications for the histories approach to quantum theory are discussed.
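
For reference, the CHSH inequality alluded to here reads
\[ \big| E(a,b) + E(a,b') + E(a',b) - E(a',b') \big| \leq 2, \]
where the E’s are correlation functions for pairs of measurement settings, and Fine’s theorem states that a joint probability distribution over all four observables reproducing the given pairwise marginals exists if and only if the CHSH inequalities (in all their setting permutations) are satisfied.
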
4 December : Tony Sudbery

“The logic of the future in the Everett-Wheeler understanding of quantum theory”

Abstract: I discuss the problems of probability and the future in the Everett-Wheeler understanding of quantum theory. To resolve these, I propose an understanding of probability arising from a form of temporal logic: the probability of a future-tense proposition is identified with its truth value in a many-valued and context-dependent logic. I construct a lattice of tensed propositions, with truth values in the interval [0, 1], and derive logical properties of the truth values given by the usual quantum-mechanical formula for the probability of histories. I argue that with this understanding, Everett-Wheeler quantum mechanics is the only form of scientific theory that truly incorporates the perception that the future is open.
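
The ‘usual quantum-mechanical formula for the probability of histories’ is presumably the standard decoherent-histories expression: for a history α specified by a time-ordered sequence of Heisenberg-picture projectors, the class operator and probability are
\[ C_{\alpha} = P^{(n)}_{\alpha_{n}}(t_{n}) \cdots P^{(1)}_{\alpha_{1}}(t_{1}), \qquad p(\alpha) = \mathrm{Tr}\!\left( C_{\alpha}\,\rho\,C_{\alpha}^{\dagger} \right), \]
which for a pure initial state reduces to \( p(\alpha) = \lVert C_{\alpha}|\psi\rangle \rVert^{2} \).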

27 November : Owen Maroney

“How epistemic can a quantum state be?”

Abstract: The “psi-epistemic” view is that the quantum state does not represent a state of the world, but a state of knowledge about the world. It draws its motivation, in part, from the observation of qualitative similarities between characteristic properties of non-orthogonal quantum wavefunctions and those of overlapping classical probability distributions. It might be suggested that it gives a natural explanation for these properties, which seem puzzling for the alternative “psi-ontic” view. However, for two key similarities, quantum state overlap and quantum state discrimination, it turns out that the psi-epistemic view cannot account for the values shown by quantum theory, and for a wide range of quantum states it must rely on the same supposedly puzzling explanations as the “psi-ontic” view.

20 November : Boris Zilber

“The semantics of the canonical commutation relations”

Abstract: I will argue that the canonical commutation relations, and the way of calculating with them discovered in the 1920s, are in essence a syntactic reflection of a world whose semantics is still to be reconstructed. The same can be said about the calculus of Feynman integrals. Similar developments have been taking place in pure mathematics since the 1950s in the form of Grothendieck’s schemes and the formalism of non-commutative geometry. I will report on some progress in reconstructing the missing semantics. In particular, for the canonical commutation relations it leads to a theory of representation in finite-dimensional “algebraic Hilbert spaces” which in the limit look rather similar to, although not the same as, conventional Hilbert spaces.
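
The relations in question are, in their simplest form,
\[ [\hat{q}, \hat{p}] = i\hbar\,\mathbb{1}, \]
and a one-line observation explains why any finite-dimensional “representation” can only be approximate: taking the trace of both sides gives \( \mathrm{Tr}[\hat{q},\hat{p}] = 0 \) on the left but \( i\hbar \cdot \dim\mathcal{H} \neq 0 \) on the right, so the relation has no exact realisation by finite matrices, only limiting or deformed ones (a standard fact, offered here as background to the “algebraic Hilbert spaces” mentioned above).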

13 November 1st BLOC Seminar, KCL, London : Huw Price

“Two Paths to the Paris Interpretation”

Abstract: In 1953 de Broglie’s student, Olivier Costa de Beauregard, raised what he took to be an objection to the EPR argument. He pointed out that the EPR assumption of Locality might fail, without action-at-a-distance, so long as the influence in question is allowed to take a zigzag path, via the past lightcones of the particles concerned. (He argued that considerations of time-symmetry counted in favour of this proposal.) As later writers pointed out, the same idea provides a loophole in Bell’s Theorem, allowing a hidden variable theory to account for the Bell correlations, without irreducible spacelike influence. (The trick depends on the fact that retrocausal models reject an independence assumption on which Bell’s Theorem depends, thereby blocking the derivation of Bell’s Inequality.) Until recently, however, it seems to have gone unnoticed that there is a simple argument that shows that the quantum world must be retrocausal, if we accept three assumptions (one of them time-symmetry) that would have all seemed independently plausible to many physicists in the years following Einstein’s 1905 discovery of the quantisation of light. While it is true that later developments in quantum theory provide ways of challenging these assumptions – different ways of challenging them, for different views of the ontology of the quantum world – it is interesting to ask whether this new argument provides a reason to re-examine Costa de Beauregard’s ‘Paris interpretation’.

6 November : Vlatko Vedral

“Macroscopicity”

Abstract: We have a good framework for how to quantify entanglement based, broadly speaking, on two different ideas. One is the fact that local operations and classical communication (LOCC) do not increase entanglement and hence introduce a natural ordering on the set of entangled states. The other is inspired by mean-field theory and quantifies the entanglement of a state by how difficult it is to approximate it with disentangled states (the two, while not identical, frequently lead to the same measures). Interestingly, neither of these captures the notion of “macroscopicity”, which asks which states are very quantum and macroscopic at the same time. Here the GHZ states win as the ones with the highest macroscopicity; however, they are not highly entangled from either the LOCC or the mean-field-theory point of view. I discuss different ways of quantifying macroscopicity and exemplify them with a range of quantum experiments producing different many-body states (GHZ and general GHZ states, cluster states, topological states). And the winner for producing the highest degree of macroscopicity is…

30 October : David Wallace

“How not to do the metaphysics of quantum mechanics”

Abstract: Recent years have seen an increasing interest in the metaphysics of quantum theory. While welcome, this trend has an unwelcome side effect: an inappropriate (and often unknowing) identification of quantum theory in general with one particular brand of quantum theory, namely the nonrelativistic mechanics of finitely many point particles. In this talk I’ll explain just why this is problematic, partly by analogy with questions about the metaphysics of classical mechanics.

23 October : Daniel Bedingham

“Time reversal symmetry and collapse models”

Abstract: Collapse models are modifications of quantum theory where the wave function is treated as physically real and collapse of the wave function is a physical process. This introduces a time reversal asymmetry into the dynamics of the wave function since the collapses affect only the future state. However, it is shown that if the physically real part of the model is reduced to the set of points in space and time about which the collapses occur then a collapsing wave function picture can be given both forward and backward in time, in each case satisfying the Born rule (under certain conditions). This implies that if the collapse locations can serve as an ontology then these models can in fact have time reversal symmetry.

16 October : Dennis Lehmkuhl

“Einstein, Cartan, Weyl, Jordan: The neighborhood of General Relativity in the space of spacetime theories”

Abstract: Recent years have seen a renewed interest in Newton-Cartan theory (NCT), i.e. Newtonian gravitation theory reformulated in the language of differential geometry. The comparison of this theory with the general theory of relativity (GR) has been particularly interesting, among other reasons, because it allows us to ask how `special’ GR really is, as compared to other theories of gravity. Indeed, the literature so far has focused on the similarities between the two theories, for example on the fact that both theories describe gravity in terms of curvature, and the paths of free particles as geodesics. However, the question of how `special’ GR is can only be properly answered if we highlight differences as much as similarities, and there are plenty of differences between NCT and GR. Furthermore, I will argue that it is not enough to compare GR to simpler theories like NCT; we also have to compare it to more complicated theories, more complicated in terms of geometrical structure and gravitational degrees of freedom. While NCT is the most natural degenerate limit of GR, gravitational theory defined on a Weyl geometry (to be distinguished from a unified field theory based on Weyl geometry) and gravitational scalar-tensor theories (like Jordan-Brans-Dicke theory) are two of the most natural generalisations of GR. Thus, in this talk I will compare Newton-Cartan, GR, Weyl and Jordan-Brans-Dicke theory, to see how special GR really is as compared to its immediate neighborhood in the `space of spacetime theories’.

19 June : Antony Valentini

“Hidden variables in the early universe II: towards an explanation for large-scale cosmic anomalies”

Abstract: Following on from Part I, we discuss the large-scale anomalies that have been reported in measurements of the cosmic microwave background (CMB) by the Planck satellite. We consider how the anomalies might be explained as the result of incomplete relaxation to quantum equilibrium at long wavelengths on expanding space (during a ‘pre-inflationary phase’) in the de Broglie-Bohm formulation of quantum theory. The first anomaly we consider is the reported large-scale power deficit. This could arise from incomplete relaxation for the amplitudes of the primordial perturbations. It is shown, by numerical simulations, that if the pre-inflationary era is radiation dominated then the deficit in the emerging power spectrum will have a characteristic shape (a specific dependence on wavelength). It is also shown that our scenario is able to produce a power deficit in the observed region and of the observed magnitude, for an appropriate choice of cosmological parameters. The second anomaly we consider is the reported large-scale anisotropy. This could arise from incomplete relaxation for the phases of the primordial perturbations. We report on recent numerical simulations for phase relaxation, and we show how to define characteristic scales for amplitude and phase nonequilibrium. While difficult questions remain concerning the extent to which the data might support our scenario, we argue that we have an (at least) viable model that is able to explain two apparently independent cosmological anomalies at a single stroke.

12 June : Antony Valentini

“Hidden variables in the early universe I: quantum nonequilibrium and the cosmic microwave background”

Abstract: Assuming inflationary cosmology to be broadly correct, we discuss recent work showing that the Born probability rule for primordial quantum fluctuations can be tested (and indeed is being tested) by measurements of the cosmic microwave background (CMB). We consider in particular the hypothesis of ‘quantum nonequilibrium’ — the idea that the universe began with an anomalous distribution of hidden variables that violates the Born rule — in the context of the de Broglie-Bohm pilot-wave formulation of quantum field theory. An analysis of the de Broglie-Bohm field dynamics on expanding space shows that relaxation to quantum equilibrium is generally retarded (and can be suppressed) for long-wavelength field modes. If the initial probability distribution is assumed to have a less-than-quantum variance, we may expect a large-scale power deficit in the CMB — as appears to be observed by the Planck satellite. Particular attention is paid to conceptual questions concerning the use of probabilities ‘for the universe’ in modern theoretical and observational cosmology.
[Key references: A. Valentini, ‘Inflationary Cosmology as a Probe of Primordial Quantum Mechanics’, Phys. Rev. D 82, 063513 (2010) [arXiv:0805.0163]; S. Colin and A. Valentini, ‘Mechanism for the suppression of quantum noise at large scales on expanding space’, Phys. Rev. D 88, 103515 (2013) [arXiv:1306.1579].]
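
For orientation, in the non-relativistic particle version of de Broglie–Bohm theory the configuration Q is guided by the wave function via
\[ \frac{dQ_{k}}{dt} = \frac{\hbar}{m_{k}}\,\mathrm{Im}\,\frac{\nabla_{k}\psi}{\psi}\bigg|_{Q}, \]
and ‘quantum equilibrium’ is the Born-rule distribution \( \rho = |\psi|^{2} \), which is preserved by this dynamics; ‘quantum nonequilibrium’ is any initial \( \rho \neq |\psi|^{2} \). The cosmological discussion above concerns the field-theoretic analogue of this set-up on expanding space, which the simple particle formula is only meant to illustrate.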

5 June : Mike Cuffaro

“Reconsidering quantum no-go theorems from a computational perspective”

Bell’s and related inequalities are often thought of as “no-go” theorems, but this is misleading except in a highly qualified sense. More properly, they should be understood as imposing constraints on locally causal models which aim to recover quantum mechanical predictions. Thinking of them as no-go theorems is nevertheless mostly harmless in most circumstances; i.e., the necessary qualifications are, in typical discussions of the foundations of quantum mechanics, understood as holding unproblematically. But the situation can change once we leave the traditional context. In the context of a discussion of quantum computation and information, for example, our judgements regarding which locally causal models are to be ruled out as implausible will be different from our similar judgements in the traditional context. In particular, the “all-or-nothing” GHZ inequality, which is traditionally considered to be a more powerful refutation of local causality than statistical inequalities like Bell’s, has very little force in the context of a discussion of quantum computation and information. In this context it is only the statistical inequalities which can legitimately be thought of as no-go theorems. Considering this situation serves to emphasise, I argue, that there is a difference in aim between practical sciences like quantum computation and information, and the foundations of quantum mechanics traditionally construed: describing physical systems as they exist and interact with one another in the natural world is different from describing what one can do with physical systems.

22 May: Elise Crull

“Whence Physical Significance in Bimetric Theories?”

Recently there has been lively discussion regarding a certain class of alternative theories to general relativity called bimetric theories. Such theories are meant to resolve certain physical problems (e.g. the existence of ghost fields and dark matter) as well as philosophical problems (e.g. the apparent experimental violation of relativistic causality and assigning physical significance to metrics).

In this talk, I suggest that a new type of bimetric theory wherein matter couples to both metrics may yield further insights regarding those same philosophical questions, while at the same time addressing (perhaps to greater satisfaction!) the physical worries motivating standard bimetric theories.

15 May: Julian Barbour

“A Gravitational Arrow of Time”

My talk (based on arXiv:1310.5167 [gr-qc]) will draw attention to a hitherto unnoticed way in which scale-invariant notions of complexity and information can be defined in the problem of N point particles interacting through Newtonian gravity. In accordance with these definitions, all typical solutions of the problem with nonnegative energy divide at a uniquely defined point into two halves that are effectively separate histories. They have a common ‘past’ at the point of division but separate ‘futures’. In each half, the arrow from past to future is defined by growth of the complexity and information. All previous attempts to explain how time-symmetric laws can give rise to the various arrows of time have invoked special boundary conditions. In contrast, the complexity and information arrows are inevitable consequences of the form of the gravitational law and nothing else. General relativity shares key structural features with Newtonian gravity, so it may be possible to obtain similar results for Einsteinian gravity.
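
[Background note, not part of the abstract: a sketch of the kind of scale-invariant measure involved, following arXiv:1310.5167 (the precise conventions are the paper’s). The complexity is a dimensionless ratio of mass-weighted lengths,
$$ C_S = \frac{\ell_{\rm rms}}{\ell_{\rm mhl}}, \qquad \ell_{\rm rms} = \frac{1}{m_{\rm tot}}\Big(\sum_{i<j} m_i m_j\, r_{ij}^2\Big)^{1/2}, \qquad \frac{1}{\ell_{\rm mhl}} = \frac{1}{m_{\rm tot}^2}\sum_{i<j}\frac{m_i m_j}{r_{ij}}, $$
which is unchanged under an overall rescaling $r_{ij} \to \lambda r_{ij}$ and grows as the configuration becomes more clustered, since the mean harmonic length is dominated by the shortest separations.]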

8 May: Simon Saunders

“Reference to indistinguishables, and other paradoxes”

Abstract: There is a seeming paradox about indistinguishables: if they are described only by totally symmetric properties and relations, or by totally (anti)symmetrized states, how is reference to them possible? And we surely do refer to subsets of indistinguishable particles, and sometimes to individual elementary particles (as in: the electrons, protons, and neutrons of which your computer screen is composed). Call it the paradox of composition.
The paradox can be framed in the predicate calculus as well, in application to everyday things: indistinguishability goes over to weak discernibility. It connects with two other paradoxes: the Gibbs paradox and Putnam’s paradox. It also connects with the hole argument in General Relativity. They are none of them the same, but they have a common solution.

This solution centres on the way that mathematical representations, including the set-theoretical constructions of model theory, connect with the world. The connection is by structural similarity, not by coordinates or particle labels (in physics), or (in model theory) by elements of sets. The appearance to the contrary is fostered by the simplicity of ostensive reference, on the one hand, and the assignment to structures of particle labels, coordinates, and elements of sets, on the other.
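
[Background note, not part of the abstract: a stock example of the weak discernibility mentioned above is the spin singlet
$$ |\psi\rangle = \tfrac{1}{\sqrt{2}}\big(|\!\uparrow\rangle_1 |\!\downarrow\rangle_2 - |\!\downarrow\rangle_1 |\!\uparrow\rangle_2\big): $$
the two fermions share all their monadic state-dependent properties, yet each stands to the other in the irreflexive, symmetric relation ‘has opposite spin component to’, which discerns them weakly without singling out either particle for reference.]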

1 May: Oliver Pooley

“New work on the problem of time”

Abstract: One aspect of the “Problem of Time” in canonical general relativity results from applying to the theory Dirac’s seemingly well-established method of identifying gauge transformations in constrained Hamiltonian theories. This “orthodox” move identifies transformations generated by the first-class constraints as mere gauge. Applied to GR, the strategy yields the paradoxical result that no genuine physical magnitude takes on different values at different times. This orthodoxy is also what underwrites the derivation of the timeless Wheeler–DeWitt equation. It is thus intimately connected to one of the central interpretative puzzles of the canonical approach to quantum gravity, namely, how to make sense of a profoundly timeless quantum formalism.

This talk reviews some recent challenges to the technical underpinning of the orthodox view. Brian Pitts has argued that, in general, first-class constraints generate “not a gauge transformation, but a bad physical change”, even for theories like electromagnetism that are standardly taken to illustrate the correctness of orthodoxy. I argue that Pitts’ results are largely orthogonal to resolving the Problem of Time, and that they leave the orthodox interpretation of phase space untouched. Instead, I will endorse a very different criticism of Dirac’s position, due to Barbour and Foster. As Thébault has stressed, one moral is that a Hamiltonian theory can be manifestly deterministic even if physical magnitudes do not commute with some of the first-class constraints, namely, those that generate reparameterizations of histories. Unfortunately, owing to its foliation invariance (as distinct from its reparameterization invariance), Hamiltonian GR suffers from a residual apparent indeterminism. Replacing GR by shape dynamics is one “solution”. I will consider the prospects of finding an alternative.
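
[Background note, not part of the abstract: the orthodoxy at issue can be stated compactly in the standard canonical (ADM) formulation, in which the Hamiltonian is a sum of constraints,
$$ H = \int d^3x\,\big(N\,\mathcal{H} + N^a\,\mathcal{H}_a\big), \qquad \mathcal{H} \approx 0, \quad \mathcal{H}_a \approx 0. $$
If observables are required to commute weakly with all first-class constraints, then $\dot{O} = \{O, H\} \approx 0$, so no genuine magnitude changes in time; imposing the Hamiltonian constraint as an operator equation yields the timeless Wheeler–DeWitt equation $\hat{\mathcal{H}}\,\Psi[h_{ab}] = 0$.]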

13 March: Philip Goyal

“An Informational Approach to Identical Particles in Quantum Theory”

Abstract: A remarkable feature of quantum theory is that particles with identical intrinsic properties must be treated as indistinguishable if the theory is to give valid predictions.  In the quantum formalism, indistinguishability is expressed via the symmetrization postulate, which restricts a system of identical particles to the set of symmetric states (‘bosons’) or the set of antisymmetric states (‘fermions’).

However, the precise connection between particle indistinguishability and the symmetrization postulate has not been clearly established.  There exist a number of variants of the postulate that appear to be compatible with particle indistinguishability.  In particular, the widely influential topological approach due to Laidlaw & DeWitt and Leinaas & Myrheim implies that the postulate’s validity depends on the dimensionality of space.  This variant leaves open the possibility that identical particles are generically able to exhibit so-called anyonic behaviour in two spatial dimensions.

Here we show that the symmetrization postulate can be derived on the basis of a simple novel postulate.  This postulate establishes a functional relationship between the amplitude of a process involving indistinguishable particles and the amplitudes of all possible transitions when the particles are treated as distinguishable.  The symmetrization postulate follows by requiring consistency with the rest of the quantum formalism.  The key to the derivation is a strictly informational treatment of indistinguishability which prohibits the labelling of particles that cannot be experimentally distinguished from one another.  The derivation implies that the symmetrization postulate admits no natural variants.  In particular, the possibility that identical particles generically exhibit anyonic behaviour is excluded.

[1] “Informational Approach to Identical Particles in Quantum Theory”, http://arxiv.org/abs/1309.0478
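
[Background note, not part of the abstract: in its simplest two-particle, textbook form the symmetrization postulate says that the physical amplitude for indistinguishable particles reaching distinct outcomes $a$ and $b$ is built from the labelled-particle amplitudes as
$$ A_{\pm} = \tfrac{1}{\sqrt{2}}\big(A(1 \to a,\ 2 \to b) \;\pm\; A(1 \to b,\ 2 \to a)\big), $$
with $+$ for bosons and $-$ for fermions; the paper cited above argues, via a more general functional relationship than this two-particle display, that consistency with the rest of the quantum formalism leaves only these options.]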

6 March: Sean Gryb

“Symmetry and Evolution in Quantum Gravity”

Abstract: A key obstruction to obtaining a non-perturbative definition of quantum gravity is the absence of a sensible quantum representation of spacetime refoliations, including global refoliations – or reparametrizations. We propose that these difficulties can be avoided by following a procedure for defining a degree of freedom due to Poincaré, where emphasis is put on the independently specifiable initial data of the system, and a proposal for the decomposition of these degrees of freedom due to York (both ideas were later advocated by Barbour). In our proposal, local refoliations are replaced by local scale transformations using a symmetry trading procedure developed in the “Shape Dynamics” approach to classical gravity. Global refoliations are then dealt with using a technique similar to that used in unimodular gravity. We will first provide the philosophical motivation for our procedure and then propose a set of formal equations representing the quantization of a theory that is classically equivalent to General Relativity. The quantum theory we propose, however, has both a well-defined notion of local symmetry and a global time evolution. Time permitting, we will also discuss explicit symmetry-reduced toy models exhibiting some of the key features of our proposal.

20 February: Tessa Baker

“Cosmological Tests of Gravity”

Abstract: The past decade has witnessed a surge of interest in extensions of Einstein’s theory of General Relativity. It is hoped that such theories of ‘modified gravity’ might account for the observed accelerating expansion rate of the universe, providing a more satisfactory and physical explanation than that of a simple cosmological constant.
I will give an overview of current attempts to extend GR, and how to test them observationally. I’ll describe a formalism that has been constructed to carry out these tests, a cosmological analogue of the Parameterised Post-Newtonian framework (PPN) that is used to test gravity in the Solar System. I’ll show how this new formalism acts as a bridge between the (sometimes disparate!) worlds of theory and observation, allowing us to make real progress in our understanding of gravity.
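
[Background note, not part of the abstract: one common, simplified way of parameterising deviations from GR on cosmological scales (quasi-static, linear scalar perturbations; conventions vary, and this need not be the formalism presented in the talk) modifies the Poisson equation and allows the two metric potentials to differ,
$$ k^2 \Phi = -4\pi G\, a^2\, \mu(a,k)\, \bar{\rho}\, \Delta, \qquad \eta(a,k) = \Phi/\Psi, $$
with $\mu = \eta = 1$ recovering GR; lensing and clustering data then constrain $\mu$ and $\eta$ much as Solar System tests constrain the PPN parameter $\gamma$.]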

13 February: Joerg Schmiedmayer

“How does the classical world emerge from microscopic quantum evolution?”

Abstract: The world around us follows the laws of classical physics: processes are irreversible, and there exists an ‘arrow of time’. On the microscopic scale our world is governed by quantum physics; its evolution is ‘unitary’ and reversible. How does the classical world emerge from the microscopic quantum world? We conjecture that the classical world emerges naturally from the microscopic quantum world through the complexity of large quantum systems. We now have, for the first time, the ability to probe this conjecture in the laboratory. Modern experimental techniques allow us to monitor the evolution of isolated quantum systems in detail. First experiments using ensembles of ultracold atoms allow us to test fundamental aspects of the quantum-to-classical transition in a quantum system completely isolated from its environment. I will present the concepts behind the emergence conjecture and show first experiments that probe some of its fundamental aspects.

November 14: Jeff Bub, University of Maryland

“Quantum Interactions with Closed Timelike Curves and Superluminal Signaling”

Abstract: There is now a significant body of results on quantum interactions with closed timelike curves (CTCs) in the quantum information literature, for both the Deutsch model of CTC interactions (D-CTCs) and the projective model (P-CTCs). As a consequence, there is a prima facie argument exploiting entanglement that CTC interactions would enable superluminal and, indeed, effectively instantaneous signaling. In cases of spacelike separation between the sender of a signal and the receiver, whether a receiver measures the local part of an entangled state or a disentangled state to receive the signal can depend on the reference frame. A proposed consistency condition gives priority to either an entangled perspective or a disentangled perspective in spacelike separated scenarios. For D-CTC interactions, the consistency condition gives priority to frames of reference in which the state is disentangled, while for P-CTC interactions the condition selects the entangled state. It follows that there is a procedure that allows Bob to signal to Alice in the past via relayed superluminal communications between spacelike separated Bob and Clio, and spacelike separated Clio and Alice. This opens the door to time travel paradoxes in the classical domain. Ralph (arXiv:1107.4675) first pointed this out for P-CTCs, but Ralph’s procedure for a ‘radio to the past’ is flawed. Since both D-CTCs and P-CTCs allow classical information to be sent around a spacetime loop, it follows from a result by Aaronson and Watrous (Proc. Roy. Soc. A 465, 631–647, 2009) for CTC-enhanced classical computation that a quantum computer with access to P-CTCs would have the power of PSPACE, equivalent to a D-CTC-enhanced quantum computer. (The talk represents joint work with Allen Stairs.)
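
[Background note, not part of the abstract: in Deutsch’s model the state of the system on the closed timelike curve must be a fixed point of the interaction,
$$ \rho_{\rm CTC} = \mathrm{Tr}_{\rm CR}\!\left[\,U \big(\rho_{\rm CR} \otimes \rho_{\rm CTC}\big) U^{\dagger}\right], $$
where $\rho_{\rm CR}$ is the state of the chronology-respecting system and $U$ the unitary coupling them, while P-CTCs instead post-select on the CTC system returning in the state in which it entered. It is this nonlinearity, whether from the fixed-point condition or from post-selection, that drives both the signalling arguments and the enhanced computational power discussed in the abstract.]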

November 7 Sam Fletcher, University of California at Irvine

“On the Reduction of General Relativity to Newtonian Gravitation”

Abstract: Accounts of the reduction of general relativity (GR) to Newtonian gravitation (NG) usually take one of two approaches. One considers the limit as the speed of light c → ∞, while the other focuses on the limit of formulae (e.g., three-momentum) in the low-velocity limit, i.e., as v/c ≈ 0.  Although the first approach treats the reduction of relativistic spacetimes globally, many have argued that ‘c → ∞’ can at best be interpreted counterfactually, which is of limited value in explaining the past empirical success of NG.  The second, on the other hand, while more applicable to explaining this success, only treats a small fragment of GR.  Further, it usually applies only locally, hence is unable to account for the reduction of global structure.  Building on work by Ehlers, I propose a different account of the reduction relation that offers the global applicability of the c → ∞ limit while maintaining the explanatory utility of the v/c ≈ 0 approximation.  In doing so, I highlight the role that a topology on the collection of all spacetimes plays in defining the relation, and how the choice of topology corresponds with broader or narrower classes of observables that one demands be well-approximated in the limit.
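
[Background note, not part of the abstract: the limit-of-formulae approach can be illustrated by the textbook example of the relativistic three-momentum,
$$ \mathbf{p} = \frac{m\mathbf{v}}{\sqrt{1 - v^2/c^2}} = m\mathbf{v}\left(1 + \tfrac{1}{2}\frac{v^2}{c^2} + O\!\big(v^4/c^4\big)\right) \;\to\; m\mathbf{v} \quad \text{as } v/c \to 0, $$
a local, formula-by-formula recovery of the Newtonian expression, as contrasted with the global $c \to \infty$ limit taken over whole spacetimes.]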

 

October 31 Paul Hoyningen-Huene, Leibniz University of Hannover.

“The dead end objection against convergent realisms.”

Abstract: The target of the dead end objection is any kind of scientific realism that bases its plausibility on the stable presence of some X in a sequence of succeeding theories. For instance, if X is a set of theoretical entities that remains stable even over some scientific revolutions, this may be taken as support for convergent scientific realism about entities. Likewise, if X is a similarly stable set of structures of theories, this may be taken as support for (convergent) structural realism. The dead end objection states that the conceded stability of X could also be due to the existence of an empirically extremely successful though ontologically significantly false theory. In this case, the inference from the stability of X to the probable reality of X would become invalid. Three examples from the history of science illustrate how the stability of some X over an extended period of time was indeed erroneously taken to indicate the finality of X.

 

October 24 Basil Hiley, Birkbeck College, University of London.

“Bohmian Non-commutative Dynamics: Local Conditional Expectation Values are Weak Values.”

Abstract: Quantum dynamics can be described by two non-commutative geometric Clifford algebras, one of which describes the properties of the covering space of the symplectic manifold [1]. This gives rise to a non-commutative probability theory with conditional expectation values that correspond to local quantum properties which appear as weak values [3]. Examples of these are the T^{0μ}(x) components of the energy-momentum tensor which, in turn, correspond to the Bohm momentum and Bohm energy for Schrödinger, Pauli and Dirac particles [2]. In the case of photons, the Bohm momentum has already been measured by Kocsis [4]. I will explain the theoretical background and discuss some new experiments involving weak measurements on non-zero rest mass particles that are being developed by Robert Flack [UCL] and myself to explore these ideas further.
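
[Background note, not part of the abstract: the link between Bohmian quantities and weak values takes a simple form for a Schrödinger particle (this is the standard relation, not the full Clifford-algebraic treatment of the talk). Writing $\psi(x) = R(x)\,e^{iS(x)/\hbar}$,
$$ \mathrm{Re}\,\frac{\langle x|\,\hat{p}\,|\psi\rangle}{\langle x|\psi\rangle} = \nabla S(x) = p_{B}(x), $$
i.e. the weak value of momentum post-selected on position $x$ equals the Bohm momentum, and it is locally averaged momenta of this kind that the photon experiment cited in the abstract measured.]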

 

October 17 Edward Anderson, DAMTP, Cambridge University.

“Background independence”

Abstract: This talk concerns what background independence itself is (as opposed to some particular physical theory that is background independent). This notion mostly arises from a layer-by-layer analysis of the facets of the Problem of Time in Quantum Gravity. Part of this notion consists of relational postulates. These are identified as classical precursors of two of the facets, and are tied to the forms of the GR Hamiltonian and momentum constraints respectively. Other aspects of Background Independence include the algebraic closure of these constraints, expressing physics in terms of beables, foliation-independence, and the reconstruction of spacetime from space. The final picture is that background independence – a philosophically desirable and physically implementable feature for a theory to have – has the facets of the Problem of Time among its consequences. Thus these arise naturally and are problems to be resolved, as opposed to avoided ‘by making one’s physics background-dependent in order not to have these problems’. This serves as a selection criterion that limits the use of a number of model arenas and physical theories.