David summarizes his distinction between inference by prediction/retrodiction and inference by measurement, and reiterates that this can make plausible the claim that the special status of the Past-Hypothesis can ground the asymmetry in our epistemic relations to the past and future. He goes through some objections to this view that came up in the class before, which he and Tim then discuss.
David relates this epistemic asymmetry to the time-asymmetries in causation, counterfactual conditionals, and our apparent ability to bring about change in the world. He argues that the very facts that make the past epistemically more accessible than the future also make our causal handle on the past far, far weaker than our handle on the future. There is a discussion of how this is supposed to fit with our pre-theoretic intuition that we have no causal handle on the past at all.
Tim weighs in on some of these points, criticizing David's desire for a "mechanical" explanation for temporal asymmetries. He argues that our notion of a mechanical explanation already presupposes a substantive temporal asymmetry. David discusses the relation between the Mentaculus (which he takes to be the source of mechanical explanations) and the notion that time passes/has a direction. Tim responds and puts some pressure on David's claim that these two are largely independent. There is a lengthy discussion of this point.
Tim opens with a brief point about the postulation of a very low entropy past state. He contrasts the claim that, if this state is not the initial state of the world, the entropy profile of the universe will be symmetric (intuitively, with a global minimum at the moment of the past hypothesis) with the claim that the macroscopic profile of the universe will be symmetric in the same way. David takes some time to respond to this point.
David then turns the discussion to the inferential role of the past hypothesis. He starts by identifying two kinds of inferences that can be used to determine the state of the world at times other than the present: Inference by Prediction/Retrodiction (in which one takes facts about the present and applies the laws to determine facts about the past or future) and Inference by Measurement (in which one takes facts about the present plus facts about certain past events and determines facts about the past or future). Inference by Measurement is, David points out, much stronger, but its appeal to certain past events is in need of justification. David argues that the event postulated by the past hypothesis is precisely the privileged past state that can do this inferential work. Tim asks whether these inference types really exhaust the relevantly available inference procedures. He also wonders whether the sorts of inferences David wants to call inferences by measurement really always have to appeal to facts about states other than the present, specifically the past hypothesis. There is a long discussion of the predictive centrality/fundamentality of the past hypothesis and of the nature of records.
Barry Loewer is the guest lecturer for this class. Tim starts by recapitulating the notion of typicality introduced in the previous lecture. Tim, David and Barry then engage in a debate about Tim's view of the nature of statistical explanation. Barry then gives a summary of the difference between his and David's view of statistical mechanics and Tim's view. He starts by arguing that one main point of contention is the "imperialistic" nature of his and David's project. A second main point of contention is the nature of laws and time, which is the topic for the remainder of the lecture. Barry contrasts two conceptions of laws. The first one is the Humean view of laws on which laws of nature are reducible to facts about the Humean mosaic, which goes hand-in-hand (Barry claims) with a view of time on which time doesn't have an intrinsic direction. The second is Tim's view, on which laws of nature are a primitive category in our ontology, and which (Barry argues) goes naturally with a view of time on which it has an intrinsic direction.
Tim and David begin by continuing their discussion of the 'Stosszahlansatz', the assumption that the number of particle collisions in a given subregion is just proportional to the volume of that region. They summarize the debate up to this point and attempt to clarify the nature of their disagreement about its explanatory value. Tim then takes over and gives a short primer on the mathematics of measure theory, connecting it with the purely mathematical notion of a "probability measure" and contrasting that with the physical notion of something being probabilistic in any real sense.
Tim relates this discussion by example to the notion of typicality. A behavior of a system is typical of that system iff the set of initial conditions that yield that behavior has measure 1 (on any normalized continuous measure). When a certain limit frequency is typical, we have something very close in form and content to an objective, deterministic probability. Tim goes on to discuss how this model can be extended to adequately cover finite systems, and then he and David debate at length whether this extension preserves the virtues of the view as first presented.
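As a toy illustration of the measure-theoretic notion of typicality discussed above (my example, not from the lecture): by Borel's theorem, the set of infinite fair-coin-toss sequences whose limiting frequency of heads is 1/2 has measure 1 under the uniform measure, so that limit frequency is "typical". A finite simulation can only gesture at the infinite-limit claim, but it shows the relative frequency settling near 1/2 as the sequence grows:

```python
import random

def relative_frequency(n_tosses, seed=0):
    """Relative frequency of heads in n fair coin tosses.

    Under the uniform measure on infinite toss sequences, the set of
    sequences whose limiting frequency of heads is 1/2 has measure 1;
    that frequency is "typical" in Tim's sense.
    """
    rng = random.Random(seed)
    heads = sum(rng.random() < 0.5 for _ in range(n_tosses))
    return heads / n_tosses

# The frequency drifts toward the typical value 1/2 as n grows.
for n in (100, 10_000, 1_000_000):
    print(n, relative_frequency(n))
```

The finite-n wobble in the output is exactly where the debate over "finite systems" gets its grip: typicality proper is a claim about the measure-1 set of infinite sequences, not about any finite run.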
Tim Maudlin starts by distinguishing and characterizing three different classes of explanations of thermodynamic phenomena previously studied in the seminar (appeals to symmetry principles, to random walks in the phase space, or to features of the actual micro-dynamics of isolated systems). He explains the various ways in which the standard measure over phase space comes into play in these explanations. David Albert then examines the various ways in which one might justify appealing to this standard measure; both Maudlin and Albert reject justifications that appeal to principles of indifference, but disagree on whether the measure is to be justified solely on empirical grounds (David) or by its natural character (Tim). In the last part of the class, David starts explaining the reversibility objections raised against Maxwell’s and Boltzmann’s arguments.
In this lecture, Tim Maudlin continues the statistical mechanical account of the velocity distribution of particles in a monatomic gas, and how this distribution evolves over time. David Albert and Tim disagree about the status of the 'Stosszahlansatz', the assumption that the locations of particles in the system, and the locations of the critical regions they must occupy in order to collide with other particles, are statistically independent of each other. Tim argues that when an assumption of statistical independence such as this one is reached, no further explanation of the independence is required. David begins to argue that even in these cases, one must explain the independence by appealing to laws. Tim and David begin to outline their differing conceptions of laws.
David Albert begins with a brief discussion of an issue raised in the last lecture: whether one can define an entropy for a non-equilibrium system. He and Tim Maudlin agree that while the steps along the reversible route must involve equilibrium states, partitioning can still be used to define the entropy of a non-equilibrium system.
Tim returns to the discussion of statistical mechanics begun in the last lecture, addressing how we can calculate and explain the velocity distribution of gas particles at equilibrium. Following work by Boltzmann and Maxwell, we can model the gas as a system of particles that collide, and calculate the 'critical areas' which particles of a given velocity must be in if they are to collide with other gas particles in a given time. We then have a dynamics for how a velocity distribution will evolve. By assuming the 'Stosszahlansatz', that the number of particles in the critical areas is proportional to the number of particles in the system, we can show the system will evolve towards a particular distribution.
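A crude toy model of this relaxation (my sketch, much simpler than the Maxwell-Boltzmann collision analysis it gestures at): let random pairs of "particles" repeatedly split their combined kinetic energy at random, conserving the total. This random-collision rule plays the role the Stosszahlansatz plays in the real argument, and the energy distribution relaxes toward the exponential (Boltzmann) form regardless of the initial distribution:

```python
import random

def relax(n_particles=10_000, n_collisions=500_000, seed=1):
    """Random pairwise energy exchange, conserving total energy.

    Each "collision" splits the combined energy of a random pair
    uniformly between the two -- a statistical stand-in for real
    collision dynamics, not a mechanical model of them.
    """
    rng = random.Random(seed)
    energy = [1.0] * n_particles  # start far from equilibrium: all equal
    for _ in range(n_collisions):
        i, j = rng.randrange(n_particles), rng.randrange(n_particles)
        if i == j:
            continue
        total = energy[i] + energy[j]
        r = rng.random()
        energy[i], energy[j] = r * total, (1 - r) * total
    return energy

energies = relax()
# For an exponential distribution with mean 1, the fraction of
# particles with energy below the mean is 1 - 1/e, about 0.632.
below_mean = sum(e < 1.0 for e in energies) / len(energies)
print(round(below_mean, 3))
```

That the system forgets its initial distribution and settles into a particular stationary one is the qualitative moral; whether the stand-in statistical assumption itself needs further explanation is exactly what Tim and David go on to dispute.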
In this lecture, David Albert continues his introduction to thermodynamics, explaining how the entropy of a system is defined and determined in thermodynamic terms. This requires distinguishing between reversible and irreversible paths between two states of a system, and envisaging how pistons and heat baths can be used to mimic the irreversible transfer of heat between a warm body and a cool one. Tim Maudlin and David disagree on whether the entropy of a non-equilibrium system can be defined.
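As a worked example of the kind of calculation described above (my numbers, assuming two identical bodies of constant heat capacity C): the entropy change of an irreversible heat transfer is computed along a reversible route, integrating dS = dQ/T for each body from its initial temperature to the common final one:

```python
import math

def entropy_change(t_hot, t_cold, heat_capacity=1.0):
    """Total entropy change when two identical bodies of constant heat
    capacity equilibrate to their common final temperature.

    Computed along a reversible path: dS = C dT / T for each body, so
    Delta S = C ln(T_f/T_h) + C ln(T_f/T_c), with T_f the mean of the
    two initial temperatures.
    """
    t_final = (t_hot + t_cold) / 2
    return heat_capacity * (math.log(t_final / t_hot)
                            + math.log(t_final / t_cold))

# Heat flowing from a 400 K body to a 200 K one: entropy increases.
print(entropy_change(400.0, 200.0))  # ~0.1178 in units of C
```

The warm body loses entropy and the cool one gains more than that, so the total is positive for any unequal starting temperatures, which is the irreversibility the lecture is after.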
Tim begins his discussion of statistical mechanics by considering equilibrium states: states whose macroscopic description no longer changes over time. (He also introduces the problem of the adiabatic piston: for some systems, it is controversial how to define the equilibrium state.) We then consider how the macroscopic thermodynamic properties of the gas can be understood using a statistical mechanical model, which treats the gas as a system of moving monatomic particles. At equilibrium, such a system will tend towards the Maxwell-Boltzmann state, in which the velocities of the gas particles follow the Maxwell-Boltzmann distribution. He argues that in order to explain this, we need to appeal to the dynamics of the system and how the particles are interacting. Appeals to statistics, without dynamics, are not enough.
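A small sanity check on the Maxwell-Boltzmann picture (my sketch, not from the lecture): in the equilibrium distribution each velocity component is Gaussian with variance kT/m, so the mean kinetic energy per particle should come out to (3/2)kT, the familiar equipartition value:

```python
import random

def mean_kinetic_energy(n_particles=100_000, kT=1.0, mass=1.0, seed=2):
    """Sample 3D velocities from the Maxwell-Boltzmann distribution
    (each component Gaussian with variance kT/m) and return the mean
    kinetic energy per particle, which should approach (3/2) kT.
    """
    rng = random.Random(seed)
    sigma = (kT / mass) ** 0.5
    total = 0.0
    for _ in range(n_particles):
        vx, vy, vz = (rng.gauss(0.0, sigma) for _ in range(3))
        total += 0.5 * mass * (vx * vx + vy * vy + vz * vz)
    return total / n_particles

print(mean_kinetic_energy())  # close to 1.5 for kT = 1
```

Note this only samples the equilibrium distribution and checks a consequence of it; it says nothing about why systems end up there, which is precisely the explanatory question Tim insists requires the dynamics.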
This is the second lecture in the joint NYU/Columbia graduate course on philosophy of statistical explanation and cosmology. It's David Albert's first lecture, and in it he introduces the main ideas he'll discuss in the class, with some contrast between his approach and that of our other lecturer, Tim Maudlin (NYU).