CAUSATION

Chapter 6

Humean Probabilistic Analyses of Causation


      

6. Probabilistic Approaches:  Causation and Relative Frequencies

         Through the end of the 19th Century, almost all philosophers thought of causation as connected with conditions that were sufficient to ensure the occurrence of an event.  That changed in the 20th Century, with the emergence of quantum physics and the development of the social sciences: many philosophers gradually came to think that causation is not restricted to cases where there are causally sufficient conditions for the occurrence of events.

         What implications does this have for the philosophy of causation?  One possible view is that it has very little relevance.  For while quantum physics certainly appears to provide excellent reason for holding that causation can be present in situations that do not fall under deterministic laws, this need not imply that one's concept of causation has to be revised.  Perhaps all that is needed is a concept of probabilistic laws, a concept which can then be combined with one's prior concept of causation to generate a satisfactory account of causation in probabilistic settings.

         But there is also a very different possibility that needs to be explored:  perhaps the right route involves an account of causation that is itself genuinely probabilistic, so that the concept of probability, rather than merely entering via probabilistic laws, is part of the very analysis of the relation of causation itself.

 6.1  The Basic Approach

         The earliest attempts to formulate a probabilistic analysis of causation were advanced by Hans Reichenbach (1956), I. J. Good (1961 and 1962), and Patrick Suppes (1970), and they all were based upon the idea of probability understood in terms of relative frequency.  Moreover, no use was made of the idea of laws of nature, let alone of a realist conception of laws, so that what Reichenbach, Good, and Suppes offered were strong reductionist accounts of causation, and ones that involved only Humean states of affairs.

         At the heart of any probabilistic analysis of causation is the idea that causes must, in some way, make their effects more likely.  Within these initial probabilistic accounts of causation, the basic idea was to analyze what it is for a cause to make its effect more likely in terms of the notion of positive statistical relevance, where an event of type B is positively relevant to an event of type A if and only if the conditional probability of an event of type A, given an event of type B, is greater than the unconditional probability of an event of type A.  Thus Suppes (1984, p. 151), for example, introduces the notion of a prima facie cause, defined as follows: "An event B is a prima facie cause of an event A if and only if (i) B occurs earlier than A, and (ii) the conditional probability of A occurring when B occurs is greater than the unconditional probability of A occurring."
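         Positive statistical relevance is easy to compute from a joint distribution.  Here is a minimal sketch in Python -- the probabilities are made-up, purely for illustration -- that checks clause (ii), that is, whether Prob(A/B) > Prob(A):

    # Minimal sketch (Python): positive statistical relevance of B to A,
    # computed from the probabilities of the four combinations of A and B.
    def positively_relevant(p_ab, p_a_notb, p_nota_b, p_nota_notb):
        # Arguments: Prob(A & B), Prob(A & not-B), Prob(not-A & B),
        # Prob(not-A & not-B); they should sum to 1.
        p_a = p_ab + p_a_notb              # unconditional Prob(A)
        p_b = p_ab + p_nota_b              # unconditional Prob(B)
        return p_ab / p_b > p_a            # Prob(A/B) > Prob(A)?

    # Made-up example: Prob(A & B) = 0.3, Prob(A & not-B) = 0.2,
    # Prob(not-A & B) = 0.1, Prob(not-A & not-B) = 0.4, so that
    # Prob(A) = 0.5 and Prob(A/B) = 0.75.
    print(positively_relevant(0.3, 0.2, 0.1, 0.4))   # True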

         Perhaps the most crucial test for any theory of causation is whether it can provide a satisfactory account of the direction of causation.  What account can be offered, given a probabilistic approach?  One possibility, of course, is to incorporate the earlier than relation into one's analysis of causation, and to use that relation to define the direction of causation -- as was done, for example, by Suppes (1970).

         It is widely thought, however, that this is not satisfactory.  One reason is that it then follows immediately that it is logically impossible for a cause and its effect to be simultaneous, and that it is logically impossible for a cause to be later than its effect.  Both of these things may be the case, but the fact that many people have thought, for example, that time travel into the past is logically possible surely provides good reason for holding that it should not be an immediate consequence of the analysis of causation that backward causation is logically impossible.

         Another consideration is that there is a serious problem about what it is that is the basis of the direction of time, and a causal theory of time has been thought by many philosophers to be a possibility worthy of serious consideration.  If so, then the direction of causation cannot be defined in terms of the direction of time.

         Because of considerations such as these, most advocates of a probabilistic approach to causation have wanted to analyze the direction of causation in probabilistic terms.  What are the prospects for doing this?  The first thing to note is that the postulate that a cause raises the probability of its effect does not itself provide any direction for causal processes.  For when the following equation for conditional probabilities

 Prob(E/C) x Prob(C)  =  Prob(E & C)  =  Prob(C/E) x Prob(E)

 is rewritten as

 Prob(E/C)/Prob(E)  =   Prob(C/E)/Prob(C)

 one can see that Prob(E/C) > Prob(E)  if and only if  Prob(C/E) > Prob(C).

 So causes raise the probabilities of their effects only if effects also raise the probabilities of their causes.
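         The symmetry can be checked numerically.  A minimal sketch in Python, with a made-up joint distribution over C and E:

    # Minimal sketch (Python): probability-raising is symmetric.
    # Made-up joint probabilities for C and E (they sum to 1).
    p_ce, p_c_note, p_notc_e, p_notc_note = 0.2, 0.1, 0.3, 0.4

    p_c = p_ce + p_c_note            # Prob(C)
    p_e = p_ce + p_notc_e            # Prob(E)
    p_e_given_c = p_ce / p_c         # Prob(E/C)
    p_c_given_e = p_ce / p_e         # Prob(C/E)

    # Prob(E/C) > Prob(E) holds exactly when Prob(C/E) > Prob(C).
    print(p_e_given_c > p_e, p_c_given_e > p_c)   # True True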

         How, then, can the direction of causation be analyzed probabilistically?  The most promising suggestion was set out by Reichenbach in his book The Direction of Time (1956).  Reichenbach's proposal involves the following elements:  first, what he referred to as 'the Principle of the Common Cause'; secondly, a probabilistic characterization of a 'conjunctive fork'; thirdly, a proof that correlations between event-types can be explained via conjunctive forks; and, fourthly, a distinction between open forks and closed forks.

         As regards the first element, Reichenbach's Principle of the Common Cause is as follows:  'If an improbable coincidence has occurred, there must exist a common cause.'  (Reichenbach, 1956, p. 157)  Here the basic claim is that if events of type A, say, are more likely to occur given events of type B, than in the absence of events of type B, and if the explanation of this is not that events of type A are caused by events of type B, or vice versa, then there must be some third type of event -- say, C -- such that events of type C cause both events of type A and events of type B.

         Secondly, there is Reichenbach's characterization of the idea of a conjunctive fork, which -- using a slightly different notation -- can be set out as follows (1956, p. 159):

 Events of types A, B, and C form a conjunctive fork if and only if:

 (1)  Prob(A & B/C) = Prob(A/C) x Prob(B/C)

 (2)  Prob(A & B/not-C) = Prob(A/not-C) x Prob(B/not-C)

 (3)  Prob(A/C) > Prob(A/not-C)

 (4)  Prob(B/C) > Prob(B/not-C)

         Thirdly, Reichenbach then shows that, provided that none of the relevant probabilities is equal to zero, conditions (1) through (4) entail:

 (5)  Prob(A & B) > Prob(A) x Prob(B)

 This in turn entails:

 (6)  Prob(A/B) > Prob(A)

 (7)  Prob(B/A) > Prob(B)

     So we see that the existence of a conjunctive fork involving event-types A, B, and C provides an explanation of a statistical correlation between the event-types A and B.
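         The entailment from (1) through (4) to (5) turns on the identity, obtained by expanding both sides via the law of total probability, that Prob(A & B) - Prob(A) x Prob(B) = Prob(C) x Prob(not-C) x [Prob(A/C) - Prob(A/not-C)] x [Prob(B/C) - Prob(B/not-C)], which is positive whenever (3) and (4) hold and Prob(C) is neither 0 nor 1.  The following minimal sketch in Python, using made-up probabilities rather than any of Reichenbach's, illustrates the result numerically:

    # Minimal sketch (Python): a conjunctive fork satisfying (1)-(4),
    # with made-up probabilities, yields (5), (6), and (7).
    p_c = 0.5                       # Prob(C)
    p_a_c, p_a_notc = 0.8, 0.2      # Prob(A/C), Prob(A/not-C); (3) holds
    p_b_c, p_b_notc = 0.7, 0.1      # Prob(B/C), Prob(B/not-C); (4) holds

    # (1) and (2): A and B are independent both given C and given not-C.
    p_ab_c    = p_a_c * p_b_c
    p_ab_notc = p_a_notc * p_b_notc

    p_a  = p_a_c * p_c + p_a_notc * (1 - p_c)        # Prob(A)
    p_b  = p_b_c * p_c + p_b_notc * (1 - p_c)        # Prob(B)
    p_ab = p_ab_c * p_c + p_ab_notc * (1 - p_c)      # Prob(A & B)

    print(p_ab > p_a * p_b)     # (5): 0.29 > 0.20
    print(p_ab / p_b > p_a)     # (6): Prob(A/B) > Prob(A)
    print(p_ab / p_a > p_b)     # (7): Prob(B/A) > Prob(B)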

         Finally, Reichenbach then distinguishes between open forks and closed forks.  Suppose that events of types A, B, and C form a conjunctive fork, and that there is no other type of event -- call it E -- such that events of types A, B, and E also form a conjunctive fork.  Then A, B, and C form an open fork.  On the other hand, if there is another type of event, E, such that events of types A, B, and E also form a conjunctive fork, what one has is a closed fork.

         As Reichenbach emphasizes, there can certainly be conjunctive forks that involve common effects, rather than common causes. (1956, pp. 161-2).  But since conjunctive forks can, as we have just seen, explain statistical correlations, if there were an open fork that involved a common effect, then the relevant statistical correlation would be explained, even though there was no common cause, and this would violate the Principle of the Common Cause.  Hence, conjunctive forks involving a common effect must, if Reichenbach is right, always be closed forks.  All open forks, therefore, must involve a common cause, and so the direction of causation is fixed by the direction given by open forks.

 6.2  Objections

         This is a subtle and ingenious attempt to offer a probabilistic analysis of the relation of causation, and one that appeals only to Humean states of affairs.  Unfortunately, it appears to be open to a number of decisive objections.

 6.2.1  Accidental, Open Forks Involving Common Effects

         The basic idea here is simply this.  Suppose that A and B are types of events that do not cause one another, and for which there is no common cause.  Then it might be the case that the conditional probability of an event of type A given an event of type B was exactly equal to the unconditional probability of an event of type A, but surely this is not necessary.  Indeed, it would be more likely that the two probabilities were at least slightly different, so that the conditional probability of an event of type A given an event of type B was either greater than or less than the unconditional probability of an event of type A.

         Let us suppose, then, that the conditional probability of an event of type A given an event of type B is slightly greater than the unconditional probability of an event of type A.  Suppose, further, that the occurrence of an event of type A is a causally necessary condition for the occurrence of a slightly later event of type E, and, similarly, that the occurrence of an event of type B is a causally necessary condition for the occurrence of a slightly later event of type E.

         Finally, let us suppose -- as is perfectly compatible with the preceding assumptions -- that the relative numbers of all possible combinations of events of types A, B, and E, throughout the whole history of the universe, are given by the following table:

 

                               E                          Not-E

                        A         Not-A              A         Not-A

 B                      1           0                18          12

 Not-B                  0           0                42          28

         From this table, one can see that Prob(A) = 61/101, or about 0.604, while Prob(A/B) = 19/31, or slightly less than 0.613, so that, if the absolute numbers are not too large, there will be nothing especially remarkable about the fact that Prob(A/B) > Prob(A).

         Next, examining the numbers that fall under 'E', we can see that we have the following probabilities:

 Prob(A/E) = 1;  Prob(B/E) = 1;  Prob(A & B/E) = 1.

 Hence the following is true:

 (1)  Prob(A & B/E) = Prob(A/E) x Prob(B/E)

         Similarly, examining the numbers that fall under 'Not-E', we can see that we have the following probabilities:

 Prob(A/Not-E) = 60/100 = 0.6;  Prob(B/Not-E) = 30/100 = 0.3;  Prob(A & B/Not-E) = 18/100 = 0.18.

         So the following three conditions also hold:

 (2)  Prob(A & B/not-E) = Prob(A/not-E) x Prob(B/not-E)

 (3)  Prob(A/E) > Prob(A/not-E)

 (4)  Prob(B/E) > Prob(B/not-E)

         Hence, in a universe of the sort just described, the three types of events A, B, and E form a conjunctive fork.  Moreover, since there is, by hypothesis, no type of event, C, that is a common cause of events of types A and B, it is therefore the case that A, B, and E constitute an open fork.  This open fork then defines the relevant direction of causation as the direction that runs from events of type E towards events of the two types, A and B, that are causally necessary conditions for the occurrence of an event of type E.
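         A minimal sketch in Python, working directly from the table of relative numbers given above (and using exact fractions, so that the equalities in (1) and (2) come out exactly), confirms that conditions (1) through (4) are satisfied:

    # Minimal sketch (Python): recompute the probabilities from the table
    # of relative numbers and check the four conjunctive-fork conditions.
    from fractions import Fraction

    # counts[(a, b, e)] = number of cases with A = a, B = b, E = e.
    counts = {
        (True,  True,  True):  1,   (False, True,  True):  0,
        (True,  False, True):  0,   (False, False, True):  0,
        (True,  True,  False): 18,  (False, True,  False): 12,
        (True,  False, False): 42,  (False, False, False): 28,
    }
    total = sum(counts.values())   # 101

    def prob(pred):
        return Fraction(sum(n for k, n in counts.items() if pred(*k)), total)

    def cond(pred, given):
        return prob(lambda a, b, e: pred(a, b, e) and given(a, b, e)) / prob(given)

    A     = lambda a, b, e: a
    B     = lambda a, b, e: b
    AandB = lambda a, b, e: a and b
    E     = lambda a, b, e: e
    notE  = lambda a, b, e: not e

    print(cond(AandB, E)    == cond(A, E)    * cond(B, E))       # (1): 1 = 1 x 1
    print(cond(AandB, notE) == cond(A, notE) * cond(B, notE))    # (2): 0.18 = 0.6 x 0.3
    print(cond(A, E) > cond(A, notE))                            # (3): 1 > 0.6
    print(cond(B, E) > cond(B, notE))                            # (4): 1 > 0.3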

         In short, not only is it logically possible to have an open fork that involves a common effect, rather than a common cause, but there is no significant unlikelihood associated with the occurrence of such an open fork.  The direction of open forks cannot, therefore, serve to define the direction of causation.

 6.2.2  Underived Laws of Co-Existence

         John Stuart Mill suggested that, in addition to causal laws, there could be basic laws of necessary co-existence that related simultaneous states of affairs.  Are such laws possible?  If one considers some candidates that might be proposed, it may be tempting, I think, to hold that although there can be laws of necessary co-existence, all such laws are derived, rather than basic, though this idea is far from unproblematic.  Thus, consider, for example, a Newtonian world, and Newton's Third Law of Motion -- that if one body, X, exerts a certain force, F, on another body, Y, then Y exerts a force equal in magnitude to F, and opposite in direction, upon X.  This certainly asserts the existence of a necessary connection between simultaneous states of affairs, but is it correctly viewed as a basic law, in a Newtonian universe?  Doubts arise, I think, in view of the fact that the fundamental force laws entail conclusions such as the following:

 (a)  It is a law that for any objects X and Y, and any time t, if X exerts a gravitational force F on Y at time t, then Y exerts a gravitational force -F on X at time t.

 (b)  It is a law that for any objects X and Y, and any time t, if X exerts an electrostatic force F on Y at time t, then Y exerts an electrostatic force -F on X at time t.

 (c)  It is a law that for any objects X and Y, and any time t, if X exerts a magnetic force F on Y at time t, then Y exerts a magnetic force -F on X at time t.

         So non-causal laws that are special instances of Newton's Third Law of Motion can be derived from the fundamental force laws, and the latter are, if one treats forces realistically, causal laws.

         But this is not, of course, a derivation of Newton's Third Law of Motion itself.  To have the latter, it would have to be the case that there was a law to the effect that there were only certain types of forces:  gravitational, electrostatic, magnetic, etc.  Moreover, even if the latter were a law in a Newtonian universe, it would not be a causal law, and so one would still not have a derivation of Newton's Third Law of Motion from causal laws alone.

         It is, accordingly, far from clear that Newton's Third Law of Motion can be derived from causal laws.  But a philosopher who wishes to maintain that all basic laws are causal laws has a different response available -- namely that, in a Newtonian universe, Newton's Third Law of Motion would not really be a law in the strict sense: it would be, instead, a generalization based upon the forces and force laws that have been discovered to this point.
 
         Whether or not this response is ultimately correct, I do think that it at least shows that it is unclear whether Newton's Third Law of Motion would be a case of a basic, non-causal law of co-existence.  But even if this particular example is doubtful, how can one rule out the possibility of there being such laws?  Why could it not be a law, for example, that all particles with mass M have charge C, and vice versa, without that law's being derivable from any other laws whatever?  The claim that this is not possible surely requires an argument.  But what could the argument possibly be?

         In the absence of a proof of the impossibility of basic, non-causal laws of co-existence, it seems to me that one is justified in holding that such laws are logically possible.  But if this is right, then Reichenbach's Principle of the Common Cause is unsound, since the extremely improbable coincidence that all particles with mass M have charge C, and vice versa, rather than being explained causally, might simply obtain in virtue of a basic, non-causal law.

 6.2.3  Underived Laws of Co-Existence, and Non-Accidental, Open Forks Involving Common Effects

         If there can be such laws, that also allows one to show that there can be open forks involving common effects that, rather than depending upon accidents of distribution, arise simply in virtue of certain laws.  In particular, consider a world in which the following things are the case:

 (a)  The occurrence of an event of type A is a causally necessary condition for the occurrence of a slightly later event of type E;

 (b)  The occurrence of an event of type B is a causally necessary condition for the occurrence of a slightly later event of type E;

 (c)  The co-occurrence of an event of type A and an event of type B is a causally sufficient condition for the occurrence of a slightly later event of type E;

 (d)  It is a basic, non-causal law that an event of type A is always accompanied by an event of type B, and vice versa.

         Then, provided that there is at least one occurrence of an event of type E, the following probabilities must obtain in virtue of (a) through (d):

 Prob(A/E) = 1;  Prob(B/E) = 1;  Prob(A & B/E) = 1.

 Prob(A/not-E) = 0;  Prob(B/not-E) = 0;  Prob(A & B/not-E) = 0.

         It then follows that the following four conditions are all satisfied:

 (1)  Prob(A & B/E) = Prob(A/E) x Prob(B/E)

 (2)  Prob(A & B/not-E) = Prob(A/not-E) x Prob(B/not-E)

 (3)  Prob(A/E) > Prob(A/not-E)

 (4)  Prob(B/E) > Prob(B/not-E)

         So the conclusion, accordingly, is that if there can be basic laws of co-existence, then there can be cases of open forks involving common effects that obtain, not by accident, but in virtue of laws of nature.
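         Since the relevant probabilities here take only the values 1 and 0, conditions (1) through (4) can be verified at a glance; a minimal check in Python, for completeness:

    # Minimal check (Python): with the probabilities forced by (a) through (d),
    # the four conjunctive-fork conditions hold trivially.
    p_a_e, p_b_e, p_ab_e = 1.0, 1.0, 1.0             # conditional on E
    p_a_note, p_b_note, p_ab_note = 0.0, 0.0, 0.0    # conditional on not-E

    print(p_ab_e == p_a_e * p_b_e)            # (1)
    print(p_ab_note == p_a_note * p_b_note)   # (2)
    print(p_a_e > p_a_note)                   # (3)
    print(p_b_e > p_b_note)                   # (4)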

 6.2.4  Simple, Deterministic, Temporally Symmetric Worlds

         The next objection to the present probabilistic analysis of causation applies to any reductionist account of a Humean sort, and the basic idea is this.  On the one hand, the actual world is a complex one, with a number of features that might be invoked as the basis of a reductionist account of the direction of causation.  For, first of all, the direction of increase in entropy is the same in the vast majority of isolated or quasi-isolated systems (Reichenbach, 1956, pp. 117-43, and Adolf Grünbaum, 1973, pp. 254-64).  Secondly, the temporal direction in which order is propagated -- such as by the circular waves that result when a stone strikes a pond, or by the spherical wave fronts associated with a point source of light -- is invariably the same (Karl Popper, 1956, p. 538).  Thirdly, it is also a fact that all, or virtually all, open forks are open in the same direction -- namely, towards the future (Reichenbach, 1956, pp. 161-3, and Wesley Salmon, 1978, p. 696).

         On the other hand, causal worlds that are much simpler than our own, and that lack such features, are surely possible.  In particular, consider a world that contains only a single particle, or a world that contains no fields, and nothing material except for two spheres connected by a rod, that rotate endlessly about one another, on circular trajectories, in accordance with the laws of Newtonian physics.  In the first world, there are causal connections between the temporal parts of the single particle.  In the second world, each sphere will undergo acceleration of a constant magnitude, due to the force exerted on it by the connecting rod.  So both worlds certainly contain causal relations.  But both worlds are also utterly devoid of changes of entropy, of propagation of order, and of all causal forks, open or otherwise.  The probabilistic analysis that we are considering, however, defines the direction of causation in terms of open forks.  Simple worlds such as those just mentioned show, therefore, that that probabilistic analysis cannot be sound.

         But what if the advocate of such an analysis responded by challenging the claim that such worlds contain causation?  In the case of the rotating spheres world, this could only be done by holding that it is logically impossible for Newton's Second Law of Motion to be a causal law, while in the case of the single particle world, one would have to hold that identity over time is not logically supervenient upon causal relations between temporal parts.  But both of these claims, surely, are very implausible.

         In addition, however, such a challenge would also involve a rejection of the following principle:

 The Intrinsicness of Causation in a Deterministic World

If C1 is a process in world W1, and C2 a process in world W2, and if C1 and C2 are qualitatively identical, and if W1 and W2 are deterministic worlds with exactly the same laws of nature, then C1 is a causal process if and only if C2 is a causal process.

         For consider a world that differs from the world with the two rotating spheres by having additional objects that enter into causal interactions, and one of which collides with one of the spheres at some time t.  In that world, the process of the spheres rotating around one another during some interval when no object is colliding with them will be a causal process.  But then, by the above principle, the rotation of the spheres about one another, during an interval of the same length, in the simple universe, must also be a causal process.

         But is the Principle of the Intrinsicness of Causation in a Deterministic World correct?  Some philosophers have claimed that it is not.  In particular, it has been thought that a type of causal situation to which Jonathan Schaffer (2000, pp. 165-81) has drawn attention -- cases of 'trumping preemption' -- show that the above principle must be rejected.

         Here is a slight variant on a case described by Schaffer.  Imagine a magical world where, first of all, spells can bring about their effects via direct action at a temporal distance, and secondly, earlier spells prevail over later ones.  At noon, Merlin casts a spell to turn a certain prince into a frog at midnight -- a spell that is not preceded by any earlier, relevant spells.  A bit later, Morgana also casts a spell to turn the same prince into a frog at midnight.  Schaffer argues, in a detailed and convincing way, that the simplest hypothesis concerning the relevant laws entails that the prince's turning into a frog is not a case of causal overdetermination: it is a case of preemption.

     It differs, however, from more familiar cases of preemption, where one causal process preempts another by preventing the occurrence of some event that is crucial to the other process.  For in this action-at-a-temporal-distance case, both processes are fully present, since they consist simply of the casting of a spell plus the prince's turning into a frog at midnight.

         A number of philosophers, including David Lewis (2000), have thought that the possibility of trumping preemption shows that the Principle of the Intrinsicness of Causation in a Deterministic World is false, the idea being that there could be two qualitatively identical processes, one of which is causal and the other not.  For example, at time t1, Morgana casts a spell to turn a person into a frog in one hour's time at a certain location.  That person does turn into a frog, because there was no earlier, relevant spell.  At time t2, Morgana casts precisely the same type of spell.  The person in question does turn into a frog, but the cause of this was not Morgana's spell, but an earlier, preempting spell.

         Is this a counterexample to the Intrinsicness Principle?  The answer is that it is not.  Causes are states of affairs, and the state of affairs that, in the t1 case, causes the person to turn into a frog is not simply Morgana's casting of the spell:  it is that state of affairs together with the absence of earlier, relevant spells.  So when the complete state of affairs that is the cause is focused upon, the two spell-casting cases are not qualitatively identical.  Trumping preemption is therefore not a counterexample to the Principle of the Intrinsicness of Causation in a Deterministic World.

 6.2.5  Simple, Probabilistic, Temporally Non-Symmetric Worlds

         The two simple, possible worlds mentioned in the preceding section were deterministic worlds, and they were also worlds that, as regards non-causal states of affairs, were precisely the same in both temporal directions.  Because of the latter property, they are counterexamples to any Humean, reductionist analysis of causation.  For given the complete temporal symmetry, there cannot be any Humean feature that will serve to pick out one of the two temporal directions as the direction of causation.

         That complete temporal symmetry also meant, however, that there is no evidence in such worlds as to what the direction of causation is, and, for those with verificationist tendencies, this will be viewed as a reason for denying that there is any direction to causation in those worlds.  What I now want to do, accordingly, is to show that there are other simple worlds that are equally counterexamples to Humean, reductionist analyses of causation, but that are not temporally symmetric, and that, because of the precise way in which they are asymmetric, are worlds that contain very strong evidence concerning the likely direction of causation.

         Consider a world that contains states of affairs of types S0(x, t), S1(x, t), S2(x, t), S3(x, t), . . . Sn(x, t), which are as follows.  First, S0(x, t) is a state of affairs in which absolutely nothing exists at location x at time t.  Secondly, if i is odd, Si(x, t) consists of 2^i atomic elements of type A that are equally spaced on a circle of radius r, while if i is even and greater than 0, Si(x, t) consists of 2^i atomic elements of type B that are equally spaced on a circle of radius r.  So, leaving aside the circular arrangement of elements, the first few states of affairs are as follows:

 S0(x, t):         Nothing at all

 S1(x, t):         A  A

 S2(x, t):         B  B  B  B

 S3(x, t):         A  A  A  A  A  A  A  A

 S4(x, t):         B  B  B  B  B  B  B  B  B  B  B  B  B  B  B  B

 S5(x, t):         A  A  A  A  A  A  A  A  A  A  A  A  A  A  A  A  A  A  A  A  A  A  A  A  A  A  A  A  A  A  A  A
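         A minimal sketch in Python reproduces the structure of these states -- ignoring the circular arrangement and the radius r, and simply generating the number and type of the elements in each Si:

    # Minimal sketch (Python): element counts and types for S_0 through S_5.
    # S_0 is empty; for i > 0, S_i has 2**i elements, of type A if i is odd
    # and of type B if i is even.
    def state(i):
        if i == 0:
            return []                        # S_0: nothing at all
        kind = 'A' if i % 2 == 1 else 'B'
        return [kind] * (2 ** i)

    for i in range(6):
        print(i, len(state(i)), ' '.join(state(i)) or 'Nothing at all')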

         Consider, now, the following two possible laws:

 L1:  For every region x, and every time t, if there is a state of affairs of type Si(x, t), where i is greater than 0, and less than n, that state of affairs will continue to exist until it has existed for a temporal interval of length d, at which point it will be replaced by a state of affairs of type Si+1(x, t*), where the spatial orientation of the latter state of affairs with respect to that of the temporally preceding one is completely random, while, if there is a state of affairs of type Sn(x, t), that state of affairs will continue to exist until it has existed for a temporal interval of length d, at which point it will be replaced by a state of affairs of type S0(x, t*).

 L2:  For every region x, and every time t, if there is a state of affairs of type Si(x, t), where i is greater than 0, that state of affairs will continue to exist until it has existed for a temporal interval of length d, at which point it will be replaced by a state of affairs of type Si-1(x, t*), where the spatial orientation of the latter state of affairs with respect to that of the temporally preceding one is completely random.

         Why have I specified that it is a completely random matter how successive states of affairs are spatially oriented relative to one another?  The answer is that this has been done to make it impossible, given the present account of causation, for there to be any causal forks in any world whose only law is either L1 or L2.  For consider the transition from S1 to S2.  If the relative spatial orientation of S1 and S2 is a random matter, then there is nothing that can make it the case, given the present account, that one of the two A elements in the state of type S1 is causally related to two specific B elements in the succeeding S2 state.  All that one will be able to say is that the one total state of affairs causes the other total state of affairs, and because one cannot break this down into relations between parts of one and parts of the other, no causal forks will exist.

         Suppose now that T1 and T2 are two types of worlds, each with the same, very large number of spatial locations.  Suppose, further, that L1 is the only law in worlds of type T1, and that L2 is the only law in worlds of type T2, and that in worlds of type T1, a state of affairs of type S1(x, t) sometimes pops into existence, completely uncaused, in vacant regions of sufficient size, while, in worlds of type T2, a state of affairs of type Sn(x, t) sometimes pops into existence, completely uncaused, in vacant regions of sufficient size.

         Suppose, finally, that W is a world that is either of type T1 or of type T2.  As we have seen, because it is a completely random matter how successive states of affairs are spatially oriented relative to one another, there cannot be, given the probabilistic analysis of causation that we are now considering, any forks in world W -- and, a fortiori, any open forks.  It therefore follows, on this analysis of causation, that there is no direction of causation, and so no causation, in world W.

         But this conclusion is unsound.  The information that one has about the world makes it very likely that there is causation in world W, and that it has a certain direction.  For compare worlds of type T1 with worlds of type T2.  In worlds of the former sort, the only type of state of affairs that comes into existence uncaused is a state of affairs of type S1(x, t), and since this consists of only two atomic elements of type A, it is not especially unlikely that such a state of affairs should come into existence uncaused.  By contrast, in worlds of type T2, the type of state of affairs that comes into existence uncaused is a state of affairs of type Sn(x, t), and since this may very well consist of an enormous number of atomic elements -- since n can be any number one wants, such as 10^100 -- all of them of the same type, equally spaced on a circle, it may be extraordinarily unlikely that such a state of affairs should come into existence uncaused.

         The upshot, in short, is that given a world W that is either of type T1 or of type T2, it is much more likely that W is of type T1 than of type T2, and so it is much more likely that the direction of causation runs from states of affairs of type S1(x, t) to type S2(x, t) and on to type Sn(x, t), than that it runs in the opposite direction.

         Finally, though worlds of types T1 and T2 do involve laws that are not completely probabilistic, since the temporal interval at which one state of affairs is replaced by another is fixed, that is not essential, and one could replace laws L1 and L2 by totally probabilistic laws in which each of the relevant states of affairs has a certain half-life, so that there would merely be a certain probability that a given state of affairs would, within a given temporal interval, be replaced by the next state in the relevant order.  The resulting world types -- T1* and T2* -- would then be completely probabilistic worlds, but that would not alter the fact that it would be much more likely that the direction of causation was from states of affairs of type S1(x, t) to states of affairs of type S2(x, t) and on to states of affairs of type Sn(x, t), rather than in the opposite direction.

         The conclusion, accordingly, is that there are simple, probabilistic worlds in which causation is present, and in which there is good reason for viewing one of the two possible temporal directions as the direction of causation, but where the probabilistic analysis of causation that we are considering mistakenly entails that no causation is present.

 6.2.6  Temporally 'Inverted', Twin Universes

         It is the year 4004 B.C.  A Laplacean-style deity is about to create a world rather similar to ours, but one where Newtonian physics is true.  Having selected the year 3000 A.D. as a good time for Armageddon, the deity works out what the world will be like at that point, down to the last detail.  He then creates two spatially unrelated worlds:  the one just mentioned, together with another whose initial state is a flipped-over version of the state of the first world immediately prior to Armageddon - i.e., the two states agree exactly, except that the velocities of the particles in the one state are exactly opposite to those in the other.

         Consider, now, any two complete temporal slices of the first world, A and B, where A is earlier than B.  Since the worlds are Newtonian ones, and since the laws of Newtonian physics are invariant with respect to time reversal, the world that starts off from the reversed, 3000 A.D. type state will go through corresponding states, B* and A*, where these are flipped-over versions of B and A respectively, and where B* is earlier than A*.  So while the one world goes from a 4004 B.C., Garden of Eden state to a 3000 A.D., pre-Armageddon state, the other world will move from a reversed, pre-Armageddon type of state to a reversed, Garden of Eden type of state.

         In the first world, the direction of causation will coincide with such things as the direction of increase in entropy, the direction of the propagation of order in non-entropically irreversible processes, and the direction defined by most open forks.  But in the second world, where the direction of causation runs from the initial state created by the deity -- that is, the flipped-over 3000 A.D. type of state -- through to the flipped-over 4004 B.C. type of state, the direction in which entropy increases, the direction in which order is propagated, and the direction defined by open forks will all be the opposite one.  So if any of the latter is used to define the direction of causation, it will generate the wrong result in the case of the second world.  The probabilistic analysis of causation that we are presently considering assigns, therefore, the wrong direction to causation in the case of the second world.

 6.2.7  Causally Ambiguous Situations in Probabilistic Worlds

         A reductionist analysis of causation in terms of relative frequencies is also exposed to a variety of 'underdetermination' objections, the thrust of which is that fixing all of the non-causal properties of, and relations between, events, including all relative frequencies, does not always suffice to fix what causal relations there are between events.  Indeed, the arguments in question support much stronger conclusions -- such as, for example, the conclusion that even if one also fixes what laws there are, both causal and non-causal, along with the direction of causation for all possible causal relations that might obtain, that still does not suffice to settle what causal relations there are between events.

         One such argument can be set out as follows.   First, one needs to ask whether statements of causal laws can involve the concept of causation.  Consider, for example, the following statement:  "It is a law that for any object x, the state of affairs that consists of x's having property F causes a state of affairs that consists of x's having property G."  Is this an acceptable way of formulating a possible causal law?

         Some philosophers contend that it is not, and that the correct formulation is, instead, along the following lines:

 (*)  "It is a causal law that for any object x, if x has property F at time t, then x has property G at (t + *t)."

         But what reason is there for thinking that it is the latter type of formulation that is correct?  Certainly, as regards intuitions, there is no reason why there should not be laws that themselves involve the relation of causation.  But in addition, the above claim is open to the following objection.  First, the following two statements are logically equivalent:

 (1)  For any object x, if x has property F at time t, then x has property G at (t + Δt);

 (2)  For any object x, if x lacks property G at time (t + Δt), then x lacks property F at t.

         Now replace the occurrence of (1) in (*) by an occurrence of (2), so that one has:

 (**)  "It is a causal law that for any object x, if x lacks property G  at time (t + *t), then x lacks property F at time t."

         The problem now is that it may very well be the case that while (*) is true, (**) is false, since its being a causal law that for any object x, if x has property F at time t, then x has property G at (t + Δt) certainly does not entail that there is a backward causal law to the effect that for any object x, if x lacks property G at time (t + Δt), then x lacks property F at t.  So anyone who holds that (*) is the correct way to formulate causal laws needs to explain why substitution of logically equivalent statements in the relevant context does not preserve truth.

         By contrast, no such problem arises if one holds that causal laws can instead be formulated as follows:

 It is a law that for any object x, the state of affairs that consists of x's having property F at time t causes a state of affairs that consists of x's having property G at time (t + Δt).

         Let us assume, then, that the natural way of formulating causal laws is acceptable.  The next step in the argument involves the assumption that probabilistic laws are logically possible.  Given these two assumptions, the following presumably expresses a possible causal law:

 L1:  It is a law that, for any object x, x's having property P for a time interval Δt causally brings it about, with probability 0.75, that x has property Q.

         The final crucial assumption is that it is logically possible for there to be uncaused events.

         Given these assumptions, consider a world, W, where objects that have property P for a time interval Δt go on to acquire property Q 76 percent of the time, rather than 75 percent of the time, and where this holds even over the long term.  Other things being equal, this would be grounds for thinking that the relevant law was not L1, but rather:

 L2:  It is a law that, for any object x, x's having property P for a time interval Δt causally brings it about, with probability 0.76, that x has property Q.

         But other things might not be equal.  In the first place, it might be the case that L1 was derivable from a very powerful, simple, and well-confirmed theory, whereas L2 was not.  Secondly, one might have excellent evidence that there were totally uncaused events involving objects' acquiring property Q, and that the frequency with which that happened was precisely such as would lead to the expectation, given law L1, that situations in which an object had property P for a time interval Δt would be followed by the object's acquiring property Q 76 percent of the time.

         If that were the case, one would have reason for believing that, on average, over the long term, of the 76 cases out of 100 where an object has property P for Δt and then acquires property Q, 75 will be ones where the acquisition of property Q is caused by the possession of property P, while one will be a case where property Q is spontaneously acquired.
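         The arithmetic behind this decomposition is straightforward on one natural way of filling in the details: if objects that are not caused to acquire Q by P acquire it spontaneously at some rate q, then the overall frequency is 0.75 + 0.25 x q, so a spontaneous rate of q = 0.04 -- a made-up figure, chosen purely for illustration -- yields exactly the observed 76 percent.  A minimal sketch in Python:

    # Minimal sketch (Python): decomposing the 76 percent frequency into
    # caused and spontaneous acquisitions of Q, under law L1 plus an assumed
    # (made-up) spontaneous acquisition rate.
    p_caused = 0.75     # probability, under L1, that having P causes Q
    p_spont  = 0.04     # assumed spontaneous rate among the remaining cases

    p_overall = p_caused + (1 - p_caused) * p_spont
    print(p_overall)    # 0.76

    # Expected numbers per 100 objects that have P for the interval:
    print(100 * p_caused, 100 * (1 - p_caused) * p_spont)   # 75.0 1.0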

         There can, in short, be situations where there would be good reason for believing that not all cases where an object has property P for an interval Δt, and then acquires Q, are causally the same.  There is, however, no hope of making sense of this given a reductionist analysis of causation in terms of relative frequencies.  For the cases do not differ with respect to any non-causal properties and relations, including relative frequencies, nor with respect to causal or non-causal laws, nor with respect to the direction of causation in any potential causal relations.  So the present approach is unable to deal with such causally ambiguous, probabilistic situations.

 6.2.8  Causation Without Increase in Probability

         We have not yet considered the most fundamental claim involved not only in the attempt to analyze causation in terms of relative frequencies, but, indeed, in all probabilistic analyses of causation -- the proposition, namely, that causes always make their effects more likely, in some appropriate sense.  Is this claim true?  The answer appears to be that it is not, as even some philosophers who are sympathetic to the general idea that there is some connection between causation and probability -- such as Daniel Hausman (1998) -- have realized.  For consider the following.  Assume that there are atoms of type T that satisfy the following conditions:

 (1) Any atom of type T must be in one of the three mutually exclusive states -- A, B, or C;

 (2) The probabilities that an atom of type T in state A, B, or C will emit an electron are, respectively, 0.9, 0.7, and 0.2;

 (3) The probability that an atom of type T is in state A is 0.5; in state B, 0.4; and in state C, 0.1.

         Now, given that, for example, putting an atom of type T into state B would be quite an effective means of getting it to emit an electron, it is surely true that, if it is in state B, and emits an electron, then its being in state B is a probabilistic cause of its emitting an electron.  But this would not be so if the above account were correct.  For if D is the property of emitting an electron, the unconditional probability that an atom of type T will emit an electron is given by Prob(D) = Prob(D/A) x Prob(A) + Prob(D/B) x Prob(B) + Prob(D/C) x Prob(C) = (0.9)(0.5) + (0.7)(0.4) + (0.2)(0.1) = 0.75.  But the conditional probability of D given B was specified as 0.7.  We have, therefore, that Prob(D) > Prob(D/B).  So if a cause had to raise the probability of its effect, it would follow that an atom of type T's being in state B could not be a probabilistic cause of its emitting an electron.  This, however, is unacceptable.  So the thesis that a cause must raise the probability of its effect, in the relevant sense, must be rejected.
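         The total-probability calculation in this example is easily verified.  A minimal sketch in Python:

    # Minimal sketch (Python): the unconditional probability of emission, D,
    # for an atom of type T, given the figures in (2) and (3) above.
    p_state = {'A': 0.5, 'B': 0.4, 'C': 0.1}    # Prob of being in each state
    p_emit  = {'A': 0.9, 'B': 0.7, 'C': 0.2}    # Prob(D/state)

    p_d = sum(p_emit[s] * p_state[s] for s in p_state)
    print(round(p_d, 10))        # 0.75
    print(p_emit['B'] < p_d)     # True: Prob(D/B) = 0.7 < Prob(D) = 0.75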

         The thesis that causes necessarily make their effects more likely is exposed, therefore, to a decisive objection.  The basis of this objection is the possibility of there being one or more other causal factors that are incompatible with the given factor, and more efficacious than it.  For, given such a possibility, events of type C may be the cause of events of type E even though the probability of an event of type E, given the occurrence of an event of type C, is less than the unconditional probability of an event of type E.

         But is there nothing, then, in the rather widely shared intuition that causation is related to increase in probability?  The answer is that causation may be related to increase in probability, but not in the way proposed by those who favor a probabilistic analysis of causation.  What this other way is will emerge in section 5.  The crucial point for present purposes, however, is that the relation in question cannot be used as part of a probabilistic analysis of causation, since the relation itself turns out to involve the concept of causation.