A Critique of the Dutch Book Argument
Many neutral observers agree that we are witnessing the birth of a new religion among hopelessly nerdy people.
I’m thinking, of course, of what has been called hardcore Bayesianism, the epistemology according to which each proposition (“Tomorrow it’ll rain”, “String theory is the true description of the world”, “There is no god”, etc.) has a probability which can and should be computed under almost every conceivable circumstance.
In a previous post I briefly explained the two main theories of probability, frequentism and Bayesianism. In another post, I laid out my own alternative view, called “knowledge-dependent frequentism”, which attempts to keep the objectivity of frequentism while accounting for the limited knowledge of the agent. An application to the Theory of Evolution can be found here.
While trying to win new converts, hardcore Bayesians often present the issue as if it were merely about accepting Bayes’ theorem, whose truth is certain since it has been mathematically proven. This is a tactic I’ve seen Richard Carrier employ repeatedly.
I wrote this post as a reply, showing that frequentists accept Bayes’ theorem as well, and that the dispute isn’t about its mathematical demonstration but about whether or not, for every proposition, there exists a rational degree of belief behaving like a probability.
Establishing the necessity of probabilistic coherence
One very popular argument aiming at establishing this is the “Dutch Book Argument” (DBA). I think it is no exaggeration to state that many committed Bayesians venerate it with almost the same degree of devotion a Conservative Evangelical feels towards the doctrine of Biblical inerrancy.
Put forward by Ramsey and De Finetti, it defines a very specific betting game whose participants are threatened with a sure loss (“being Dutch-booked”) if their betting quotients do not fulfill the basic axioms of probability, the so-called Kolmogorov axioms (I hope my non-geeky readers will forgive me one day for becoming so shamelessly boring…):
1) the probability of an event is always a non-negative real number
2) the probability of the event comprising all possible outcomes (the whole sample space) is equal to 1
3) the probability of the union of disjoint events is equal to the sum of the probabilities of each event
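For a finite sample space, the first two axioms can be checked mechanically (and finite additivity then follows from how probabilities of unions are computed). Here is a minimal Python sketch; the function name and tolerance are my own illustrative choices, not standard:

```python
# Check Kolmogorov's axioms for a probability assignment on a finite
# sample space, given as a dict {outcome: probability}.

def satisfies_kolmogorov(dist, tol=1e-9):
    # Axiom 1: every probability is a non-negative real number.
    if any(p < 0 for p in dist.values()):
        return False
    # Axiom 2: the event containing all possible outcomes has probability 1.
    if abs(sum(dist.values()) - 1) > tol:
        return False
    # Axiom 3 (finite additivity) holds by construction here: the
    # probability of a union of disjoint events is the sum over outcomes.
    return True

print(satisfies_kolmogorov({"snow": 0.65, "no snow": 0.35}))  # True
print(satisfies_kolmogorov({"snow": 0.65, "no snow": 0.70}))  # False: sums to 1.35
```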
The betting game on which the DBA rests is defined as follows. (You can skip this more technical green part; understanding it isn’t necessary for following the basic thrust of my criticism of the DBA.)
A not very wise wager
Let us consider an event E upon which it must be wagered.
The bookmaker determines a sum of money S (say 100 €) that a person R (Receiver) will get from a person G (Giver) if E comes true. But the person R first has to give p*S to the person G, where p is the betting quotient the bettor has set for E.
The bookmaker determines himself who is going to be R and who is going to be G.
Holding fast to these rules, it is possible to demonstrate that a clever bookmaker can set things up in such a way that any bettor whose quotients p do not respect the laws of probability will lose money regardless of the outcome of the event.
Let us consider, for example, a bettor who wagers upon the propositions
1) “Tomorrow it will snow” with P1 = 0.65 and upon
2) “Tomorrow it will not snow” with P2 = 0.70.
P1 and P2 violate the laws of probability, because the sum of the probabilities of these two mutually exclusive events should be 1 instead of 1.35.
In this case, the bookmaker would choose to be G and first collect P1*S + P2*S = 100*1.35 = 135 € from his bettor R. Afterwards, he wins in both cases:
– It snows. He must give 100 € to R because of 1). The bookmaker’s gain is 135 € – 100 € = 35 €.
– It doesn’t snow. He must give 100 € to R because of 2). The bookmaker’s gain is also 135 € – 100 € = 35 €.
Let us now consider the same example where the bettor instead comes up with P1 = 0.20 and P2 = 0.30, whose sum is well below 1.
The bookmaker would choose to be R, giving 0.20*100 = 20 € for the bet on snow and 0.30*100 = 30 € for the bet on the absence of snow. Again, he wins in both cases:
– It snows. The bettor must give 100 € to R (the bookmaker) because of 1). The bookmaker’s gain is -30 – 20 + 100 = 50 €.
– It does not snow. The bettor must give 100 € to R (the bookmaker) because of 2). The bookmaker’s gain is -30 – 20 + 100 = 50 €.
In both cases, having P1 and P2 fulfill the probability axioms would have been BOTH a necessary and sufficient condition for avoiding the sure loss.
The same demonstration can be generalized to all other basic axioms of probabilities.
The thrust of the argument and its shortcomings
The Dutch Book Argument can be formulated as follows:
1) It is irrational to be involved in a bet where you’re bound to lose
2) One can devise a betting game such that, for every proposition, you are doomed to lose if the sums you set do not satisfy the rules of probability. Otherwise you are safe.
3) Thus you’d be irrational if the amounts you set broke the rules of probabilities.
4) The amounts you set are identical to your psychological degrees of belief
5) Hence you’d be irrational if your psychological degrees of belief do not behave like probabilities
Now, I could bet any amount you wish that this reasoning contains countless demonstrable flaws.
I’m not wagering
One unmentioned premise of this purely pragmatic argument is that the agent is willing to wager in the first place. In the large majority of situations, where he will have no opportunity to do so, he wouldn’t be irrational if his degrees of belief were non-probabilistic, because there would be no monetary stakes whatsoever.
Moreover, a great number of human beings refuse on principle ever to bet, and would of course face no such threat of “sure loss”.
Since it is a thought experiment, one could of course modify it in such a way that:
“If you don’t agree to participate, I’ll take you to Guatemala, where you’ll be water-boarded until you give in.”
But in my eyes, and those of many observers, this would make the argument look incredibly silly and convoluted.
I don’t care about money
Premise 1) is far from being airtight.
Let us suppose you’re a billionaire who happens to enjoy betting moderate amounts of money for various psychological reasons. Let us further assume your sums do not respect the axioms of probability, and as a consequence you lose 300 €, that is, 0.00003% of your wealth, while enjoying the whole game. One must use an extraordinarily question-begging notion of rationality to call you “irrational” in such a situation.
Degrees of belief and actions
It is absolutely not true that our betting amounts HAVE to be identical, or even closely related, to our psychological degrees of belief.
Let us say that a lunatic bookie threatens to kill my children unless I engage in a series of bets concerning insignificant political events in some Chinese provinces I had never heard of before.
Being in a situation of total ignorance, my psychological degrees of belief are undefined and keep fluctuating in my brain. But since I want to avoid a sure loss, I make up amounts behaving like probabilities which will prevent me from getting “Dutch-booked”, i.e. amounts having nothing to do with my psychology.
So I avoid the sure loss even though my psychological states never behaved like probabilities at any moment.
Propositions whose truth we’ll never discover
There are countless things we will never know (at least if atheism is true, as most Bayesians assume).
Let us consider the proposition: “There exists an unreachable parallel universe which is fundamentally governed by a rotation between string-theory and loop-quantum gravity“ and many related assertions.
Let us suppose I ask a Bayesian friend: “Why am I irrational if the corresponding degrees of belief in my brain do not fulfill the basic rules of probability?”
The best thing he could answer me (based on the DBA) would be:
“Imagine we NOW had to set odds on each of these propositions. It is true we’ll never know anything about them during our earthly life. But imagine my atheism was wrong: there is a hell, we are both stuck in it, and the devil DEMANDS that we abide by the sums we set at that time.
You’re irrational because the non-probabilistic degrees of belief you’re holding right now mean you’ll get Dutch-booked by me in hell, in front of the malevolent laughter of fiery demons.”
Now, I have no doubt this might be a good joke for impressing a geeky girl who isn’t too picky (which is truly an extraordinarily unlikely combination).
But it is incredibly hard to take this as a serious philosophical argument, to say the least.
A more modest Bayesianism is probably required
To their credit, many more moderate Bayesians have started backing away from the alleged strength and scope of the DBA, stating instead that:
“First of all, pretty much no serious Bayesian that I know of uses the Dutch book argument to justify probability. Things like the Savage axioms are much more popular, and much more realistic. Therefore, the scheme does not in any way rest on whether or not you find the Dutch book scenario reasonable. These days you should think of it as an easily digestible demonstration that simple operational decision making principles can lead to the axioms of probability rather than thinking of it as the final story. It is certainly easier to understand than Savage, and an important part of it, namely the “sure thing principle”, does survive in more sophisticated approaches.”
Given that the Savage axioms rely heavily on risk assessment, they are bound to concern events very well handled by my own knowledge-dependent frequentism, and I don’t see how they could justify the existence, and probabilistic nature, of degrees of belief having no connection with our current concerns (such as the evolutionary path through which a small sub-species of dinosaurs evolved countless years ago).
To conclude, I think there is a gigantic gap between:
– the fragility of the arguments for radical Bayesianism, its serious problems such as magically turning utter ignorance into specific knowledge.
– the boldness, self-righteousness and terrible arrogance of its most ardent defenders.
I am myself not a typical old-school frequentist and do find valuable elements in Bayesian epistemology, but I find it extremely unpleasant to debate with disagreeable folks who are much more interested in winning an argument than in humbly improving human epistemology.