resolving the St. Petersburg paradox

The St. Petersburg paradox is based on one of those gambling games where the usual rule of deciding whether to play by the expected gain gives a counter-intuitive result.

In the simplest version, you pay some entry fee to play the game: a counterparty puts $1 in a pot, a coin is then flipped repeatedly, and the counterparty doubles the pot on every “head”, until “tail” comes up, at which point you receive the money in the pot. The expected gain of this game is infinite, regardless of the entry fee, so it would seem that one should always play, no matter how large a fee is demanded. But, as the article points out, “few of us would pay even $25 to enter such a game.”
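
To see why the expected gain is infinite: if \(f\) is the number of heads before the first “tail”, the pot pays \(2^f\) with probability \(2^{-(f+1)}\), so the expected pay-off is

$$\sum_{f=0}^{\infty} 2^{-(f+1)}\, 2^{f} = \sum_{f=0}^{\infty} \frac{1}{2} = \infty,$$

and subtracting any finite entry fee still leaves an infinite expected gain.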

(This seems to be one of many variations of the paradox.) The explanations given in the link to resolve the paradox aren’t satisfactory. “One can’t buy what isn’t sold” can only be considered a joke, while “expected utility” is somewhat plausible but doesn’t strike at the central issue, because it can be circumvented with an equally counter-intuitive paradox fitted to the chosen utility function. In contrast to the Gambler’s ruin paradox, I don’t think an artificial finite bound on the money supply (in this case, the counterparty’s) makes sense as an explanation, but what it reveals is instructive: the expected gain grows only logarithmically with the money supply, and, more generally, imposing some kind of finiteness may explain the paradox.
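
To make the logarithmic growth concrete (a back-of-the-envelope sketch, not something taken from the linked article): if the counterparty can pay out at most \(2^m\), the pay-off is \(2^f\) for \(f \leq m\) and \(2^m\) otherwise, so the expected pay-off becomes

$$\sum_{f=0}^{m} 2^{-(f+1)}\, 2^{f} + \sum_{f=m+1}^{\infty} 2^{-(f+1)}\, 2^{m} = \frac{m+1}{2} + \frac{1}{2} = \frac{m}{2} + 1,$$

which grows only with the logarithm \(m\) of the bankroll: even a counterparty holding \(2^{60}\) dollars, far more money than exists, makes the game worth only about $31.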

Of course, the only way to get any gain is to actually play the game. If you play the game repeatedly, your gain does eventually go to infinity. So why would you be reluctant to pay even $25 to enter? It must be because the large pay-offs are so infrequent that making the initial money back would take too long. Suppose the entry fee is \(W\), and suppose you call it a day the first time your total pay-off covers the total entry fees you have paid. For that to happen in round \(n\), it must be true that

\begin{eqnarray*}
2^{f_1} + 2^{f_2} + \dots + 2^{f_k} & < & kW \quad \text{(for all } k<n\text{)},\\
2^{f_1} + 2^{f_2} + \dots + 2^{f_n} & \geq & nW
\end{eqnarray*}

where \(f_i\) is the number of flips before tail comes up in round \(i\).
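
For instance, with an entry fee of \(W = 25\), coming out ahead already in the first round requires \(2^{f_1} \geq 25\), i.e. \(f_1 \geq 5\), which happens with probability \(\sum_{f_1 \geq 5} 2^{-(f_1+1)} = 2^{-5} = 1/32\).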

Let \(S_n\) be the set of \(n\)-tuples \((f_1, f_2, \dots, f_n)\) that satisfy the above. The probability of first coming out ahead in round \(n\) is then

$$p_n \equiv \sum_{(f_1,\dots,f_n)\in S_n}  \left( \prod_{i=1}^n 2^{f_i+1} \right)^{-1} $$

from which we can get the average number of rounds it takes to win the game, \(\sum_{n=1}^\infty n\, p_n\). If this is enormous even for modest \(W\), which is likely the case, then that would explain the paradox, since a game that on average takes longer than a lifetime to win would be played by no one.
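
A quick Monte Carlo sketch of this stopping rule (the function name, fee values, and round cap below are my own choices for illustration) plays rounds at entry fee \(W\) until the cumulative pay-off first covers the cumulative fees, or gives up after a fixed cap:

```python
import random

def rounds_until_ahead(W, max_rounds=10_000, rng=random):
    """Play St. Petersburg rounds at entry fee W; return the first round n
    at which the total pay-off reaches n*W, or None if that does not
    happen within max_rounds."""
    total = 0
    for n in range(1, max_rounds + 1):
        pot = 1
        # double the pot on every "head"; stop at the first "tail"
        while rng.random() < 0.5:
            pot *= 2
        total += pot
        if total >= n * W:
            return n
    return None

if __name__ == "__main__":
    trials = 500
    for W in (2, 5, 10, 25):
        results = [rounds_until_ahead(W) for _ in range(trials)]
        finished = [r for r in results if r is not None]
        frac = len(finished) / trials
        avg = sum(finished) / len(finished) if finished else float("nan")
        print(f"W = {W:>2}: ahead within the cap in {frac:.0%} of trials, "
              f"average winning round ~ {avg:.1f}")
```

With a finite cap on the number of rounds this is only suggestive, of course, but it gives a feel for how rarely one comes out ahead at the larger fees.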
