Monday, April 10, 2017

The St. Petersburg Paradox

Introduction

Imagine that you’re at a carnival, and there’s a game where you’re going to flip a coin. If tails comes up, you win two dollars, but if heads comes up, you win nothing. How much would you pay to play the game? Nothing? One dollar? Two dollars?

Mathematically speaking, you should pay any price less than or equal to one dollar to play this game. Here’s how it works.

The expected value of the game is the probability of each outcome, multiplied by the payoff for that outcome, all added together. So for this game, G, we have: 

$$E(G) = P(Tails)*(Payoff\  for\  Tails) + P(Heads)*(Payoff\  for\  Heads)$$
$$E(G) = 0.5*($2) + 0.5*($0) = $1.00$$

This means that if you pay one dollar to play the game, you should break even in the long run.
If you pay less than a dollar, you can expect to come out ahead, and if you pay more than a dollar, you can expect to lose money.
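As a quick sanity check, the expected-value arithmetic above can be computed directly (a minimal Python sketch):

```python
# Expected value = sum over all outcomes of probability * payoff.
# Outcomes of the carnival game: tails pays $2, heads pays $0.
outcomes = {"tails": (0.5, 2.00), "heads": (0.5, 0.00)}

expected_value = sum(p * payoff for p, payoff in outcomes.values())
print(expected_value)  # 1.0
```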

If you have questions about how expected value works for games like this, please leave me a note in the comments.

So how does this relate to the St. Petersburg Paradox?

First things first, this paradox has nothing to do with Russia itself. It's called the St. Petersburg Paradox because Daniel Bernoulli published his analysis of it in the proceedings of the St. Petersburg Academy.

Here's how the game works. You flip a coin repeatedly until it comes up Tails; as soon as Tails appears, the game is over. If Tails comes up on the first flip, you win $2; on the second flip, $4; on the third, $8; on the fourth, $16; and so on, doubling each time.
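The rules above can be sketched as a short simulation (a minimal Python sketch; the payouts follow the $2, $4, $8, ... schedule described here):

```python
import random

def play(rng=random):
    """Play one round: flip until Tails; the payout doubles with each Heads."""
    payout = 2                    # Tails on the first flip pays $2
    while rng.random() < 0.5:     # treat this branch as Heads: keep flipping
        payout *= 2
    return payout

random.seed(0)
winnings = [play() for _ in range(5)]
print(winnings)  # every result is a power of two, at least $2
```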

How much would you pay to play this game? Leave your answer in the comments below (and don't cheat by looking at the solution below!!!)

Let's calculate how much you should pay. We'll compute the expected value just like we did before.

$$E(G) = P(Tails\ on\ Flip\ 1)*($2) + P(Heads\  on\ Flip\ 1,\ with\ Tails\ on\ Flip\ 2)*($4)\\ + P(Heads\ on\ Flip\ 1\ and\ 2,\ with\ Tails\ on\ Flip\ 3)*($8) + ...$$

$$E(G) = \frac{1}{2}*($2) + \frac{1}{4}*($4) + \frac{1}{8}*($8) + ...$$

$$E(G) = $1 + $1 + $1 + ... = \infty$$
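The divergence is easy to check numerically: truncate the game at a maximum number of flips, and each allowed flip contributes exactly $1 to the expected value (a minimal Python sketch):

```python
def truncated_ev(max_flips):
    """Expected value if the game is cut off after max_flips flips."""
    # Flip k happens with probability 1/2**k and pays 2**k dollars,
    # so each term of the sum is exactly $1.
    return sum((0.5 ** k) * (2 ** k) for k in range(1, max_flips + 1))

print(truncated_ev(10))   # 10.0
print(truncated_ev(100))  # 100.0 -- the sum grows without bound
```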

So by this math, the amount you should be willing to pay is... infinity? If you think strictly mathematically about how much you should pay, the answer is yes: if someone offers to let you play this game, you should agree to pay however much they ask. However, this is called a paradox because, as Ian Hacking observed, most of us wouldn't be willing to pay even $25.

How do we reconcile this then? How do we explain our reluctance to pay large amounts of money to play the game, when the math says we should win it all back and then some?

It's because, by nature, humans are risk-averse. Thinking about this problem another way: we have a 50% chance of winning only two dollars, a 75% chance of winning four dollars or less, and only a 1-in-16 chance (6.25%) of winning more than 25 dollars. This explains our intuition not to pay a lot of money to play this game.

So how can we mathematically represent this?

Solution 1: The Utility Function

The utility function, proposed by Daniel Bernoulli, assumes that marginal utility (the extra utility obtained from consuming one more increment of a good) decreases as the quantity consumed increases. Basically, winning $2,048 doesn't make us feel twice as satisfied as winning $1,024 does. This may seem odd considering the good in question is money, but it's exactly the concept we need to keep our expected value from running off to infinity. Using the logarithm as our utility function, we can calculate the utility of each outcome and find that:

$$E(U) = \sum_{k=1}^{\infty}\frac{1}{2^k}\log_{10}(2^k) = 2\log_{10}(2) \approx 0.602$$
Using this utility function, we find that the expected utility converges to about 0.602 utils, equivalent to $10^{0.602} \approx \$4.00$, meaning you should be willing to pay about $4 to play the game. This seems more reasonable.
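The sum converges fast enough to check directly (a minimal Python sketch, using the base-10 logarithm as the utility function):

```python
import math

# Expected utility with u(x) = log10(x): sum over flip k of (1/2**k) * log10(2**k).
# log10(2**k) = k * log10(2), and the series sum of k/2**k is 2.
expected_utility = sum((0.5 ** k) * k * math.log10(2) for k in range(1, 200))
dollar_equivalent = 10 ** expected_utility  # invert the utility function

print(round(expected_utility, 3))   # 0.602 utils
print(round(dollar_equivalent, 2))  # 4.0 dollars
```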

However, we can break this fix by increasing the prizes. The following table shows that even using the log function, we can make the paradox reappear. (Damn it.)
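To see how the paradox reappears, suppose (hypothetically) that flip k paid $2^{2^k}$ instead of $2^k$. Then every term in the expected-utility sum is the same constant, so the sum diverges just like the original expected value did (a Python sketch of this hypothetical variant):

```python
import math

# Hypothetical "super" game: flip k pays 2**(2**k) dollars.
# The utility term is (1/2**k) * log10(2**(2**k)) = (2**k / 2**k) * log10(2),
# i.e. log10(2) for every k -- a constant, so the series diverges again.
def utility_term(k):
    return (0.5 ** k) * (2 ** k) * math.log10(2)

print([round(utility_term(k), 5) for k in range(1, 6)])  # all equal log10(2)
```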


Solution 2: Start applying limitations to the game

There are several aspects of the game that make it unrealistic in real-life scenarios. Anyone want to list some in the comments?

  • Casinos don't have an unlimited amount of money. They may decide to stop the game after the player has flipped a given number of times.
  • Players are not infinitely patient. If you had to sit there and flip a coin 20 times, or 100 times, or 1,000 times, you'd eventually get bored. And that's to say nothing of the fact that, since the game assumes you can flip infinitely many times, you could die before the game ends. Yikes.
But then, when we impose these limitations, are we really playing the St. Petersburg game? Or a variation of it, in which case these aren't solutions, but another problem entirely? It's an open debate.


When we start applying these limitations, we find that the expected value of the game works out as follows, based on how much money the banker/casino can pay out:


We can see in the above table that even if a casino had as much money as Bill Gates, 79 billion dollars, we should only expect to win $37.15. This seems much more reasonable.
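One common way to model a banker with a finite bankroll: every flip whose full prize still fits in the bankroll contributes $1 to the expected value, and any longer run simply pays out the entire bankroll. Under that model, the $37.15 figure for a $79 billion bankroll works out as follows (a Python sketch, assuming this capped-payout model):

```python
import math

def capped_ev(bankroll):
    """Expected value when the banker caps the prize at `bankroll` dollars."""
    max_full_flip = int(math.log2(bankroll))  # last flip whose prize 2**k fits
    # Flips 1..max_full_flip contribute $1 each; any longer run pays the cap,
    # which happens with probability 1 / 2**max_full_flip.
    return max_full_flip + bankroll / 2 ** max_full_flip

print(round(capped_ev(79e9), 2))  # 37.15 with Bill Gates' ~$79 billion
```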

Conclusion:

This game is a paradox because, even though the math says we should be willing to pay any amount to play, almost no one actually is. There have been several proposed solutions, including applying realistic restrictions to the game and using a utility function, both of which attempt to limit the expected value of the game. In the search for a solution, techniques such as the marginal utility function were developed, and they still play a large role in economic theory today. The St. Petersburg paradox is studied in decision theory in economics as well as in game theory, and is an interesting example of how intuition and probability can clash.


Bibliography:

Martin, Robert, "The St. Petersburg Paradox", The Stanford Encyclopedia of Philosophy (Summer 2014 Edition), Edward N. Zalta (ed.), URL = <https://plato.stanford.edu/archives/sum2014/entries/paradox-stpetersburg/>.

"St. Petersburg Paradox." Wikipedia. Wikimedia Foundation, 08 Apr. 2017. Web. 10 Apr. 2017.




6 comments:

  1. I tested out a small example and supposed that I paid $20 to play the coin game. I tested/played the game many times to see if I could win my money back and/or make a profit. It turns out, I wasn't so lucky. Since I paid $20, I would need to see at least 5 heads in a row in order to make a profit ($32-$20 = $12 minimal profit amount). After many failed attempts, I was able to come close and got 4 heads in a row but that only gave me $16 so I would have still lost $4.

  2. Thinking about how much I would pay to play this game I tried to think of what the expected value would be. But when trying to find the expected value I came up with (0.5)($2)+(0.5)(0.5)($4)+(0.5)(0.5)(0.5)($8)...=$1+$1+$1...

    So in short i don't know how much I would pay because I'm certainly not paying infinite amounts of dollars. However if I don't think about expected values or any mathematical strategy at all and someone presented this game to me I probably wouldn't pay more than $5.

    1. Isn't there also a chance on not landing on a tail for a long time. Let's say out of 50 flips, you don't get to tails until the 25th flip. So, this might be why you are apprehensive to play this game

    2. That's a good point. I think my strategy would change if I played once versus playing lets say 100 times.

    3. I also thought about that. Not sure if I would want to take the chance of going negative in cash. However, I think it would be interesting to look at how risk aversion changes the further people go in the game...potential gamblers fallacy territory as well.

  3. Nice post Madison - my only question came up during the Bill Gates example, can we talk about that one more time tomorrow? Thanks!

