Project Euler Problem 436 - Unfair Wager
Spoiler Alert! This blog entry gives away the solution to problem 436 of Project Euler. Please don’t read any further if you have yet to attempt to solve the problem on your own. The information is intended for those who failed to solve the problem and are looking for hints or test data to help them track down bugs. It is only posted once the problem already has at least a hundred solvers.
First I provide references to the major theoretical background that you probably need to solve the problem. The last section presents the solution from start to finish.
The post reflects my approach to the problem. Even though the final outcome was accepted by Project Euler, that doesn’t mean the information is correct or elegant. The algorithms won’t necessarily be the most efficient ones, but they are guaranteed to run within the time limit of one minute.
References
This is a pen and paper problem that requires very little knowledge of maths. You mainly need to calculate conditional probabilities and use the distribution of the sum of uniform variables.
Notation
I’ll use the following notation:
- $u_i$: The $i$th uniform number.
- $s_i$: The sum after $i$ draws.
- $n$: The number of draws the first player needs to get the sum over 1.
- $m$: The number of additional draws the second player needs to get the sum over 2.
- $p$: The first sum over 1, i.e. $p = s_n$.
- $x$: The number that the first player gets.
- $y$: The number that the second player gets.
Capital letters denote the corresponding random variables, and $f_X(x)$ denotes the probability density function of $X$ at $x$.
Solution
I use the following steps to solve the problem:
- Sum after i picks
- First Player – Distribution of x
- Second Player – Distribution of y
- Joint distribution of x and y
- Probability of y bigger than x
Sum after i picks
Let
$$ S_i = \sum_{k=1}^{i} U_k, \qquad U_k \sim U(0,1) $$
The probability density function (pdf) for the sum after one step corresponds to the pdf of the continuous uniform distribution:
$$ f_{S_1}(s) = \mathbf{1}_{[0,1]}(s) $$
where $\mathbf{1}_A$ denotes the indicator function of the set $A$.
For each subsequent pick, the sum $S_n$ is obtained by adding a uniform random variable to $S_{n-1}$. The uniform random variable is independent of the sum $S_{n-1}$, and therefore the pdf of $S_n$ is obtained by convolution of $f_{S_{n-1}}$ and $f_U$:
$$ f_{S_n}(s) = f_{S_{n-1}+U}(s) = (f_{S_{n-1}} * f_U)(s) = \int_{-\infty}^{\infty} f_{S_{n-1}}(t)\, f_U(s-t)\, dt = \int_{0}^{s} f_{S_{n-1}}(t) \cdot 1 \, dt \qquad (0 \le s \le 1) $$
By induction we get
$$ f_{S_n}(s) = \begin{cases} \dfrac{s^{n-1}}{(n-1)!} & 0 \le s \le 1 \\ \cdots & \cdots \end{cases} \tag{1} $$
(We will only need the distribution below one).
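Equation 1 is easy to check empirically. The following little Python sketch (my own addition; the helper name `check_sum_cdf` and the test values $n=3$, $s=0.8$ are arbitrary) compares the empirical probability $P[S_n \le s]$ with $s^n/n!$, the integral of the density above:

```python
import math
import random

# Monte Carlo check of Equation 1: for 0 <= s <= 1 the CDF of the sum of
# n uniform draws is P[S_n <= s] = s**n / n!, the integral of s**(n-1)/(n-1)!.
def check_sum_cdf(n, s, trials=200_000):
    hits = sum(sum(random.random() for _ in range(n)) <= s for _ in range(trials))
    return hits / trials, s ** n / math.factorial(n)

print(check_sum_cdf(3, 0.8))  # the two values should agree to a few decimals
```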
First Player – Distribution of x
Let $n$ denote the number of picks it takes to get the sum over 1. Consequently, $s_n$ is the first sum over 1, i.e., the sum where the second player takes over. The last summand of $s_n$ is the number $x^{(n)}$ that the first player gets if she draws $n$ numbers:
$$ s_{n-1} + x^{(n)} = s_n, \qquad s_{n-1} \le 1, \quad s_n > 1 $$
Because
$$ s_{n-1} = s_n - x^{(n)} $$
and because $s_{n-1} \le 1$ and $s_n > 1$, we know that
$$ 1 < s_n \le 1 + x^{(n)} $$
so the joint probability density of $x^{(n)}$ and $S_n$ is
$$ f_{X^{(n)},\,S_n}(x,s) = \begin{cases} f_{S_{n-1}}(s-x) & 1 < s \le 1+x \\ 0 & \text{otherwise} \end{cases} = \begin{cases} \dfrac{(s-x)^{n-2}}{(n-2)!} & 1 < s \le 1+x \\ 0 & \text{otherwise} \end{cases} \tag{2} $$
The previous density applies to a given $n$. Let $x$ be the number of the first player, and let $p$ be the first sum over 1. To get the unconditional density $f_{P,X}$, we sum $f_{X^{(n)},\,S_n}(x,s)$ (Equation 2) over all potential $n$:
$$ f_{P,X}(p,x) = \sum_{n=2}^{\infty} \frac{(p-x)^{n-2}}{(n-2)!}\,\mathbf{1}_{1 < p \le 1+x} = \sum_{n=0}^{\infty} \frac{(p-x)^{n}}{n!}\,\mathbf{1}_{1 < p \le 1+x} = e^{p-x}\,\mathbf{1}_{1 < p \le 1+x} $$
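A quick way to convince yourself of this density is to simulate the first player and compare the frequency of a simple event with the corresponding integral of $e^{p-x}$. The sketch below does that; the cut-offs $x \le 0.5$ and $p \le 1.2$ are arbitrary test values, and SciPy is used only for the numerical double integral:

```python
import math
import random
from scipy import integrate

# Simulate the first player: draw uniforms until the sum exceeds 1 and
# return (p, x) = (first sum over 1, last draw).
def first_player():
    s = 0.0
    while True:
        u = random.random()
        s += u
        if s > 1:
            return s, u

trials = 500_000
mc = sum(p <= 1.2 and x <= 0.5
         for p, x in (first_player() for _ in range(trials))) / trials

# The same probability from f_{P,X}(p, x) = e^(p - x) on 1 < p <= 1 + x.
exact, _ = integrate.dblquad(lambda p, x: math.exp(p - x),
                             0, 0.5,                     # outer variable: x
                             lambda x: 1.0,              # inner lower bound: p = 1
                             lambda x: min(1 + x, 1.2))  # inner upper bound: p = min(1+x, 1.2)
print(mc, exact)  # both should be roughly 0.18
```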
Second Player – Distribution of y
The first player leaves the sum at $p$, and the second player needs to get the sum over 2. I prefer to shift the values to the origin: let the second player start at 0, and let $y$ be the draw that takes this shifted sum over $2-p$. Also let $s'_i$ be the sum of the first $i$ numbers that the second player picked. In other words, if the first player picked $n$ numbers that add up to $s_n$, and the second player picks $m$ numbers that bring the total to $s_{n+m}$, then $s'_m = s_{n+m} - s_n$.
From Equation 1 we already know the distribution of $s'_i$ below $2-p$:
$$ f_{S'_i}(s) = \frac{s^{i-1}}{(i-1)!}\,\mathbf{1}_{[0,\,2-p]}(s) $$
To get to the distribution of y we have to distinguish two fundamentally different cases.
Case 1: y is less than or equal to 2-p
If $y$ is smaller than $2-p$, then the second player needed at least two picks to get the sum over $2-p$, and the penultimate sum is greater than $2-p-y$ (since adding $y$ brings the sum over $2-p$) and at most $2-p$ (because if the sum were bigger than that, it would already be the last sum).
Assume that the second player needs $m > 1$ picks and her last number is $y^{(m)}$. Therefore, we have
$$ f_{Y^{(m)} \mid P,\; y \le 2-p,\; m \ge 2}(y) = \int_{2-p-y}^{2-p} f_{S'_{m-1}}(s)\, ds = \int_{2-p-y}^{2-p} \frac{s^{m-2}}{(m-2)!}\, ds = \left. \frac{s^{m-1}}{(m-1)!} \right|_{2-p-y}^{2-p} = \frac{(2-p)^{m-1}}{(m-1)!} - \frac{(2-p-y)^{m-1}}{(m-1)!} $$
Hence, the pdf of $y$ is:
$$ f_{Y \mid P,\; y \le 2-p}(y) = \sum_{m=2}^{\infty} \left[ \frac{(2-p)^{m-1}}{(m-1)!} - \frac{(2-p-y)^{m-1}}{(m-1)!} \right] = \sum_{m=1}^{\infty} \frac{(2-p)^{m}}{m!} - \sum_{m=1}^{\infty} \frac{(2-p-y)^{m}}{m!} = e^{2-p} - e^{2-p-y} $$
Case 2: y is larger than 2-p
The second case is special, because this time the second player has a chance to get her number with the first pick. We can cover this case by defining
$$ f_{S'_0}(s) := \delta(s) $$
which means that the sum of zero uniform random values is always 0; the corresponding “pdf” is a Dirac delta that concentrates all probability mass at 0.
Secondly, the lower bound of the integration needs to be adjusted: since $2-p-y$ is always smaller than zero, and the pdfs of the $S'_i$ are zero for negative values, we start integrating at 0.
The rest of the calculation is the same, except this time we include m=1 :
As before, let’s assume that the second player needs $m \ge 1$ picks and her last number is $y^{(m)}$. Therefore, we have
$$ f_{Y^{(m)} \mid P,\; y > 2-p,\; m \ge 1}(y) = \int_{0}^{2-p} f_{S'_{m-1}}(s)\, ds = \begin{cases} \displaystyle\int_{0}^{2-p} \frac{s^{m-2}}{(m-2)!}\, ds & m \ge 2 \\[1ex] \displaystyle\int_{0}^{2-p} \delta(s)\, ds & m = 1 \end{cases} = \frac{(2-p)^{m-1}}{(m-1)!} $$
For $m \ge 2$ the integral evaluates to $\left. \frac{s^{m-1}}{(m-1)!} \right|_{0}^{2-p}$, and for $m = 1$ it equals $1 = \frac{(2-p)^0}{0!}$, so both cases reduce to the same expression.
Hence, the pdf of $y$ is:
$$ f_{Y \mid P,\; y > 2-p}(y) = \sum_{m=1}^{\infty} \frac{(2-p)^{m-1}}{(m-1)!} = \sum_{m=0}^{\infty} \frac{(2-p)^{m}}{m!} = e^{2-p} $$
So all in all the pdf of y given P is:
$$ f_{Y \mid P}(y) = \begin{cases} e^{2-p} - e^{2-p-y} & y < 2-p \\ e^{2-p} & y \ge 2-p \end{cases} = e^{2-p} - e^{2-p-y}\,\mathbf{1}_{[0,\,2-p[}(y) $$
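This conditional density can be checked by simulating the second player for a fixed $p$. In the sketch below, $p = 1.3$ and the threshold $y_0 = 0.5$ are arbitrary test values, and SciPy is used only to integrate the derived density:

```python
import math
import random
from scipy import integrate

p, y0 = 1.3, 0.5  # arbitrary test values: a fixed first sum over 1 and a threshold for y

# Simulate the second player for a fixed p: draw uniforms until the
# shifted sum exceeds 2 - p and return the last draw y.
def second_player(p):
    s = 0.0
    while True:
        u = random.random()
        s += u
        if s > 2 - p:
            return u

trials = 500_000
mc = sum(second_player(p) <= y0 for _ in range(trials)) / trials

# P[Y <= y0 | P = p] from the derived density e^(2-p) - e^(2-p-y) * 1_[0, 2-p)(y).
density = lambda y: math.exp(2 - p) - (math.exp(2 - p - y) if y < 2 - p else 0.0)
exact, _ = integrate.quad(density, 0, y0)
print(mc, exact)  # both should be roughly 0.21
```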
Joint distribution of x and y
The joint distribution of the picks of the first and second player is:
$$ f_{X,Y}(x,y) = \int_{1}^{1+x} f_{P,X}(p,x)\, f_{Y \mid P}(y \mid p)\, dp = \int_{1}^{1+x} e^{p-x} \left( e^{2-p} - e^{2-p-y}\,\mathbf{1}_{[0,\,2-p[}(y) \right) dp $$
Since $p \le 1+x$ we have
$$ 2-p \ge 2-(1+x) = 1-x $$
The indicator is therefore always one if $y < 1-x$, and
$$ f_{X,Y}(x,y) = \int_{1}^{1+x} e^{p-x} \left( e^{2-p} - e^{2-p-y} \right) dp = e^{2-x-y}\left(e^{y}-1\right)\, p \Big|_{1}^{1+x} = x\, e^{2-x-y}\left(e^{y}-1\right) \qquad (y < 1-x) $$
If, on the other hand, $x+y \ge 1$, then the indicator is only one if $p < 2-y$:
$$ f_{X,Y}(x,y) = \int_{1}^{1+x} e^{p-x}\, e^{2-p}\, dp + \int_{1}^{2-y} e^{p-x} \left( -e^{2-p-y} \right) dp = x\, e^{2-x} - (1-y)\, e^{2-x-y} \qquad (y \ge 1-x) $$
And we have:
$$ f_{X,Y}(x,y) = \begin{cases} x\, e^{2-x}\left(1 - e^{-y}\right) & x+y < 1 \\ e^{2-x-y}\left(x\, e^{y} - 1 + y\right) & x+y \ge 1 \end{cases} $$
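A natural sanity check is that this joint density integrates to one over the unit square. The following sketch verifies that numerically with SciPy:

```python
import math
from scipy import integrate

# The joint density derived above; it should integrate to 1 over [0,1] x [0,1].
def f_xy(y, x):
    if x + y < 1:
        return x * math.exp(2 - x) * (1 - math.exp(-y))
    return math.exp(2 - x - y) * (x * math.exp(y) - 1 + y)

total, _ = integrate.dblquad(f_xy, 0, 1, lambda x: 0, lambda x: 1)
print(total)  # should be very close to 1
```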
Probability of y bigger than x
Player two wins if $y$ is greater than $x$:
$$ P[y \ge x] = \int_{0}^{1} \int_{x}^{1} f_{X,Y}(x,y)\, dy\, dx $$
First, we tackle the case $x+y < 1$. Since also $y > x$, we have $0 \le x < 1/2$ and therefore:
$$ P[y \ge x,\; x+y<1] = \int_{0}^{1/2} \int_{x}^{1-x} x\, e^{2-x}\left(1 - e^{-y}\right) dy\, dx = \frac{1}{8}\left(5 + 40\sqrt{e} - 26e\right)e \tag{3} $$
There are two possibilities for the case $x+y \ge 1$:
- $0 \le x \le 1/2$ and $1-x \le y \le 1$
- $1/2 \le x \le 1$ and $x \le y \le 1$
$$ P[y \ge x,\; x+y \ge 1] = \int_{0}^{1/2} \int_{1-x}^{1} \left( x\, e^{2-x} - (1-y)\, e^{2-x-y} \right) dy\, dx + \int_{1/2}^{1} \int_{x}^{1} \left( x\, e^{2-x} - (1-y)\, e^{2-x-y} \right) dy\, dx = \frac{1}{4} + \frac{23e}{8} - 5e^{3/2} + 2e^{2} \tag{4} $$
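Before combining the two parts, Equations 3 and 4 can be cross-checked by evaluating the double integrals numerically (again with SciPy) and comparing them with the closed forms:

```python
import math
from scipy import integrate

e = math.e

# Equation 3: the region x + y < 1 (with y >= x).
eq3_num, _ = integrate.dblquad(lambda y, x: x * math.exp(2 - x) * (1 - math.exp(-y)),
                               0, 0.5, lambda x: x, lambda x: 1 - x)
eq3_closed = (5 + 40 * math.sqrt(e) - 26 * e) * e / 8

# Equation 4: the region x + y >= 1 (with y >= x), split into the two sub-regions above.
integrand = lambda y, x: x * math.exp(2 - x) - (1 - y) * math.exp(2 - x - y)
eq4_num = (integrate.dblquad(integrand, 0, 0.5, lambda x: 1 - x, lambda x: 1)[0]
           + integrate.dblquad(integrand, 0.5, 1, lambda x: x, lambda x: 1)[0])
eq4_closed = 0.25 + 23 * e / 8 - 5 * e ** 1.5 + 2 * e ** 2

print(eq3_num, eq3_closed)
print(eq4_num, eq4_closed)
```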
So finally we have from Equations 3 and 4:
$$ P[y \ge x] = P[y \ge x,\; x+y<1] + P[y \ge x,\; x+y \ge 1] = \frac{1}{8}\left(5 + 40\sqrt{e} - 26e\right)e + \frac{1}{4} + \frac{23e}{8} - 5e^{3/2} + 2e^{2} = \frac{1}{4}\left(1 + 14e - 5e^{2}\right) $$
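As a final sanity check, a brute-force simulation of the whole game should reproduce this value. The sketch below (the helper name `second_player_wins` is just mine) plays many rounds and counts how often the second player wins:

```python
import math
import random

# Play one round: the first player draws until the sum exceeds 1, the second
# player continues until it exceeds 2; the second player wins if y > x.
def second_player_wins():
    s, x = 0.0, 0.0
    while s <= 1:
        x = random.random()
        s += x
    y = 0.0
    while s <= 2:
        y = random.random()
        s += y
    return y > x

trials = 1_000_000
print(sum(second_player_wins() for _ in range(trials)) / trials)  # Monte Carlo estimate
print((1 + 14 * math.e - 5 * math.e ** 2) / 4)                    # closed-form result
```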