
The [0,1] game: Part 2


Jerrod Ankenman

Jan 14, 2003, 11:05:10 PM
The [0,1] game and Derivatives: A Study of Some Poker-Like Games
Bill Chen/Jerrod Ankenman

Part 2: Raising and Game #2

This is part 2 in a many-part investigation of what we will refer to as
the [0,1] game, which can in some ways be thought of as analogous to
poker and is actually interesting in its own right.

Recap to this point: In Part 1, we analyzed a very simple [0,1] game
where the pot is infinite, but only one bet is allowed. We found the
optimal strategy and the value of the game intuitively. Then we
created a method using indifference equations to solve the same
problem a little more laboriously, and got the same answers.

Some points that were raised that were not made explicit in Part 1:

We assume for the sake of these analyses that the distribution of hands
on the interval [0,1] is uniform. Later we will address games where one
or both players have non-uniform hand distributions (e.g., one player's
chance of getting a particular hand is that hand's value squared).

When we calculate the value of the game or do any EV calculations for
infinite-pot games, we are not concerned with the money already in the
pot, only with the money generated by the betting and raising that
takes place in our game. Later, when we address finite-pot games, we
will consider the money in the pot for purposes of fold/bluff
frequencies, but again, our phrase "the value of the game" will refer
only to betting and raising that takes place within the confines of
our game.


In this part: Game #2 and the effect of raising. Player Y gets to win
money!

Of course, poker is not nearly as simple as our little toy game (Game
#1). Once we allow raising, we give Y a mechanism to punish X for
betting bad hands. Previously, Y was forced to simply call and could
win at most one bet; now he can win two bets from X when X bets bad
hands.

Game #2: infinite pot, two bets allowed. Check-raise is not allowed.

OK, let's solve this one intuitively as well. Then we'll check our work
using the equilibrium equation method. From this point on, though, we
will start solving all the games using the equilibrium method, since the
intuitive answer can get a little complicated.

OK, start with X. X will bet some fraction of the time. Otherwise, X
will check. If X checks, then Y can either bet or check, same as last
time. However, if X bets, then Y can either raise or call.

Start with X betting.

Now, Y wants to raise with his best hands and just call with his worse
hands. He'll win an extra bet when he raises correctly, but lose an
extra bet if he raises incorrectly. Y will want to raise with half of
the hands that X would bet with, since he'll win an extra bet when X's
bet is in the worse half of his betting range, and break even when it's
in the better half. So we know that whatever fraction of hands X bets
with, Y will raise with half of them. What about when X checks? Well,
since there's no check-raising, it's the same as last time; Y will bet
the same hands as X would, plus half of the hands that he wouldn't.

Now let's try to figure out what X's strategy will be. What if X bet
all his hands, as previously? Then Y would never get to bet, and would
raise half the time. X would lose a bet, net, whenever he had a hand in
the worse half of hands and Y had a hand in the better half. That's 1/4
of the time. What if X checked all the time? Then Y would bet half the
time, and X would lose a bet just as often - 1/4 of the time. Can X do
better? What if X bet half the time? Then he would lose an extra bet to
Y's raise whenever Y had a hand from 0 to .25 and X had a hand from .25
to .5. And he would lose a bet to Y's value bet when X had a hand from
.75 to 1 and Y had a hand from .5 to .75. But this is only a total of
1/16 + 1/16, or 1/8 of the time. Try it and see if you can do better.
(You can't.)

So X's optimal strategy is to bet half the time, and Y's optimal
strategy is to raise with the top quarter of his hands and call the
rest of the time, and to bet the top three-quarters of his hands when
checked to. This yields 1/8 of a bet for player Y.
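The trial-and-error comparison above can be checked exactly. The
sketch below is our own verification aid, not part of the original
analysis; the function name and the area bookkeeping are ours. It
computes X's expectation as a function of the fraction b of hands he
bets, with Y responding as described: hands are uniform on [0,1],
lower is better, and with an infinite pot nobody ever folds.

```python
from fractions import Fraction as F

def x_ev(b):
    """X's expectation (in bets) when X bets his best b of hands.

    Y responds as described above: he raises the best half of X's
    betting range (hands below b/2) and, when checked to, bets hands
    below (1 + b)/2.  Lower numbers are better hands; nobody folds.
    """
    b = F(b)
    ev = F(0)
    # X bets and Y raises (y < b/2): two bets change hands.  Within
    # this region, x < y on area b^2/8 and x > y on area 3*b^2/8.
    ev += 2 * (b * b / 8 - 3 * b * b / 8)
    # X bets and Y just calls (y > b/2): one bet.  x > y on area
    # b^2/8 out of a total region area of b - b^2/2.
    ev += (b - 5 * b * b / 8) - b * b / 8
    # X checks and Y bets (y < (1+b)/2), X calls: one bet.  x < y on
    # area (1-b)^2/8 out of a total region area of (1 - b^2)/2.
    ev += (1 - b) ** 2 / 8 - ((1 - b * b) / 2 - (1 - b) ** 2 / 8)
    return ev

assert x_ev(F(1, 2)) == F(-1, 8)   # bet half: lose 1/8
assert x_ev(F(0)) == F(-1, 4)      # never bet: lose 1/4
assert x_ev(F(1)) == F(-1, 4)      # always bet: lose 1/4
# Sweeping b in hundredths, b = 1/2 is the unique best choice:
assert max((F(k, 100) for k in range(101)), key=x_ev) == F(1, 2)
```

Exact rational arithmetic confirms that betting half is X's best
choice and loses exactly 1/8 of a bet.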

Make sure you understand the above before continuing; we're going to
solve this algebraically now.

OK, start with X. X will bet some fraction of the time; last time we
called it x1, so let's use that again. Otherwise, X will check. If X
checks, then Y can either bet or check, same as last time. However, if X
bets, then Y can either raise or call.

Y now has TWO cutoff points. Call these y1 and y2.

y1 = the cutoff at which Y will bet if X checks.
y2 = the cutoff at which Y will raise if X bets.

Of course, y2 is going to be lower than y1, because Y will obviously
raise with fewer hands than he value bets.
0         y2        x1        y1         1
|---------|---------|---------|----------|

Now we need to write some indifference equations.

First of all, let's write the equation for y1. At y1, player X wants to
make player Y indifferent to betting or checking, just like last time.
And so this equation is actually the same as previously:

2y1 = x1 + 1 [1]
(this from part 1)

Next, let's write the cutoff for y2. Player X wants to make player Y
indifferent to raising or calling at y2.

The value of raising at y2:
X's hand    Result
0->y2       -2
y2->x1       2
x1->1       X did not bet

The value of calling at y2:
X's hand    Result
0->y2       -1
y2->x1       1
x1->1       X did not bet

EV(raise at y2) = EV(call at y2)
2*(x1-y2) - 2*(y2-0) = 1*(x1-y2) - y2
2x1 - 4y2 = x1 - 2y2
2y2 = x1 [2]

Okay, lastly we need to write the equation for x1. Player Y wants to
make X indifferent between checking and betting at x1.

When X bets at x1, the following things happen:
Y's hand    Result
0->y2       -2
y2->x1      -1
x1->1        1

When X checks at x1:
Y's hand    Result
0->x1       -1
x1->y1       1
y1->1        0

So we want to set these things equal:

EV(bet at x1) = EV(check at x1)
-2y2 - (x1-y2) + (1-x1) = -x1 + (y1-x1)
-y2 - 2x1 + 1 = -2x1 + y1
y1 = 1-y2 [3]

Interesting. What this equation says is that the distance from y1 to 0
should be the same as the distance from y2 to 1.
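Both simplifications can be double-checked mechanically: the
raise/call difference works out to x1 - 2*y2 for any thresholds, and
betting and checking tie exactly when y1 = 1 - y2. A minimal sketch
(the function names are ours):

```python
from fractions import Fraction as F

def raise_minus_call(x1, y2):
    # EV(raise at y2) - EV(call at y2), straight from the tables above
    return (2 * (x1 - y2) - 2 * y2) - ((x1 - y2) - y2)

def bet_minus_check(x1, y1, y2):
    # EV(bet at x1) - EV(check at x1), straight from the tables above
    return (-2 * y2 - (x1 - y2) + (1 - x1)) - (-x1 + (y1 - x1))

for x1 in (F(1, 3), F(1, 2), F(7, 10)):
    for y2 in (F(1, 5), F(1, 4), F(3, 10)):
        assert raise_minus_call(x1, y2) == x1 - 2 * y2   # zero iff [2]
        assert bet_minus_check(x1, 1 - y2, y2) == 0      # [3]: y1 = 1 - y2
```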

In any event, we have three equations now:
2y1 = x1 + 1 [1]
2y2 = x1 [2]
y1 = 1-y2 [3]

Solving,

2-2y2 = x1 + 1
2-2y2 = 2y2 + 1
4y2 = 1

y2 = 1/4
x1 = 1/2
y1 = 3/4

Presto! Exactly what we arrived at intuitively or by trial and error.
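The substitution above can also be replayed in exact rational
arithmetic (a quick sketch using Python's fractions module; nothing
here is specific to the game beyond the three equations):

```python
from fractions import Fraction as F

# [3] gives y1 = 1 - y2.  Substituting into [1]: 2*(1 - y2) = x1 + 1,
# so x1 = 1 - 2*y2.  Combined with [2], x1 = 2*y2, this forces
# 2*y2 = 1 - 2*y2, i.e. y2 = 1/4.
y2 = F(1, 4)
x1 = 2 * y2        # from [2]
y1 = 1 - y2        # from [3]

assert 2 * y1 == x1 + 1    # [1] is satisfied as well
print(y2, x1, y1)          # 1/4 1/2 3/4
```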

The value of this game, of course, is the same as we found above.
Here's another way to look at it:

X hand    Y hand    Result for X
0->y2     0->y2       0
          y2->x1      1
          x1->y1      1
          y1->1       1
y2->x1    0->y2      -2
          y2->x1      0
          x1->y1      1
          y1->1       1
x1->y1    0->y2      -1
          y2->x1     -1
          x1->y1      0
          y1->1       0
y1->1     0->y2      -1
          y2->x1     -1
          x1->y1     -1
          y1->1       0

Now, since it turns out that all these regions are 1/4 of a unit long,
each entry in the "Result for X" column carries a weight of 1/16 in
the expectation. Summing that column gives -2, which equates to -2/16,
or -1/8 of a bet of expectation for player X; this is the same value
we obtained earlier.
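As an independent check on this table, here is a quick Monte Carlo
sketch of the whole game at the optimal strategies (again our own
verification aid, not part of the original analysis; lower numbers are
better hands, and with the infinite pot nobody ever folds):

```python
import random

def payoff(x, y):
    """Bets won or lost by X when X holds x and Y holds y (lower wins)."""
    if x <= 0.5:                 # X bets his best half
        if y <= 0.25:            # Y raises his best quarter; X calls
            return 2 if x < y else -2
        return 1 if x < y else -1    # Y just calls
    if y <= 0.75:                # checked to, Y bets his best 3/4; X calls
        return 1 if x < y else -1
    return 0                     # both players check

random.seed(1)
n = 1_000_000
ev = sum(payoff(random.random(), random.random()) for _ in range(n)) / n
print(round(ev, 3))              # should come out near -1/8 = -0.125
```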

Also, this section introduced a notation that we will continue to use
throughout:
x1,x2,x3/y1,y2,y3

Generally, x_n or y_n will denote the threshold at which X or Y is
willing to put in the nth bet. As we expand to the infinite-bet cases,
this notation will become very useful. Note also that since there is
no way under the rules for X to put in the 2nd bet in this game, there
is no x2. However, when we solve the two-bet game with check-raising,
we will see x2 appear.

Next: Infinite bets, "r", and Game #3

T. Pascal

Jan 15, 2003, 2:54:01 PM
Jerrod Ankenman <jerroda...@yahoo.com> wrote in message news:<3E24DE52...@yahoo.com>...

> The [0,1] game and Derivatives: A Study of Some Poker-Like Games
> Bill Chen/Jerrod Ankenman
> [snip]

>
> Presto! Exactly what we arrived at intuitively or by trial and error.
>
I did not see any mention of pocket fives in the proof. Did you
assume that X will only bet with pocket fives? Please enlighten us.