
doubling theory


Michael J Zehr

Jun 10, 1992, 9:49:14 PM
In article <1992Jun5.1...@math.utexas.edu> eps...@rene.ma.utexas.edu (Paul Epstein) writes:
>As a mathematics student and a relative newcomer to the game, I
>would be interested to know how decisions about doubling are affected
>by the scores in a match.
>Suppose [match to 21, leader holding 4-cube at 15-8, no gammons possible.]
>
>How high does A's probability of winning need to be before A should decide
>to double?
>How high does B's probability of winning need to be before B should accept
>a double offered by A?

[Although I'm sure a lot of you know how to solve this, I'll post a
detailed description of how to solve such problems so that those of you
who don't can see how it's done. Please forgive any verbosity...]

You can't really answer this question without a feel for a player's
chance of winning at any given score. For this you use a match equity
table, which lists winning chances when each player is a given number
of games away from winning the match. For example:


Here's the one in _World Class Backgammon, Move by Move_ by Roy Friedman.

      1:   2:   3:   4:   5:   6:   7:   8:   9:
 1:   50   66   75   80   83   87   89   92   94
 2:   34   50   59   64   72   77   82   85   88
 3:   25   41   50   56   63   69   74   78   82
 4:   20   36   44   50   57   63   68   72   77
 5:   17   28   37   43   50   56   62   66   71
 6:   13   23   31   37   44   50   56   60   65
 7:   10   18   26   32   38   44   50   55   60
 8:    8   15   22   28   34   40   45   50   55
 9:    6   12   18   23   29   35   40   45   50

The numbers down the left and across the top are the number of points
each player still needs, and each entry is the row player's percentage
chance of winning the match.

For those of you unfamiliar with such a table, keep in mind that leading
3-2 in a match to 5 gives you the same chance to win as leading 5-4 in a
match to 7. So a position is usually referred to as x-away, y-away.
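
If you like to tinker, here's a tiny sketch (in Python -- my own
addition, not anything from Friedman's book; the FRIEDMAN name and the
equity function are just made up for illustration) of one way to encode
the table above and look up a score:

# Friedman's table, indexed by how many points I need (row) and how many
# my opponent needs (column); entries are my percent chance of winning the match.
FRIEDMAN = {
    1: [50, 66, 75, 80, 83, 87, 89, 92, 94],
    2: [34, 50, 59, 64, 72, 77, 82, 85, 88],
    3: [25, 41, 50, 56, 63, 69, 74, 78, 82],
    4: [20, 36, 44, 50, 57, 63, 68, 72, 77],
    5: [17, 28, 37, 43, 50, 56, 62, 66, 71],
    6: [13, 23, 31, 37, 44, 50, 56, 60, 65],
    7: [10, 18, 26, 32, 38, 44, 50, 55, 60],
    8: [8, 15, 22, 28, 34, 40, 45, 50, 55],
    9: [6, 12, 18, 23, 29, 35, 40, 45, 50],
}

def equity(my_away, opp_away):
    """My match-winning chance, in percent, at this score."""
    return FRIEDMAN[my_away][opp_away - 1]

print(equity(2, 3))   # leading 3-2 in a 5-point match: 59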

Back to our problem. A is leading 6-away, 13-away. This table doesn't
go that high, so there are two possibilities: gut instinct or another
table. *grin*

My estimate at 6-away 13-away says something like 85%. Kit Woolsey's
new table says 82. He's put a lot of research into it so let's believe
him. Now as a matter of fact, the match equity at the current score
doesn't mean much. What we really want is to look at the possible
outcomes of this game. If A wins with the cube staying at 4, the score
becomes 19-8, i.e. 2-away 13-away, which gives a match equity of 96.
If A loses, the score is 15-12, i.e. 6-away 9-away, which gives an
equity of 67. (The table above says 65, but let's use all of Kit's
figures.)

So A's current chances are 96p + 67(1-p) or 67 + 29p, where p is A's
chance of winning this game.

A should only double if A's match equity goes up after a double and a
take. (We've already ruled out gammons in our assumptions above.)

Now what if A doubles and there's an 8-cube? If A wins, the match is
over (15 + 8 = 23); if B wins, the score is 15-16, which is 6-away
5-away, or 43% for A. So A's chances have gone to 100p + 43(1-p)
or 43 + 57p.

Where is the break even point? When the two values are equal. So we have:
67+29p = 43+57p
24 = 28p
24/28 = p

or about 86%! So A has to have a very big lead in the game to double at
this score with the cube at 4.
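
To check that arithmetic, here's a short sketch in the same spirit as
the table snippet above (the function names are mine; the 96, 67 and 43
are Kit's figures quoted above):

# A's match equity (in percent) as a function of p, A's chance of
# winning this game.
def no_double(p):
    return 96 * p + 67 * (1 - p)       # cube stays on 4

def double_take(p):
    return 100 * p + 43 * (1 - p)      # cube goes to 8, B never redoubles

p = 24 / 28                            # break-even: 67 + 29p = 43 + 57p
print(round(p, 3))                     # 0.857 -- about 86%
print(round(no_double(p), 1), round(double_take(p), 1))   # both about 91.9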

But, there's another factor we need to consider -- should B take, and if
so what should B do? If B takes, then if A wins the game, A wins the
match, but if B wins the game, B is only about even. Since a win for A
already wins the match, B loses nothing by doubling. So B should double
back immediately! Thus whoever wins the game wins the match.

After a redouble by B, A's chances are 100p. So we plug that in:

67+29p = 100p
67 = 71p
67/71 = p = 94%

So A has to have a 94% or greater chance of winning to gain by doubling!
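
Same kind of check for the redouble case (again just a sketch using the
figures above):

# Once B redoubles to 16, whoever wins the game wins the match,
# so A's equity after double/take/redouble is simply 100p.
def double_take_redouble(p):
    return 100 * p

p = 67 / 71                            # break-even: 67 + 29p = 100p
print(round(p, 3))                     # 0.944 -- A needs roughly 94% to double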

Now there's another question, though: should B accept? If B drops, the
score becomes 19-8, so B's match equity is 100 - 96 = 4. After a take
and redouble, B's chance is 100 - 100p. Let's see where these are equal:

4 = 100 - 100p
100p = 96
p = 96%

Recall that p has stood for A's chance of winning all along, so B
should accept with a 4% or greater chance of winning...
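
And B's side of it, in the same sketch form (the drop equity of 4 comes
from the table figures above):

# If B passes the 8-cube the score becomes 19-8, so B keeps 100 - 96 = 4 percent.
drop_equity = 4

def take_equity(p):                    # after take and automatic redouble
    return 100 - 100 * p               # B wins the match iff B wins the game

p_take = (100 - drop_equity) / 100     # break-even: 4 = 100 - 100p
print(p_take)                          # 0.96 -- B takes whenever A is below 96%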

So... A should double with a chance of 94% or better, B should accept if
A's chances are below 96%, and drop if A's chances are above 96%.

This is one of those cases where A can't use all of the value of the cube
by doubling. (A needs 6 points to win; turning the cube gives B the chance
to win at least 8, and in fact 16!) Whenever you're in this situation, you
have to be very, very slow to double!

Hope this helps (and is mostly correct!)

-michael j zehr

Paul Epstein

Jun 5, 1992, 11:59:11 AM
As a mathematics student and a relative newcomer to the game, I
would be interested to know how decisions about doubling are affected
by the scores in a match.
It strikes me as strange that people sometimes give opinions about
doubling without stating the match-score. Perhaps in these instances,
the intended scenario is that of gambling for money.
In that scenario, it is clear that in a position where no gammons are
possible and where player B could never get any advantage by redoubling,
player B should accept A's double if and only if B's chances of winning
are at least 25%.
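
(For what it's worth, here is a quick check of that 25% figure -- a
sketch of my own, with the cube normalized to one point and q standing
for B's chance of winning the game:)

# Money play, no gammons, B never gains from redoubling.
def b_take(q):                         # cube turned from 1 to 2
    return q * 2 + (1 - q) * (-2)      # B's expected points if B takes

b_drop = -1.0                          # B simply loses the current 1 point

q = 0.25                               # break-even: 4q - 2 = -1
print(b_take(q), b_drop)               # both -1.0
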
I would like to pose the following problem as a random example.
Suppose the object of the match is to reach 21. Suppose A is leading B
15-8 with the cube set at 4. A possesses the cube.
Suppose no gammons are possible and that B could never get any advantage
by redoubling:

How high does A's probability of winning need to be before A should decide
to double?
How high does B's probability of winning need to be before B should accept
a double offered by A?

eps...@math.utexas.edu
