First of all, suppose that there is no market-losing possibility
before one's first chance to turn the cube.
In other words, it is impossible to have winning chances
less than X or greater than 1-X when you first have the
option to turn the cube, where X := the equity at 2-away/1-away Crawford.
(It is generally accepted that X is approximately 30%.)
This is virtually impossible to prove rigorously, but I believe
no one would argue that it is false. If we accept this as a fact,
then the following proof works:
By symmetry, one's chance of winning the match, assuming perfect play, is 50%.
Consider the strategy "double at the first opportunity and play the checkers
optimally for a 1-point match". Since, by our "fact" above, the double will
always be accepted, the winner of the match is the winner of the game. The
probability of winning the game is 50%. Thus our strategy attains 50% equity
and is a version of optimal play.
QED
Note that there are other versions of optimal play. The theorem
"optimal play never requires a double if there is no market-losing sequence,"
together with the theorem above, implies that "double if and only if
there is a market-losing sequence" is also a version of optimal play.
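The reduction in this proof is easy to check mechanically. Below is a minimal sketch in Python: the game is modeled as a fair random walk between two equal players (an invented stand-in, not real backgammon), and per the "fact" above the immediate double is assumed always taken, so at 2-away/2-away the match winner is simply the game winner:

```python
# Toy check of "double at first opportunity" at 2-away/2-away.
# Assumptions (not real backgammon): the game is a fair integer random
# walk from the midpoint; the immediate double is always taken, so the
# cube reaches 2 and the match winner equals the game winner.
import random

def play_game(rng, start=10, top=20):
    """One toy game: a fair random walk; True if player A wins."""
    s = start
    while 0 < s < top:
        s += 1 if rng.random() < 0.5 else -1
    return s == top

def match_equity_double_immediately(trials=20_000, seed=1):
    """Match winning chances when an always-taken double comes at once."""
    rng = random.Random(seed)
    wins = sum(play_game(rng) for _ in range(trials))
    return wins / trials

print(match_equity_double_immediately())  # very close to 0.5
```

By symmetry of the walk, the estimate sits at 50%, which is the bound the proof says the strategy attains.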
Robert Koca (bobk on FIBS)
ko...@orie.cornell.edu
p.s. No one has responded yet to my request for a rigorous
proof that it is optimal to use a loose 1 to bear off, if possible, in
contactless positions.
1. Backgammon - Magriel (the best first solid book; where can I get it?)
   or Backgammon Handbook (read if I can't find the bible)
2. World Class Backgammon - Friedman
3. Genud vs. Dwek
   Advanced Backgammon - Robertie (read all of them)
   Reno 1986
4. Backgammon with the Champions - Goulding
This will take me a year, but do you agree with this? Where can I get these
books?
Thanks,
Tom
>
>I would like some helpful advice on what bg books I should read and. . .
>1. backgammon - magriel (the best 1st solid book, where can I get it)
> or backgammon handbook (read if can't find the bible)
>
>2. world class bg- Friedman
>
>3. Genuk vs dwek
> Advanced bg Robertie( read all of them)
> Reno 86
>
>4. Bg with the champions ----- Goulding
These look like good books for you to be learning from at this point.
Magriel's book is hard but not impossible to find these days. Much depends
on where you live. Try your local used book stores, where if you find one
it may cost as little as $10. I've only ever found one in Boston over the
past few years. Otherwise, you'll have to pay much more from someone who
traffics in used BG books, probably somewhere between $70 and $100.
Perhaps, though, some player will be willing to part with one more
cheaply! (There are libraries, too...).
I've enjoyed buying occasional items from Carol Joy Cole, who runs the
_Flint Area Backgammon News_. Contact her and ask for her used/rare book
list.
Carol Joy Cole
3003 Ridgecliffe Drive
Flint, MI 48532-3730
Phone/Fax: 810/232-9731
Carol Joy's newsletter has a full page menu of books & equipment every other
issue. She does carry the Kleinman books and some of the Robertie books,
too, so she may be your "all-purpose" merchant.
You can get the Robertie material and a nice catalog of various books and
equipment from Robertie himself at the following address:
The Gammon Press
P.O. Box 294
Arlington, MA 02174
tel. 617-641-2091
Fax: 617-641-2660
Whether or not it would be worth your money buying Magriel may depend on the
strength of your game *and* your ability to explain the reasons why you
make the moves you do. I played for many years before "studying" the game,
and worked up a pretty good understanding of concepts like "timing" without
ever being able to articulate them clearly. When I finally read Magriel I
didn't learn many shockingly new tactics, but I did consolidate the things
I had learned up to that point. So for me, that book didn't immediately
change my play, but it did lay the bedrock for my future studies. I'm sure
I've won much more than $100 thanks to that book, and I've known players who
could have saved themselves a lot of expensive "practical learning" by just
shelling out and studying it to begin with.
Hope this info. is useful to you.
Albert
P.S. --There may well be other sources for these materials out there. These
are the sources that I've used, though. I hope other people will post
further on this.
--
"When it was proclaimed that the Library contained all books, the
first impression was one of extravagant happiness. All men felt
themselves to be the masters of an intact and secret treasure."
-Jorge Luis Borges, "The Library of Babel"
Hmmmm. Suppose no one doubles for a while, since the game
seems about even.
Later, my opponent has a miracle roll, or maybe I roll something bad.
Suppose my opponent suddenly has more than a 70% advantage; I can drop the
cube, and still have a chance to win the match. If, on the other hand, the
cube were already at 2, then I might feel quite stupid for having doubled
early. I don't like feeling stupid. (I wonder if that factors in?)
Or -- the opposite happens and I can unexpectedly and suddenly
double my opponent out of the game. I now have 70% chance for the match.
I am no expert, but I tend to feel something can be gained by waiting so
a more informed decision can be made, rather than immediately basing the
outcome of the match on one game.
On a different subject, to disagree with earlier posts, I want to say
that "Paradoxes and Probabilities" by Barclay Cooke was worthwhile and
enjoyable for me about 15 years ago. It opened my eyes to different
aspects of evaluating plays and board positions. Later, I got stuff by
Magriel and Robertie which improved those skills. My first book,
Backgammon by Oswald Jacoby, was a worthwhile beginner book.
----- Robert D. Johnson rjoh...@cvbnet.cv.com "rjohnson" on FIBS
Scenario 1:
|
|Later, my opponent has a miracle roll, or maybe I roll something bad.
|Suppose my opponent suddenly has more than 70% advantage; I can drop a
|cube, and have a chance to win the match. If, on the otherhand, the
|cube was already at 2, then I might feel quite stupid for having doubled
|early. I don't like feeling stupid. (I wonder if that measures in?)
Ok
Scenario 2:
|
|Or -- the opposite happens and I can unexpectedly and suddenly
|double my opponent out of the game. I now have 70% chance for the match.
So you would wish you had doubled early then! Then you would have more
than a 70% chance.
These 2 scenarios balance each other out.
|
|I am no expert, but I tend to feel something can be gained by waiting so
|a more informed decision can be made, rather than immediately basing the
|outcome of the match on one game.
If, before the game starts, I promise you that I will always double at my
first opportunity, then clearly my match winning chances are exactly 50%,
assuming that we play equally well.
If you play perfect backgammon, then clearly 50% is the best that I can
hope to do against you (!)
So I can do no worse by doubling on the first move than if I had decided
to use some different doubling strategy.
To put it better: one should always(!) double on the first move of the
2-away/2-away game, unless either (1) he feels his opponent is more
likely to make a doubling-error later in the game, or (2) his pride won't
let him double, or (3) a combination of (1) and (2).
|
|On a different subject, to disagree with earlier posts, I want to say
|that "Paradoxes and Probabilities" by Barklay Cooke was worthwhile and
|enjoyable for me about 15 years ago. It opened my eyes to different
|aspects of evaluating plays and board positions. Later, I got stuff by
|Magriel and Robertie which improved those skills. My first book,
|Backgammon by Oswald Jacoby, was a worthwhile beginner book.
|
|----- Robert D. Johnson rjoh...@cvbnet.cv.com "rjohnson" on FIBS
Btw, your post just reached my newsfeed (agate.berkeley.edu) today after
4 days... I wonder what the delay was..
Chris
If you're behind, you might as well not double, in case your opponent
forgets to. If you're ahead, even if you have a market losing sequence,
it might be worthwhile not doubling to give your opponent a chance to
make a mistake later.
-michael j zehr
If you are arguing that the strategy "double at the first opportunity"
does not yield exactly 50% match winning chances, then I disagree and
argue as follows:
Let's suppose that with a dead-cube, we both play perfectly. Let's also
suppose that you are a perfect player - you not only play your checkers
perfectly, but you also handle the cube perfectly.
Since my cubeless chance of winning the next game, assuming it is played
to completion, is exactly 50%, then
if I can develop a doubling strategy that will always guarantee the
current game to be played to completion, with the cube ending at the 2-level,
then my match winning chances exactly equal my chances of winning one
game, played to completion, i.e. exactly 50%.
Any of the following (these are only examples - you can come up with
others of course) doubling strategies satisfy this condition:
1. Double at the first opportunity
2. Double iff I have a market loser [note: if you don't adopt this same
strategy, then I will have > 50% match chances; however, we've assumed
that you are a perfect player]
3. Double iff either (1) my cubeless chances are less than 50%, or (2) I
have a market loser
4. Double iff either (1) my cubeless chances are between 30% and 40%, or (2) I
have a market loser [30% is the approximate take/drop pt.]
5. Double before any moves have been played (if you allow this breach of
standard rules)
Suppose that before the 2-away/2-away game begins, I select one of the 5
doubling strategies above. Then I ask an impartial observer what my
match winning chances are. In all cases, the observer will have to
reason as follows: "This is the last game of the match, and it will be
played to completion. I know that Chris can win 50% of the time in a "1
pt. match," therefore, Chris must have exactly 50% match winning chances."
It does not matter where the double comes -- as long as I use a doubling
strategy that guarantees the game to be played to completion, with the
cube ending on 2, then I have reduced the match to exactly the next game,
which I have a 50% chance of winning.
Of course my actual winning chance will vary from move to move, and if when
I finally double, my winning chance is less than 50%, then at this point my
match winning chance will also be less than 50%. I'm not claiming my
match winning chance is 50% at the point of my double, I merely claim
someone who uses one of the 5 above doubling strategies will have exactly
50% match winning chances at the BEGINNING of the game, before any dice
have been rolled. Thus, if I can play a 1 pt. match perfectly, then
using any of the above strategies will give me exactly 50% winning
chances against a completely (checker & cube) player (God).
I hope this is convincing.
Chris
Some quick error corrections:
>2. Double iff I have a market loser [note: if you don't adopt this same
>strategy, then I will have > 50% match chances; however, we've assumed
>that you are a perfect player]
"iff" is too strong within my "note," I mean to say:
"2. Double iff I have a market loser [note: if you don't adopt a similar
strategy of at least doubling if you have a market loser (unless your
market losing sequence results in all of your wins being gammons), then I
will have > 50% match chances; however we've assumed that you are a perfect
player]"
(if some of the statements above seem difficult to justify, then think
about them for awhile)
>chances against a completely (checker & cube) player (God).
should read: "... a completely perfect (checker & cube) player..."
>Michael Zehr wrote:
>The problem with the proof "if I always double at first opportunity then
>I'm 50% because the game starts even" is that you *can't* double when the
>game is even (before either player rolls opening die) and afterwards
>it's no longer a 50% game.
>"If you're behind, you might as well not double, in case your opponent
>forgets to. If you're ahead, even if you have a market losing sequence,
>it might be worthwhile not doubling to give your opponent a chance to
>make a mistake later."
----
If my opponent is a perfect player, then my match equity is at most 50%.
The strategy "double at the first chance and play optimally for a 1-point
match" attains this bound. End of proof.
At the time of the double, of course, the game is no longer exactly 50%.
However, the average over all the doubles does equal 50%.
I stated that my model considered optimal play from the opponent.
If your opponent plays suboptimally you can maybe do better by waiting
as Michaelz suggests. Of course you might be the one who goofs up.
For practical play, I would suggest doubling immediately when playing an
opponent whose lessons cost more than yours. Another consideration
not included in my model is mental fatigue in a tournament. If
you have more matches that day and are playing a strong opponent,
then this would argue for doubling immediately.
,Bob Koca (bobk on FIBS)
ko...@orie.cornell.edu
The only point under dispute here is whether to automatically double even
in a losing position.
The two possible strategies are (1): automatically double even in a
mildly losing position; (2) Double whenever a market-losing roll is
available (even in a losing position).
For what it's worth, I have verified by computer that (1) is the correct
strategy for non-contact race positions: 6-6 followed by a small roll is a
market loser even in long races. For a 120-pip no-contact race, the double
point comes when the probability of winning exceeds 37% (about a -10 pip
count), but this does assume that the opponent will follow the correct
strategy.
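The figures quoted above are the poster's computer results. As a rough sanity check, a crude Monte Carlo pip race can be sketched in Python: each turn subtracts a dice roll (doubles counted four times) from a raw pip count, ignoring bearoff wastage and checker distribution, so the numbers land only in the right neighborhood, not on exact values:

```python
# Crude pip-race Monte Carlo: no bearoff wastage, no checker
# distribution, just raw pip counts.  Only a neighborhood check on the
# race figures quoted above, not a verification of them.
import random

def pips_rolled(rng):
    """Pips from one roll of two dice; doubles play four times."""
    a, b = rng.randint(1, 6), rng.randint(1, 6)
    return 4 * a if a == b else a + b

def race_win_prob(on_roll_pips, opp_pips, trials=20_000, seed=7):
    """Estimate the on-roll player's cubeless winning chances."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        mine, theirs = on_roll_pips, opp_pips
        while True:
            mine -= pips_rolled(rng)
            if mine <= 0:
                wins += 1
                break
            theirs -= pips_rolled(rng)
            if theirs <= 0:
                break
    return wins / trials

# Trailer on roll, down 10 pips in a roughly 120-pip race:
print(race_win_prob(120, 110))
```

In this crude model the trailer's chances come out well below 50% but comfortably above the take-point region, consistent with the idea that even a trailing position can be close to (or past) the 2-away/2-away doubling window.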
There are three issues here. (1) the probability of winning the game is
*not* 50% at the first opportunity to roll; (2) the opponent may *not*
follow optimal doubling strategy; (3) the automatic double strategy may
produce a locally optimal rather than a globally optimal probability of
winning (i.e. the strategy is better than strategies *similar* to it, but
not than significantly different strategies).
I think it's probably a good idea to provide a concrete example for analysis:
Consider this scenario: playing at -2,-2 match point; white rolls 3-1;
black's cube action? I'm not certain of the exact value of a 3-1 roll, but
let's say white is now roughly 55%+ to win the game, but definitely
significantly less than 70%, the undisputed drop point for an early double
in this position. After a double, match equity is 55%. If black does not
double, and is still trailing after his next roll, white should almost
certainly double since rolls like 4-4 are likely to be market losers.
So black doesn't double, and rolls 3-3, playing 8-5(2) 1-4(2) or 8-5(2)
6-3(2). Now black has an advantage; let's say about 55% again.
The situation then to consider is the sequence where black and white
alternately roll super rolls from disadvantaged positions, eventually
leading to a position where a market-losing roll is available. Assume the
correct strategy is double only when a market losing roll is available
(even from a losing position). Assume that the probability of a swing roll
(a roll which moves the player from a disadvantaged position to an
advantaged position) is somewhere in the neighborhood of 5% for
convenience (1 roll in 36 is too few, 2 rolls in 36 is perhaps a little
high). Assume white's opening roll gives a 2.5% advantage and that the
swing rolls will add 5% to the roller's probability of winning. Further
assume that failure to roll a swing roll results in a double by the
opponent. We can now estimate black's match equity. Now, finally, assume
that all these assumptions bear some resemblance to reality in at least
one real-world situation ;->.
Let p be the probability of winning the game, P(U) be the probability of
winning the match without doubling, P(D) be the probability of winning the
match after black doubles. Is P(U) greater than P(D)?
P(D) is clearly
equal to p, which is 47.5% in this scenario. Consider the event W (for
wipeout) where black does not roll a swing roll and white doubles next
turn. Since there are no market losers, P(D)|W == P(U)|W (the P's
conditional on the event W occurring), since black must accept, and the cube
ends up at the same point in both scenarios. Now, consider the event
where black rolls a swing roll leaving white in essentially the same
position that black is in, and presumably, the same cube action (No
double). P(D)|S (the probability of black winning the match after doubling
and rolling a swing roll) is thus 52.5%. P(U)|S is a little harder to
calculate, but can be established from a recurrence relation. Assume for
convenience that not-W == S (all rolls which are not W are S). Let p(S) be
the probability of rolling a swing roll...
then (1)
P(U)|S = P'(U) = 1-P(U);
since rolling a swing roll leaves white in essentially the same position
as black is currently in,
and (2):
P(U) = p(S)*P(U)|S + p(W)*P(U)|W        -- (W and S are disjoint)
     = p(S)*P(U)|S + (1-p(S))*P(U)|W
     = p(S)*P(U)|S + (1-p(S))*P(D)|W
since W and S are complements by assumption.
Solving (1) and (2):
P(U) = p(S)*(1-P(U))+(1-p(S))*P(D)|W. (3)
Now,
P(D) = p(S)*P(D)|S + (1-p(S))*P(D)|W    (4) -- W and S are disjoint.
and eliminating the term (1-p(S))*P(D)|W from (3) and (4):
P(D) - p(S)*P(D)|S = P(U) - p(S)*(1-P(U))
                   = (1+p(S))*P(U) - p(S)
P(U) = (P(D) - p(S)*P(D)|S + p(S))/(1+p(S))
substituting P(D)|S = 1-P(D) yields:
P(U) = [P(D) - p(S)*(1-P(D)) + p(S)]/(1+p(S))
P(U) = P(D) !
The critical assumption to examine is:
P(D)|S = 1 - P(D).
Therefore, this *suggests*, very loosely, that the two strategies are at
least very nearly equal!
Equations are unchecked. I need to think about this some more.
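The unchecked equations above can be verified numerically. Under the post's assumptions (P(U)|S = 1 - P(U), P(D)|S = 1 - P(D), and P(U)|W = P(D)|W), equations (3) and (4) reduce to the very same linear equation, which is exactly why P(U) = P(D) falls out. A quick Python check:

```python
# Numeric check of the recurrence above, under the post's assumptions:
# P(U)|S = 1 - P(U), P(D)|S = 1 - P(D), and P(U)|W = P(D)|W (written
# PDW below).  Both equations solve to the same expression.
import random

def solve_PU(p_S, PDW):
    # Eq. (3): P(U) = p(S)*(1 - P(U)) + (1 - p(S))*P(D)|W, solved for P(U)
    return (p_S + (1.0 - p_S) * PDW) / (1.0 + p_S)

def solve_PD(p_S, PDW):
    # Eq. (4) with P(D)|S = 1 - P(D):
    # P(D) = p(S)*(1 - P(D)) + (1 - p(S))*P(D)|W, solved for P(D)
    return (p_S + (1.0 - p_S) * PDW) / (1.0 + p_S)

rng = random.Random(0)
for _ in range(1000):
    p_S, PDW = rng.random(), rng.random()
    assert solve_PU(p_S, PDW) == solve_PD(p_S, PDW)
print("P(U) == P(D) for every sampled p(S) and P(D)|W")
```

Once the symmetric substitutions are made, (3) and (4) are literally identical equations in one unknown, so P(U) = P(D) holds for any swing probability and any wipeout value, not just the 47.5%/52.5% numbers in the scenario.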
I don't see any mention of whether the actual double comes at > 50% or <
50% in the argument above, so I don't see where there's a dispute.
I think the argument is simple: if one player chooses to use the strategy
"double at the 1st opportunity," then the winner of the match is precisely
the winner of the next game. The match has been reduced to "next game
wins." Thus, a 50% match chance. If you still can't see this, then
consider playing the same game with someone who actually believes the
match to be -1:-1, and some spectator walking by the board on the 1st
move and brushing the cube up to 2.
>
>The two possible strategies are (1): automatically double even in a
>mildly losing position; (2) Double whenever a market-losing roll is
>available (even in a losing position).
I guess this means: the 2 strategies that you're going to consider below.
Also, from reading your example below, you've made the further assumption
that a player who is winning (say 52.5% w.c.) always has a market loser.
For the sake of argument, ok.
>
>For what it's worth, I have verified by computer that (1) is the correct
>strategy for non-contact race positions: 6-6/small roll is a market loser
>even in long race positions. The double point for a 120 point no-contact
>racing positions is double when probability of winning is greater than 37%
>for a 120 pip race (about -10 pipcount), but it does assume that the
>opponent will follow correct strategy.
>
>There are three issues here. (1) the probability of winning the game is
>*not* 50% at the first opportunity to roll; (2) the opponent may *not*
>follow optimal doubling strategy;
Since we want to determine optimal play, we need to consider that our
opponent puts up the best fight possible, i.e. plays optimally himself.
>(3) the automatic double strategy may
>produce a locally optimal rather than a globally optimal probability of
>winning (ie the strategy is better than strategies *similar* to it, but
>not to significantly different strategies).
I don't see how this can possibly be the case! Backgammon is a game of
perfect information -- the current position is always known to both
players, and there is no hidden information, among other things. Chess
and checkers share this property, while games such as poker,
bridge, and rock-scissors-paper do not.
A perfect player is guaranteed to always have at least 50% match winning
chances, even if he announces his strategy in advance and is forced to
follow it.
>
>I think its probably a good idea to provide a concrete example for analysis:
>
>Consider this scenario: playing at -2,-2 match point; white rolls 3-1;
>black cube action? I'm not certain of the exact value of a 3-1 roll, but
>lets say white is now roughly 55%+ to win the game but definitely
>significanly less than 70%, the undisputed drop point for an early double
>in this position. After a double, match equity is 55%. If black does not
>double, and is still trailing after his next roll, white should almost
>certainly double since rolls like 4-4 are likely to be market losers.
>
>So. Black doesn't double, and rolls 3-3 playing 8-5(2) 1-4(2) or 8-5(2)
>6-3(2). Now black has an advantage. Lets say about 55% again.
>
>The situation then to consider is the sequence where black and white
>alternately roll super rolls from disadvantaged positions, eventualy
>leading to a position where market losing roll is available. Assume the
>correct strategy is double only when a market losing roll is available
>(even from a losing position). Assume that the probability of a swing roll
>(a roll which moves the player from a disadvantaged position to an
>advantaged position) is somewhere in the neighborhood of 5% for
>convenience (1 roll in 36 is too few, 2 rolls in 36 is perhaps a little
>high). Assume white's opening roll gives a 2.5% advantage and that the
>swing rolls will add 5% to the rollers probability of winning. Further
>assume that failure to roll a swing roll results in a double by the
>oponent. We can now estimate black's match equity. Now, finally, assume
>that all these assumptions bear some resemblance to reality in at least
>one real-world situation ;->.
A bit of an artificial example, but ok... Maybe you want to change 55%
to 52.5%, if I understand your example correctly. Also, if you are
considering the specific case where white and black alternately exchange
"super rolls" until one fails to roll the super roll (at which point the
opponent doubles), then I'm not sure what relevance the actual probability
of rolling these super rolls has, but again that's not important.
>
>Let p be the probability of winning the game, P(U) be the probability of
>winning the match without doubling,
I guess you mean without doubling on *this* roll.
>P(D) be the probability of winning the
>match after black doubles. Is P(U) greater than P(D)?
>
>P(D) is clearly
>equal to p, which is 47.5% in this scenario. Consider the event W (for
>wipeout) where black does not roll a swing roll and white doubles next
>turn. Since there are no market losers, P(D)|W == P(U)|W (P's
>conditional on the event W occuring) since black must accept, and the cube
>ends up at the same point in both scenarios. Now, consider the event
>where black rolls a swing roll leaving white in essentially the same
>position that black is in, and presumably, the same cube action (No
>double). P(D)|S (the probability of black winning the match after doubling
>and rolling a swing roll) is thus 52.5%. P(U)|S is a little harder to
>calculate, but can be established from a recurrence relation. Assume for
>convenience that /W == S (all rolls which are not W are S). Let p(S) be
>the probability of rolling a swing roll...
No need to bring p(S) into the picture because you are restricting
yourself to only the cases where S actually happens.
I think you are overanalyzing this... Why not just
assume nothing about white's strategy and just try to derive black's
ideal strategy? As I've mentioned above, if black is able to derive an
ideal strategy, he will not lose any equity if his strategy becomes known
to white.
Having said this, P(U)|S <= 52.5% because white always has the option to
double on his turn if it is in his best interests. P(U)|S >= 52.5%
because it is now black's turn again and he has 52.5% w.c., so he will
double if it is in his best interests. So P(U)|S = 52.5% = P(D)|S.
So black had an optional double.
Sorry, but I have no idea whether your equations are right or not - (I'm too
tired...) Also, this is a very peculiar example where white and black
alternate being a 52.5% favorite up to some point. You've used the fact
that after a swing roll by white (for example), white has the exact same
prob. of winning as black did the roll before after black had rolled his
swing roll. It seems like your equations, as nice-looking as they may be,
don't generalize well.
Instead of dozens of calculations, let's try to use a little logic to try
to determine a player's optimal doubling strategy:
Suppose A is playing B in a -2:-2 match. Suppose A is on roll:
Definitions:
1. A market losing sequence (or a market loser) is a 2 roll (2-ply) dice
sequence (A rolls, and then B rolls) such that when it's A's turn again,
B no longer has a takeable position.
2. A passing roll is a roll (by A) such that now A's position is
untakeable, should B double next.
Now, let's look at the 2 main cases that have received so much "controversy":
Suppose A currently does not have a cash:
1. A has a market loser, but no passing rolls
Now, let's look ahead:
If A does NOT roll a market loser, his equity, had he not doubled, would
not be any higher than if he had given the cube away, since B always has
the option to double on his turn if it is to B's advantage (and A must
accept). Clearly his equity cannot be any lower, because A always has the
option to double now himself (and B must accept). So his equity must be
exactly the same, whether he gave the cube away or not.
If A does roll a market loser, his equity had he doubled would be p
(p > 70%), his current game-winning chance. Had he not doubled, he must
now either double B out and accept 70% equity (i.e., he would have had
better equity had he doubled), or continue to play for the gammon. If he
continues to play for the gammon, then there are 4 possible results:
(a) He wins a gammon
End result: same equity whether he had doubled (at the market loser
possibility above) or whether he had chosen not to double at that point.
(b) He is forced to double-out later, when going for the gammon is no
longer profitable, or goes on to only win a single game
He would have been better off had he doubled
(c) He later doubles B back into the game, after falling below 70%
No harm done... equity is the same of course.
(d) He loses
Equity in either case the same!
So, his equity, had he chosen not to double, is <=p, (p = equity had he
doubled) with equality iff (b) never happens.
Putting it all together,
A has a mandatory double, unless all of his market losing sequences
result in all of the corresponding wins being gammons, in which case the
double is optional (i.e. if he never has to settle for only a single win
after rolling one of his market losers). I actually believe that no position
satisfies all of (1) market losers, (2) no passing rolls, and (3) no single
wins after a market loser occurs, but this isn't really important.
So with a rare exception (if it can even happen), A has a mandatory double.
2. A has no market losers, no passing rolls
Let p = A's game winning chances
Claim: A's match equity is p whether he doubles now or holds the cube.
If A doubles, B accepts and A's match equity is p.
If A doesn't double, A's equity is <= p since B always has the option of
doubling himself next turn. A's equity is >= p since A has no market
losers and can always double on the next turn if it's in his best interests.
Therefore, A's equity is again p.
Therefore, A has an optional double.
Corollary: doubling at the 1st opportunity is optional.
The strategy described above is the ideal play for the 2 cases
considered. If A does not follow the ideal way of playing, he will
suffer (< 50% equity) playing against a perfect player. If A plays
ideally, he will always achieve at least 50% equity, no matter what
strategy B uses.
Chris
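The case analysis above can be illustrated with a toy model. In the sketch below (an invented martingale, not backgammon: the winning chance moves 0.05 per ply, the take point is assumed to be 0.70, and a "market loser" means the chance could pass the take point within two plies), both disputed strategies always get the cube to 2 while the double is still takeable, so each reduces the 2-away/2-away match to the single game:

```python
# Toy 2-away/2-away model comparing the two disputed strategies:
# (A) double at the first opportunity; (B) double only when a market
# loser exists.  Assumptions: the winning chance is a fair 0.05-step
# walk on [0, 1]; the take point is 0.70; a market loser exists once
# the chance reaches 0.65 (two plies from passing 0.70).
import random

TAKE_POINT = 0.70        # assumed take point in this toy model
MARKET_LOSER_AT = 0.65   # first grid point two plies from the take point

def play_match(rng, strategy):
    """One toy match; returns (player1_won, cube_turned)."""
    p = 0.5              # player 1's chance of winning the game
    doubled = False
    while 0.0 < p < 1.0:
        if not doubled:
            if strategy == "first_opportunity":
                doubled = True
            else:  # "market_loser", applied symmetrically to both players
                if p >= MARKET_LOSER_AT or p <= 1 - MARKET_LOSER_AT:
                    doubled = True   # double still lands while takeable
        p += 0.05 if rng.random() < 0.5 else -0.05
        p = round(p, 2)              # keep the walk on exact grid points
    return p >= 1.0, doubled         # cube on 2 either way: game = match

for strategy in ("first_opportunity", "market_loser"):
    rng = random.Random(42)
    results = [play_match(rng, strategy) for _ in range(20_000)]
    assert all(turned for _, turned in results)  # cube always reaches 2
    print(strategy, sum(won for won, _ in results) / len(results))
```

Both strategies print match winning chances near 0.50: since the walk must pass 0.65 (or 0.35) before it can reach 1 (or 0), the market-loser double always arrives in time and is always taken, which is the thread's conclusion that the double is optional rather than equity-changing.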