Several years ago, there was a backgammon program that defeated the
world champion in an exhibition match. This was described in Scientific
American, and I have been told that the program was briefly sold packaged
on a PC for about $4000 around 1982. Of course, the general marketability
of a $4000 dedicated backgammon computer being what it is, the enterprise
quickly failed. (One could also argue that $4000 is better spent playing
against and learning from experts, but that's another story.)
At any rate, the computer won, but an analysis of the games revealed that
8-9 mistakes were actually made by the program, glossed over by the roll
of the dice. We all know this from club or tournament play; there is no
substitute for rolling well ;-) The difference between an expert and an
experienced club player is perhaps 3% (feel free to argue this one!)...
well, to me that translates into the lesser player winning 48.5% of the
individual games. That's why experts like to play matches to 11 or 17 points
and are very cautious (i.e. judicious) with the cube. The more games,
the greater their advantage over the lesser player. I believe the
exhibition match was to 9.
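To put a number on that "more games" effect: treat each game as an independent
coin flip that the weaker player wins with probability 0.485 (a big
simplification that ignores gammons and the cube), and model a match to n
points as a first-to-n-wins series. The function name and the coin-flip model
are my own illustration, not anything from the posts above:

```python
from math import comb

def series_win_prob(p, n):
    """Probability that a player who wins each game with probability p
    takes n games before the opponent does (negative binomial sum)."""
    q = 1.0 - p
    # The opponent may win k = 0..n-1 games before our n-th win;
    # the final game of the series is always ours.
    return sum(comb(n - 1 + k, k) * p**n * q**k for k in range(n))

# The 48.5% player's chances shrink as the series gets longer:
for n in (1, 7, 11, 17):
    print(n, round(series_win_prob(0.485, n), 3))
```

Even a 1.5% per-game edge compounds noticeably over a long series, which is
exactly the experts' reason for preferring longer matches.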
Personally, I can't imagine backgammon without gambling, and I
can't imagine gambling with a computer, so I also can't imagine
playing bg with a computer. To me, much of the challenge comes
from judging individual players and adjusting your play to take
advantage of those differentials between what's mathematically
correct and what would prove correct with that individual(s) in
that situation over a thousand trials.
For instance, as some people get down on the scoresheet near the
end of the session, not only will they take almost any cube,
they'll often beaver a bad cube. It's not that I'm just in it
for the money, but that much of the *excitement* comes from the
human aspects of the game. It's fairly easy to imagine that a
computer can calculate the best moves in non-contact situations,
and a computer could probably calculate such things as cube equity
correctly most of the time, but beating a computer with a well-timed
backgame will never be as satisfying as beating someone who's completely
devastated when it turns out they have to leave a couple of double
shots when they'd thought they had the game all locked up,
since they were already bearing off and you still had 4 men stuck
back on their 1 and 3 points!
: In article <1991Sep9.2...@reed.edu> juh...@bagpipe.Reed.Edu (Fritz Juhnke) writes:
: >I don't know if any of you also subscribe to rec.games.board, but there is an
: >interesting discussion about backgammon going on there. It has been claimed
: >that there are computer backgammon programs which are better than any humans.
: >Is this really the case? Can someone enlighten me on the state of computer
: >backgammon?
: Several years ago, there was a backgammon program that defeated the
: world champion in an exhibition match. [...]
: At any rate, the computer won, but an analysis of the games revealed that
: 8-9 mistakes were actually made by the program [...]
: I believe the exhibition match was to 9.
The match took place in '79, I believe, and I think it was a 7-point match.
A program developed by Hans Berliner of CMU defeated the winner of the
annual Monte Carlo tournament, Luigi Villa. (There's no "World Champion" in
backgammon in the same sense as there is at chess; the winner of the Monte
Carlo tournament is labeled the World Champion for some reason, despite the
lack of central organization and all the other trappings of the World Chess
Championship that backgammon lacks.) I remember seeing the complete
match score in some backgammon publication or other around '79-'80; the
machine didn't play as well as Villa did, but played pretty well nevertheless.
The statement that there are programs that play better than humans was
probably made by someone who heard something about the computer-vs.-Villa
match. It's false, anyway--the standard of human play has improved
considerably since then, and there has been hardly any subsequent effort
to extend Berliner's very good work on computer backgammon. A program
called Expert Backgammon,
recently on the market and available through various backgammon organizations,
seems to play a halfway-decent-but-not-that-great game.
I think Berliner made the "8-9 mistakes" claim (or a similar one) in his
subsequent Scientific American article on his program, his methodology,
and the match. I'd bet the machine made more mistakes than that; I don't
think anyone but Berliner did the analysis you mention--at least, he didn't
say otherwise in his article. I don't think Berliner's a recognized expert;
I know that he claimed the machine made a brilliant move with a roll of 22
in the last game of the match, when in fact the move was a blunder. (The
machine won a hopeless race by throwing multiple sets of doubles; its 22
move facilitated racing. This caused some on-site expert commentators to
label the move a good one, and probably influenced Berliner's thinking about
it. I'll dig up the article and post my analysis of the position, if
anyone wants me to.)
Anyway, Berliner's article (June 1980 Scientific American, I think) is
well worth reading, especially for people wishing to develop strong
backgammon programs.
If I remember correctly, Berliner's old program BKG didn't
know how to double except in running games. I could be wrong.
BKG did not compete in any of the Computer Olympiads. In fact
BKG may not be runnable any more; I think it was written in
BLISS for the TOPS-20, and we don't have any of those at CMU
any more. Apologies to Dr. Berliner if I have this all wrong.
If anybody has a collection of master backgammon games on line,
it shouldn't be too hard to duplicate some of Tesauro's work.
(I know, the concept of duplicating experiments is foreign to
computer science.) Let me know if you have such a collection and
would like to collaborate on this. We might need about 500 games.
One of the problems with trying to develop a world-champion
strength backgammon program is that it's hard to know which
of two good backgammon-playing entities is better. If neither makes
any sizeable mistakes, they might have to play thousands of times
before one can say with confidence which one is better.
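A back-of-the-envelope sketch of why "thousands of times" is about right:
if one program truly wins 51% of individual games, how many games until the
observed win rate reliably separates from 50-50? The normal-approximation
formula and function below are my own, assuming independent games of equal
value (no gammons or cube):

```python
from math import ceil

def games_needed(edge, z=1.96):
    """Games required before a true per-game edge (e.g. 0.01 = 51% vs 49%)
    exceeds z standard errors of a fair coin's observed win rate."""
    # The std. dev. of one 50/50 game outcome is 0.5, so the observed win
    # rate has standard error 0.5/sqrt(n); solve edge = z * 0.5 / sqrt(n).
    return ceil((z * 0.5 / edge) ** 2)

print(games_needed(0.01))   # a 51% favourite: nearly 10,000 games
print(games_needed(0.03))   # a 53% favourite is settled much sooner
```

So for two programs separated by a percent or so per game, a match of
thousands of games really is the minimum for a confident verdict.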
-- Bert Enderton
>It has been claimed that there are computer backgammon programs which are
>better than any humans. Is this really the case?
Ok, here's what "Inside Backgammon" has to say about
computer backgammon. Reprinted without permission.
Typos are mine.
"The Gammon Press regularly benchmarks both portable (hand-held) backgammon
computers and backgammon software. Our procedure is to have a world-class
human player play a series of games (usually 100, but less in the case of
portable machines) against the computer program and record the results,
along with any informative comments on useful features, awkward features,
and program eccentricities, and any inability to follow the rules of
the game.
"The chief rating criterion is the program's expected loss per game. For
example, if the program lost 130 points against a human in 100 games, the
expected loss is -1.30 points per game. Obviously, the smaller the average
loss, the stronger the program. The results are summarized in the following
table under "Score".
Program Name Source Type Score Price
~~~~~~~~~~~~ ~~~~~~ ~~~~ ~~~~~ ~~~~~
Championship BG Spinnaker IBM-PC -0.66 $35
Expert Backgammon Komodo Macintosh -0.82 $70
Backgammon Odesta IBM-PC -1.20 $50
Video Gammon Baudville IBM-PC -1.61 $40
Sensory BG Scitek Portable -1.64 $129
Gammon Gakken Portable -12.40 $90
Gammon Pal Fidelity Portable -15.63 $80
Micro BG Fidelity Portable -15.63 $60
"As the table shows, Championship Backgammon represents the state of
the art in backgammon software, and it is, therefore, the only program
we stock. Championship Backgammon was co-authored by Craig Chellstrop,
a world-class player in his own right and winner of the 1989 Reno
tournament. The program can be set for either money or match play, is
easy to use, and plays all types of positions at least plausibly well.
Expert Backgammon scored -0.82 in our computer rankings, making it the
second-best program we have tested. It has the best user interface of
any program and a host of special features including a rollout
capability. The program plays the opening and middle game quite well,
but weakens in the endgame. Available only in a Macintosh version; an
IBM-PC version is under development and is expected in 1992.
"The two hand-held computers in our survey, Sensory Backgammon and
Gammon, are both marketed through the Sharper Image. As the table
shows, neither is a good buy, particularly in view of the prices.
Gammon contains a serious flaw which causes it to double when ahead
in the race, even with checkers trapped behind a full 6-prime! This
results in the loss of some very large cubes. Sensory Backgammon by
Scitek is a more competent effort. Both hand-held machines suffer from
an awkward user interface. When entering moves, it is necessary to
press down on keys representing the "from" and "to" points for each
play. Even a relatively routine game can take 15-20 minutes to play.
"Backgammon by Odesta is a competent product with a nice display. It
contains one excellent feature: a keystroke which tells the program to
finish the game with no further input. This greatly speeds up the play
of boring no-contact positions. However, the program's author was a
programmer who learned how to play for the purpose of writing the
program. His lack of experience shows in his evaluation of the books
he read: he praises 'Backgammon for Blood' as a 'macho, aggressive
text for the high stakes player', while dismissing Magriel's book as
a book for someone who merely wants 'an impressive-looking backgammon
book'. [!!] Not surprisingly, the program plays better on its weakest
setting (when it tries to play a racing game) than on its strongest
setting (when it incorporates some Backgammon_for_Blood-like strategy).
"Video Gammon handles the cube too aggressively to score well, although
it has the best user input function.
"Two new portable computers were reviewed in 1989, Gammon Pal and Micro
BG, both from Fidelity, a top manufacturer of computer chess programs.
As the table shows, these programs are the weakest yet. In fact, the
programs are somewhat worse than the results indicate: a bug in the
software does not permit the doubling cube to rise beyond 64, and about
20% of the program's losses occurred at that level.
QUIZMASTER by Walter Trice
"Quizmaster provides what you need in a computer bearoff program: a
teaching environment to help you perfect your cube play in over-the-
board bearoffs. Suppose, for instance, you've just lost a 16-cube by
making a hasty double in a 3-checker-vs-3-checker bearoff. With
Quizmaster on your IBM-PC, you can instruct the computer to feed you
positions of this type until your mastery level is 100%. In addition,
Quizmaster contains an inquiry feature which allows you to see the
proper cube action for any bearoff position.
"Quizmaster comes complete with a basic library of 216,000 positions
with 5 checkers or less on each side. A position generator lets you
expand this database, limited only by the hard disk of your PC. Quiz-
master is a must for the serious student. Requires IBM-PC or compatible
with hard disk. Costs $100.
BEAROFF 1 & BEARSEARCH by Dean Muench
"Enter money backgammon bearoff positions into your IBM/PC or compatible
computer. Bearoff 1 then computes the following information for the side
on roll:
1. Expectancy in points times cube value.
2. Probability of winning.
3. Is it a double? A take?
4. The best piece play for each dice roll.
"Bearsearch selects and summarizes a subset of bearoff positions based on
any combination of 20 possible parameters. A fine addition to your Bearoff-1
analysis package. Costs: Bearoff-1 - $70. Bearsearch - $25."
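For the curious, the kind of exact calculation Bearoff 1 and Quizmaster
advertise can be sketched in miniature. The code below is my own
illustration (not the products' actual method): it computes the exact
probability that the side on roll bears off first, for tiny positions given
as sorted tuples of occupied home-board points, by recursing over all 21
distinct dice rolls and all legal bear-off plays:

```python
from functools import lru_cache
from itertools import permutations

def moves_for_die(pos, d):
    """Positions reachable from pos (sorted tuple of occupied points,
    1 = ace point) by playing one die of d pips in a pure bearoff."""
    out = set()
    for i, p in enumerate(pos):
        rest = pos[:i] + pos[i + 1:]
        if p == d or (p < d and p == max(pos)):
            out.add(rest)                            # bear the checker off
        elif p > d:
            out.add(tuple(sorted(rest + (p - d,))))  # move it down the board
    return out

def after_roll(pos, dice):
    """Positions reachable by playing the dice in some order,
    skipping a die only when it has no legal play."""
    results = set()
    for order in set(permutations(dice)):
        layer = {pos}
        for d in order:
            nxt = set()
            for q in layer:
                ms = moves_for_die(q, d)
                nxt |= ms if ms else {q}
            layer = nxt
        results |= layer
    return results

@lru_cache(maxsize=None)
def pwin(on_roll, other):
    """Exact P(side on roll bears off all checkers first), best play, no cube."""
    total = 0.0
    for a in range(1, 7):
        for b in range(a, 7):
            dice = (a,) * 4 if a == b else (a, b)
            best = 0.0
            for nxt in after_roll(on_roll, dice):
                if not nxt:
                    best = 1.0                       # every checker off: win
                    break
                best = max(best, 1.0 - pwin(other, nxt))
            total += (1 if a == b else 2) * best
    return total / 36.0

# Two checkers on the 6-point against one on the ace point:
# only an immediate big double bears both off in time.
print(pwin((6, 6), (1,)))
```

Real bearoff databases like Quizmaster's 216,000-position library are built
the same way, just tabulated bottom-up instead of memoized top-down, with
cube decisions layered on top of the win probabilities.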
There you have it. I intend to ask Bill's permission to reprint
some of his articles, analyses and product details, and I'm also
going to try to get him on the net, so we can all benefit from
the input and advice of the world's highest-ranked player.