
Jul 13, 2001, 4:28:43 AM

Reading about the -2 -2 discussions in the archive I find one point

which is wrong in my opinion. Arguments for delaying the double:

> . There are 3 ways to gain equity from

not doubling:

> (1) Opponent loses his market;

> (2) he takes a drop;

> (3) he drops a take.


While I agree with (1) and (3), I don't think you are better off than doubling early in case (2). You will have reached the same position with the cube already turned, so the only equity the wrong take gives back is the equity you lost by "losing your market".

If this holds, a correct "game-theoretical" approach for the taker at 2-away 2-away is "always take", and you will never lose anything through (3).

Jul 13, 2001, 7:40:40 AM

"Always take" is not the correct action all the time at 2-away, 2-away. If

my opponent doubles when my winning chances drop below 30%, or according to

Snowie, 31.5%, then drop is the correct action.


However, "double at your first opportunity" can never be considered an

error. My approach is to double when I am most any kind of favorite, but if

I'm an underdog, I usually leave the cube alone hoping my opponent WILL lose

his market and I can drop if he shoots by 70%.
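Gregg's 30% figure falls out of the match-equity arithmetic. At 2-away 2-away a take makes the cube dead at 2, so the current game decides the match and your equity after taking is simply your game-winning chance; a drop leaves you trailing 2-away against Crawford 1-away, worth roughly 30% (Snowie's 31.5% refines the same idea). A minimal sketch, with the 0.30 drop equity as the only assumed constant:

```python
# Take/drop arithmetic at 2-away, 2-away, using the thread's numbers.
# Assumption: trailing Crawford (2-away vs 1-away) is worth ~30% match
# equity; Snowie's 31.5% would shift the threshold slightly.

DROP_EQUITY = 0.30  # match equity after passing the double

def match_equity_if_take(p_game):
    """After a take the cube is dead at 2, so the game decides the match."""
    return p_game

def best_cube_action(p_game):
    """Return ('take' or 'drop', resulting match equity)."""
    take = match_equity_if_take(p_game)
    if take >= DROP_EQUITY:
        return "take", take
    return "drop", DROP_EQUITY

# A 35% game is a take; a 25% game is a drop.
assert best_cube_action(0.35) == ("take", 0.35)
assert best_cube_action(0.25) == ("drop", 0.30)
```

The whole cube decision reduces to comparing one probability against one constant, which is why the only real argument in this thread is over where that constant sits.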

--

Gregg Cattanach

Zox at GamesGrid, Zone

http://gateway.to/backgammon

gcattana...@prodigy.net

"neghe Onegu" <artur...@yahoo.de> wrote in message

news:58b1388.01071...@posting.google.com...

Jul 13, 2001, 5:47:49 PM

neghe Onegu wrote:

>

> Reading about the -2 -2 discussions in the archive I find one point

> which is wrong in my opinion.


Only one?

There's a lot in those -2/-2 arguments that leave me asking, "But, but,

but, how does that follow?" From the reductionist view, the criterion

for doubling is exactly the same as at other scores -- you want to

maximize your match-winning probability. It seems that at -2/-2, the

method for doing that optimization should be real simple, but I don't

think it actually is. There are still an enormous number of

imponderables. However, it is hard to argue that one should not have a

hair trigger on the cube at that score.

Jive

Jul 14, 2001, 2:22:13 AM

"Gregg Cattanach" <gcattana...@prodigy.net> wrote in message news:<Y6B37.191$c41.52...@newssvr15.news.prodigy.com>...

> "Always take" is not the correct action all the time at 2-away, 2-away. If

> my opponent doubles when my winning chances drop below 30%, or according to

> Snowie, 31.5%, then drop is the correct action.

>

Gregg, you are of course right. But that wasn't my point. If you always take, you can't lose more than your opponent has already given away by not doubling in time. So whatever he is trying to do to you, he does no better than cubing at his first opportunity. This is a theoretical approach. In practice you are willing to drop if he has lost his market. But in some sense you can't lose by taking, while if you drop you might either win (if he lost his market) or lose (if he fooled you into a drop where you should take). So always take is the strategy without risk.

> "Always take" is not the correct action all the time at 2-away, 2-away. If

> my opponent doubles when my winning chances drop below 30%, or according to

> Snowie, 31.5%, then drop is the correct action.

>

alway's take you can't lose more then opponent has already given away

by not doubling in time. So whatever he is trying to do to you he does

not better then cubing at his first opportunity. This is an

theretical approach. In practice you are willing to drop if he has

lost his market. But in some sense you can't lose by taking, while if

you drop you might either win ( if he lost his market) or lose ( if he

fooled you into a drop where you should take). So alway's take is the

strategie without risk.

If you play cat and mouse make sure you are the cat.

Greetings Artur

Jul 14, 2001, 6:21:27 AM

Your argument is interesting, but your conclusion "So always take is the

strategy without risk" is still incorrect. If I am doubled in the 2-away,

2-away game, I must ALWAYS decide if I win the current game less than 30% of

the time, and if this is true I must drop. The risk is I'm playing a game

with less than 30% winning chances when I should be playing the next 1 or 2

games instead. However, if I want to avoid this problem, then "always

DOUBLE (not always take)" is the strategy without risk.

strategy without risk" is still incorrect. If I am doubled in the 2-away,

2-away game, I must ALWAYS decide if I win the current game less than 30% of

the time, and if this is true I must drop. The risk is I'm playing a game

with less than 30% winning chances when I should be playing the next 1 or 2

games instead. However, if I want to avoid this problem, then "always

DOUBLE (not always take)" is the strategy without risk.

My preference is to let my opponent risk overshooting his market when he is the initial favorite in the game, and not to double while I'm the underdog, unless I have some nice joker that would suddenly make me the >70% favorite.

Gregg

"neghe Onegu" <artur...@yahoo.de> wrote in message

news:58b1388.01071...@posting.google.com...

Jul 14, 2001, 12:58:31 PM

Are we talking about the EARLY double at -2 v -2?

In this case the "always take without risk" is ALWAYS true because at the

early double there is NO 30% winning chance to consider....... end of

argument!

--

spurs


Roy Passfield @ Oxnard, California

http://www.dock.net/spurs

"Making a living is NOT the same as making a life"

(Roy Passfield, 1999)

"Gregg Cattanach" <gcattana...@prodigy.net> wrote in message

news:H2V37.137$1M3.36...@newssvr15.news.prodigy.com...

Jul 14, 2001, 6:11:15 PM

Sorry, I didn't read anything about early or late, the statement that I read

was "ALWAYS take is the strategy with no risk."

was "ALWAYS take is the strategy with no risk."

As far as 'early' try this game:

W: 31 8/5 6/5

B: 63 24/18 13/10

W: 66 24/18(2) 13/7*(2)

B: 65 fans.

W: Doubles

If you are doubled here "early", you should drop. Technically, White is too

good to double, (by a tiny margin), but practically should double, as many

players with the Black checkers will take anyway. Also, White made a

tactical error in not doubling before his second shake (as a substantial

favorite with market losers), when Black should have properly taken.

Gregg

"spurs" <sp...@dock.net> wrote in message

news:XS_37.1199$Ka2.4...@monger.newsread.com...

Jul 14, 2001, 9:04:36 PM

Joe Munger <sdfajk...@fasdjkljklfasd.invalid> wrote in message

news:3B4EC36A...@fasdjkljklfasd.invalid...

> neghe Onegu wrote:

> > Reading about the -2 -2 discussions in the archive I find one point

> > which is wrong in my opinion.

>

> Only one?

> There's a lot in those -2/-2 arguments that leave me asking, "But, but,

> but, how does that follow?" From the reductionist view, the criterion

> for doubling is exactly the same as at other scores -- you want to

> maximize your match-winning probability. It seems that at -2/-2, the

> method for doing that optimization should be real simple, but I don't

> think it actually is...

> However, it is hard to argue that one should not have a

> hair trigger on the cube at that score.

> Jive


Against a perfect opponent, double at the first sign of a market

loser. That pretty much characterizes a hair trigger.

The optimization is: If your opponent is more likely than you

are to err in his cube handling, i.e. overshoot his doubling point,

then you should wait. (Wait until when? is the question.)

If the converse (you are more likely to err), then you should

simply double on your first roll.

---

Paul T.

Jul 15, 2001, 3:36:42 AM

Paul Tanenbaum wrote:

>

>

> Against a perfect opponent, double at the first sign of a market

> loser.


I've heard that put forward, but without what I would consider a proof.

Does "the first sign of a market loser" include a 1295 to 1 shot? If

not, where do you draw the line, and how do you prove that that's where

the line should be drawn?

If there is a simple rule, I don't think there's a simple proof of the

rule!

J.

Jul 15, 2001, 1:35:50 PM

"Gregg Cattanach" <gcattana...@prodigy.net> wrote in message

news:7s347.99$gt6.25...@newssvr15.news.prodigy.com...

> Sorry, I didn't read anything about early or late, the statement that I read

> was "ALWAYS take is the strategy with no risk."

>

> As far as 'early' try this game:

> W: 31 8/5 6/5

> B: 63 24/18 13/10

> W: 66 24/18(2) 13/7*(2)

> B: 65 fans.

> W: Doubles

>

> If you are doubled here "early", you should drop. Technically, White is

too

> good to double, (by a tiny margin), but practically should double, as many

> players with the Black checkers will take anyway. Also, White made a

> tactical error in not doubling before his second shake (as a substantial

> favorite with market losers), when Black should have properly taken.

>

> Gregg

"technically" white should double after black's opening reply.....

and as for "early" doubles...... snowie (and the fish!) doubles at move 1

against any but THE very best openers!

Jul 16, 2001, 4:19:30 AM

I agree with doubling at the first market loser being the right strategy. But if you don't get one in the game you may delay the double. To make my no-risk argument clearer, suppose there are three players A, B, C.

A always doubles at his first roll.

B doubles at his first market loser and always takes if opponent

doubles

C doubles at his first market loser and decides about the take if

doubled

If A is a strategy without risk, so is B: if at some position x your opponent doubles and you take, you are in position x with a 2-cube, just as you would have been under A. (Assuming the state of the cube doesn't affect the checker play.)

With strategy C, being doubled at x and taking is the same as A and B. The difference occurs if you drop. There are two cases: either the drop is right and you win, or the drop is wrong and you lose. So if you drop you have the risk of losing something (and the chance of gaining something) which you don't have with A or B. That's the way B is "without risk" and of course "without gain".

But what's the point of B, you might ask. Well, if your opponent simply forgets to double, you are better off than with A. That probably won't happen often enough to be a big consideration. But what you can do in practice is delay the double and drop only if it's perfectly clear to you.

When in doubt, take.

You can't give away more than your opponent has already lost by not doubling in time.
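Artur's comparison can be written out directly. Under the thread's assumptions (a taken or turned cube at -2,-2 is dead, so the game decides the match, and a drop is worth about 30% match equity), B realizes exactly the same equity as A in any doubled position x, and C differs from B only when it drops. A minimal sketch; the win probabilities are purely illustrative:

```python
# Equity each of Artur's strategies realizes when the cube is turned in a
# position where your game-winning chance is p. Thread assumptions: a
# taken cube is dead at 2 (equity after take = p); a drop is worth ~0.30.

DROP_EQUITY = 0.30

def strategy_A(p):
    """A doubled on roll 1, so the game simply decides the match."""
    return p

def strategy_B(p):
    """B always takes: same position x, same dead 2-cube as under A."""
    return p

def strategy_C(p, drops):
    """C judges the take, and may drop rightly or wrongly."""
    return DROP_EQUITY if drops else p

for p in (0.20, 0.40, 0.60):
    assert strategy_B(p) == strategy_A(p)  # B can never do worse than A

# C gains over A/B only by dropping with p below 0.30 ...
assert strategy_C(0.20, drops=True) > strategy_B(0.20)
# ... and loses by being "fooled into a drop where you should take":
assert strategy_C(0.40, drops=True) < strategy_B(0.40)
```

This is exactly the sense in which B is "without risk and without gain": its payoff is identical to A's everywhere, while C's payoff straddles it.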

Play lucky and skilful

Artur

Jul 17, 2001, 11:04:46 PM

"Jive Dadson" <jda...@ix.netcom.com> wrote in message

news:3B509EE5...@ix.netcom.com...

I believe there is a simple proof of a slightly more general proposition:

against an opponent who handles the cube perfectly, the strategy of

[doubling if there is any chance of losing your market] is optimal.

Proof in two stages:

First, if your opponent handles the cube perfectly, then your chance of

winning the 2 point match is no greater than your chance of winning a 1

point match. The reason is that your opponent can double at his first

opportunity, making your match winning probability equal to your chance of

winning a 1 point match.

Second, by doubling before you lose your market you ensure that your match

winning probability is *at least* equal to your chance of winning a 1 point

match.

QED
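Both bounds can be illustrated in a toy model. Assume (these are modeling assumptions, not backgammon values) that your game-winning chance does a plus-or-minus 15% random walk, that a turned cube is dead at 2, that leading/trailing Crawford is worth 70%/30%, and that the opponent handles the cube perfectly: he doubles at his first market loser and drops your too-late doubles. Doubling immediately realizes your 1-point-match chance exactly; waiting past a market loser does worse:

```python
import random

# Toy -2,-2 model: your game-winning chance (in percent) does a +/-15
# random walk. Assumptions: a turned cube is dead at 2, so the game
# decides the match; leading Crawford = 70% match equity, trailing = 30%.
STEP, LEAD_EQ = 15, 0.70

def play_to_end(p, rng):
    """Cube already turned: walk p to <=0 or >=100; True = match win."""
    while 0 < p < 100:
        p += rng.choice((STEP, -STEP))
    return p >= 100

def double_immediately(rng):
    """Turn the cube on roll 1: your match chance is your game chance."""
    return play_to_end(50, rng)

def wait_too_long(rng):
    """Hold the cube until p >= 80 (past the market loser at 65), while a
    perfect opponent doubles at *his* first market loser (your p = 35)."""
    p = 50
    while True:
        if p >= 80:                        # you finally double...
            return rng.random() < LEAD_EQ  # ...he correctly drops: 70%
        if p <= 35:                        # he doubles in time; you take
            return play_to_end(p, rng)
        p += rng.choice((STEP, -STEP))

rng = random.Random(2001)
N = 100_000
now = sum(double_immediately(rng) for _ in range(N)) / N
late = sum(wait_too_long(rng) for _ in range(N)) / N
assert abs(now - 0.50) < 0.01  # doubling early secures the 1-pt-match chance
assert late < now              # waiting past a market loser leaks equity
```

The waiter's losses come from exactly one source: games where he cashes for 0.70 after the walk has overshot to 0.80, which is Walter's "market loss" in miniature.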

If there's a flaw in this argument, I'd certainly be interested in knowing

what it is :-)

-- Walter Trice

Jul 18, 2001, 6:56:10 AM

Walter Trice <wa...@worldnet.att.net> wrote in message

news:i1757.42256$C81.3...@bgtnsc04-news.ops.worldnet.att.net...


> > >Against a perfect opponent, double at the first sign of a market

> > > loser.

> >

> > I've heard that put forward, but without what I would consider a proof.

> > Does "the first sign of a market loser" include a 1295 to 1 shot? If

> > not, where do you draw the line, and how do you prove that that's where

> > the line should be drawn?

> > If there is a simple rule, I don't think there's a simple proof of the

> > rule!


The rule is simple, but not the proof, as evidenced by the

consternation the issue generates.

Even Kit Woolsey admits he didn't grasp it when first presented.

> I believe there is a simple proof of a slightly more general proposition:

> against an opponent who handles the cube perfectly, the strategy [of

> doubling if there is any chance of losing your market] is optimal.

> Proof in two stages:

> First, if your opponent handles the cube perfectly, then

> your chance of winning the 2 point match is no greater

> than your chance of winning a 1 point match. The

> reason is that your opponent can double at his first

> opportunity, making your match winning probability

> equal to your chance of winning a 1 point match.

> Second, by doubling before you lose your market you

> ensure that your match winning probability is *at least*

> equal to your chance of winning a 1 point match.

> QED

Right. In other words, if the opponent follows the optimal

strategy, he always wins 2 points whenever he wins the game.

Whereas, if you do not follow that strategy, some of your wins

will be only one point, which is clearly sub-optimal.

It's a game theory argument - I know that he knows that I

know that he knows... So you need to take into account

worst case behavior by the opponent, i.e. assume he follows

his most effective strategy. That would be when he always

doubles at any market loser.

How do you know he will follow this strategy? Because he's

as smart as you are, and he knows that you know all this stuff,

and he's afraid that YOU might follow optimal strategy, so to

protect himself, he does the same. Therefore, to protect

yourself, you do also.

Simple, huh?

> If there's a flaw in this argument, I'd certainly be interested

> in knowing what it is :-)

There is one non-critical flaw, which to my knowledge has never

been recognized.

As JD pointed out, when searching for 'market losers', one must

look at all 36 x 36 2-roll sequences. Now imagine X on roll in

an extremely volatile position, where he has some market losers.

Optimal strategy dictates that he double. But let's say that some

of his rolls immediately leave O in a position to double, such that

X must drop. Call these 'anti-market losers'. Clearly, if X

doubles, then rolls one of these, he regrets doubling.

So in this situation, when X considers doubling, X must compare

the equity gain of his market losers, against the equity loss of his

anti-market losers.
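Paul's comparison can be made concrete with invented numbers. Suppose a hypothetical volatile position where 30 of X's 36 rolls barely lose his market while 6 are disasters, and use the thread's 70/30 Crawford equities. Every number below is illustrative, not a rolled-out position:

```python
# Hypothetical volatile position (all numbers invented for illustration):
# X is on roll with the cube centered, and after X's roll his game-winning
# chance becomes one of these, with the given counts out of 36:
outcomes = [(30, 0.72),  # barely past the 0.70 cash point: market losers
            (6, 0.05)]   # disasters where X must drop: anti-market losers

LEAD_EQ, TRAIL_EQ = 0.70, 0.30  # the thread's Crawford match equities

# Double now: the cube is dead at 2, so each outcome is worth its game chance.
eq_double = sum(n * p for n, p in outcomes) / 36

# Wait one roll: market losers are capped at 0.70 (O drops X's next double),
# anti-market losers are floored at 0.30 (X drops O's double).
def waited(p):
    return min(max(p, TRAIL_EQ), LEAD_EQ)

eq_wait = sum(n * waited(p) for n, p in outcomes) / 36

# Marginal market losers plus severe anti-market losers: waiting wins here,
# even though X is the favorite and has 30 market-losing rolls.
assert eq_wait > eq_double
```

The marginal market losers cost only 0.02 each by waiting, while each anti-market loser recovers 0.25 by keeping the drop available, which is why the comparison can tip against the "hair trigger" double.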

However, if both players are actually playing the optimal strategy,

this situation should never arise - there had to be some error, i.e.

deviation from optimal, earlier. So theoretically it doesn't shoot

down the optimal strategy argument.

Anyway, such volatile positions must be very rare.

More interesting is the fact that in practice, players often do not

play this way. Either they assume their opponent doesn't get it,

or they assume the opponent believes that they don't get it. Then

it becomes a kind of cat and mouse game, a variant of the prisoner's

dilemma problem, with no easy solution.

---

Paul T.

Jul 18, 2001, 4:59:09 PM

Paul Tanenbaum wrote:

> Walter Trice <wa...@worldnet.att.net> wrote in message

> news:i1757.42256$C81.3...@bgtnsc04-news.ops.worldnet.att.net...

> > "Jive Dadson" <jda...@ix.netcom.com> wrote in message

> > news:3B509EE5...@ix.netcom.com...

> > > >Against a perfect opponent, double at the first sign of a market

> > > > loser.

> > >

> > > I've heard that put forward, but without what I would consider a proof.

> > > Does "the first sign of a market loser" include a 1295 to 1 shot? If

> > > not, where do you draw the line, and how do you prove that that's where

> > > the line should be drawn?

> > > If there is a simple rule, I don't think there's a simple proof of the

> > > rule!

>

> The rule is simple, but not the proof, as evidenced by the

> consternation the issue generates.

> Even Kit Woolsey admits he didn't grasp it when first presented.

>

> > I believe there is a simple proof of a slightly more general proposition:

> > against an opponent who handles the cube perfectly, the strategy [of

> > doubling if there is any chance of losing your market] is optimal.

Actually, I'm not sure I agree with the more general proposition. I'd like to

see a definition of "perfect cube handling" and "any chance of losing your

market" first. Should you double if there is a chance that your opponent's

checker misplay will cause you to lose your market, but only because you are

known to play badly while leading Crawford 2-away?

> > Proof in two stages:

> > First, if your opponent handles the cube perfectly, then

> > your chance of winning the 2 point match is no greater

> > than your chance of winning a 1 point match. The

> > reason is that your opponent can double at his first

> > opportunity, making your match winning probability

> > equal to your chance of winning a 1 point match.

One implicit assumption is that there is no market-losing sequence of two

initial rolls. That's not a priori obvious, but no reasonable plays seem to

come close. What about 3-1 8/5 6/5 followed by 4-4 6/2(4)? I'd pass.

> > Second, by doubling before you lose your market you

> > ensure that your match winning probability is *at least*

> > equal to your chance of winning a 1 point match.

> > QED

>

> Right. In other words, if the opponent follows the optimal

> strategy, he always wins 2 points whenever he wins the game.

> Whereas, if you do not follow that strategy, some of your wins

> will be only one point, which is clearly sub-optimal.

It's not immediately clear, since some of those 1-point wins could have been

turned around. It is important that the match equity table be correct.

I don't think the following terminology is standard:

A strategy is semi-perfect if it wins at least half of the time when playing

any other strategy.

Then always doubling at the first opportunity is semi-perfect (combined with

perfect DMP checker play).

Always doubling whenever there is a 1295:1 chance to lose one's market is

semi-perfect (combined with perfect DMP checker play).

Always taking, combined with either of the above, is semi-perfect.

These are not perfect in that they do not extract the most out of an opponent's

errors, but they will win 50% of the time against each other or against perfect

play.

If one's checker play is not perfect (or semi-perfect?), then I don't know what

doubling strategy is correct.

> [...]

>

> > If there's a flaw in this argument, I'd certainly be interested

> > in knowing what it is :-)

>

> There is one non-critical flaw, which to my knowledge has never

> been recognized.

Actually, it's in the rec.games.backgammon archive

http://www.bkgm.com/rgb/rgb.cgi?view+488 , and has also been discussed on the

Gammonline bulletin board.

> [...]

> More interesting is the fact that in practice, players often do not

> play this way. Either they assume their opponent doesn't get it,

> or they assume the opponent believes that they don't get it. Then

> it becomes a kind of cat and mouse game, a variant of the prisoner's

> dilemma problem, with no easy solution.

No, the prisoner's dilemma is quite different. Backgammon (in this context) is

0-sum. Your gain is your opponent's loss. In the prisoner's dilemma, your gain

imposes a larger loss upon your opponent and vice versa, so that both could be

better off from cooperating.

Douglas Zare

Jul 19, 2001, 11:10:29 AM

"Douglas Zare" <za...@math.columbia.edu> wrote in message

news:3B55F89D...@math.columbia.edu...

Def'n of perfect cube handling: always makes the double and take decisions

that maximize match winning chance.

Any chance of losing your market: yes, that would include losing your market

as a result of opponent's bad checker play. Yes, even if you could only lose

your market because you would play badly in the Crawford game.

>

> > > Proof in two stages:

> > > First, if your opponent handles the cube perfectly, then

> > > your chance of winning the 2 point match is no greater

> > > than your chance of winning a 1 point match. The

> > > reason is that your opponent can double at his first

> > > opportunity, making your match winning probability

> > > equal to your chance of winning a 1 point match.

>

> One implicit assumption is that there is no market-losing sequence of two
> initial rolls. That's not a priori obvious, but no reasonable plays seem to
> come close. What about 3-1 8/5 6/5 followed by 4-4 6/2(4)? I'd pass.

Yes, that is an implicit assumption. Whether the assumption is met in all

practical circumstances is a side issue.

I'm still interested in how to get around the simple argument that by

adopting a strategy of doubling before market loss Player B can limit A's

chance to X, while by adopting the same strategy A can ensure that his

chance is X, X being his chance of winning a 1 point match. It seems to me

that this works even with pathological assumptions about checker play,

relative skill, and match equities at the C/2-away scores.

--walter trice

Jul 23, 2001, 6:16:08 AM

> Paul Tanenbaum wrote:

> > Walter Trice <wa...@worldnet.att.net> wrote in message

> > news:i1757.42256$C81.3...@bgtnsc04-news.ops.worldnet.att.net...


> > > > > Against a perfect opponent, double at the first sign of a market
> > > > > loser.
> > > >
> > > > I've heard that put forward, but without what I would consider a proof.
> > > > Does "the first sign of a market loser" include a 1295 to 1 shot? ...
> > > > If there is a simple rule, I don't think there's a simple proof of the
> > > > rule!

> >

> > The rule is simple, but not the proof, as evidenced by the
> > consternation the issue generates.
> >

> > > I believe there is a simple proof of a slightly more general proposition:

> > > against an opponent who handles the cube perfectly, the strategy [of

> > > doubling if there is any chance of losing your market] is optimal.

> > > Proof in two stages:

> > > First, if your opponent handles the cube perfectly, then

> > > your chance of winning the 2 point match is no greater

> > > than your chance of winning a 1 point match. The

> > > reason is that your opponent can double at his first

> > > opportunity, making your match winning probability

> > > equal to your chance of winning a 1 point match.

>

> One implicit assumption is that there is no market-losing sequence of two

> initial rolls. That's not a priori obvious, but no reasonable plays seem to

> come close. What about 3-1 8/5 6/5 followed by 4-4 6/2(4)? I'd pass.


If I were the opener, I'd play on as too good to double.

A better example to make your point would be opening 31,

followed by opponent's 33, played {24/18, 24/21, 6/3}. I

don't think that's too good.

> > > Second, by doubling before you lose your market you

> > > ensure that your match winning probability is *at least*

> > > equal to your chance of winning a 1 point match.

> > > QED

> >

> > Right. In other words, if the opponent follows the optimal

> > strategy, he always wins 2 points whenever he wins the game.

> > Whereas, if you do not follow that strategy, some of your wins

> > will be only one point, which is clearly sub-optimal.

>

> It's not immediately clear, since some of those 1-point wins could have been

> turned around. It is important that the match equity table be correct.

Of course. The match equity table determines the optimal

doubling point, which in turn defines the 'market loser' concept.

What I think you mean, is that the standard match equity

table(s) be applicable for all players, which cannot be true. So

to be precise, we need to construct a distinct table for every

pair of players.

No big deal, eh?

> A strategy is semi-perfect if it wins at least half of the time when playing

> any other strategy.

>

> Then always doubling at the first opportunity is semiperfect (combined with

> perfect DMP checker play).

> Always doubling whenever there is a 1295:1 chance to lose one's market is

> semi-perfect (combined with perfect DMP checker play).

> Always taking, combined with either of the above, is semi-perfect.

>

> These are not perfect in that they do not extract the most out of an opponent's

> errors, but they will win 50% of the time against each other or against perfect

> play.

> If one's checker play is not perfect (or semi-perfect?), then I don't know what

> doubling strategy is correct.

You merely adjust the match equity table to account for one's errors.

> > [...]

> >

> > > If there's a flaw in this argument, I'd certainly be interested

> > > in knowing what it is :-)

> >

> > There is one non-critical flaw, which to my knowledge has never

> > been recognized...

>

> Actually, it's in the rec.games.backgammon archive

> http://www.bkgm.com/rgb/rgb.cgi?view+488 , and has also

> been discussed on the Gammonline bulletin board.

I stand corrected.

> > [...]

> > More interesting is the fact that in practice, players often do not

> > play this way. Either they assume their opponent doesn't get it,

> > or they assume the opponent believes that they don't get it. Then

> > it becomes a kind of cat and mouse game, a variant of the prisoner's

> > dilemma problem, with no easy solution.

>

> No, the prisoner's dilemma is quite different. Backgammon is

> 0-sum. Your gain is your opponent's loss. In the prisoner's

> dilemma, your gain imposes a larger loss upon your

> opponent and vice versa, so that both could be better off

> from cooperating.

I see, the prisoner's dilemma cannot apply to zero-sum games.

But there is some analogy, in that the correct strategy

depends on what you think the other guy will do.

Here is an example, from a recent Gammonvillage article:

Match to 13; X:11 O: 11

X to play (6 3)

+24-23-22-21-20-19-------18-17-16-15-14-13-+

| O O O O O | | X X |

| O O O O O | | X X |

| O O O O | | |

| | | | ---

| | | | | 1 |

| | | | ---

| | | |

| | | |

| X | | |

| X X X X | | |

| X X X X X X | | O |

--------------------------------------------

+-1--2--3--4--5--6--------7--8--9-10-11-12-+

From the Michigan Summer Champs final.

Both players experts. It's unlikely they didn't know

optimal strategy. How did they get to this point without

doubling? It's the mind game again... I won't double

immediately if you won't...

---

Paul T.

Jul 24, 2001, 3:36:02 PM

The common wisdom is that you can never lose by doubling immediately

at 2 away 2 away. Even if you have waited and are a moderate underdog

in the game, you are probably better off turning the cube asap. Here

is why:


If you continue to "lose ground" and you approach the 75-25 underdog

status that would make a pass as good as a take, your opponent may try

to steal some equity from you by doubling when he is (actually) only a

70-30 favorite.

If you are VERY confident you can tell the difference, OK - but I saw

a pretty good player pass one of these when his position was well

worth a take.

At this very point, if he had doubled on his first roll, he would now

be 30% to win the match. By letting himself be bluffed out, he starts

a new game, and is 25% to win the match.

Be careful when you are matched against a top expert and he holds off

doubling.

dk

Jul 25, 2001, 4:00:23 AM

If I understand correctly, your considerations are correct from a

general point of view (be careful when playing experts, wait to double

if you hold the cube better than the opponent) but wrong from a

technical point of view.


Actually the break-even point for a take at 2-away 2-away is not 25% but more than that, because of the possibility of gammoning the opponent in the Crawford game.

I believe Snowie evaluates the take point at 30.5%; usually 30% is a good estimate.
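Carlo's correction is easy to reproduce. After a drop you trail 2-away against Crawford 1-away; in the Crawford game a single win only takes you to a 50-50 DMP game, while a gammon wins the match outright, so the trailer's gammon rate lifts the drop equity (and with it the take point) from 25% toward 30%. A sketch, with the 50% Crawford-game win rate and the gammon rates as my assumed inputs:

```python
# Take point at 2-away, 2-away as a function of the trailer's gammon rate
# in the Crawford game. Assumptions: the trailer wins the Crawford game
# half the time, and the post-Crawford DMP game is 50-50.

def drop_equity(gammon_rate, crawford_win=0.5):
    """Match equity trailing 2-away vs Crawford 1-away."""
    win_by_gammon = gammon_rate              # gammon: match over now
    win_single = crawford_win - gammon_rate  # single win: DMP at 50%
    return win_by_gammon * 1.0 + win_single * 0.5

# Taking makes the cube dead, so the break-even take point is wherever
# your game chance equals the drop equity:
assert abs(drop_equity(0.00) - 0.25) < 1e-9   # no gammons: the naive 25%
assert abs(drop_equity(0.10) - 0.30) < 1e-9   # ~10% gammons: the usual 30%
assert abs(drop_equity(0.11) - 0.305) < 1e-9  # ~11%: Snowie's 30.5%
```

So the difference between 25% and Snowie's 30.5% corresponds to a gammon rate of about 11% for the Crawford-game trailer, under these assumptions.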

If I misunderstood your message then forget my follow-up.

Best regards,

Carlo Melzi

Donald Kahn <don...@easynet.co.uk> wrote in message news:<joirltgc0l20ntiog...@4ax.com>...

Jul 25, 2001, 3:23:42 PM

On 25 Jul 2001 01:00:23 -0700, cam...@tin.it (Carlo Melzi) wrote:

>If I understand correctly, your considerations are correct from a

>general point of view (be careful when playing experts, wait to double

>if you hold the cube better than the opponent) but wrong from a

>technical point of view.

>

>Actually the break-even point for a take at 2aw-2aw is not 25% but

>more than that because of the possibility to gammon the opponent at

>Crawford game.

>I believe Snowie evaluates the take point at 30.5%, usually 30% is a

>good esteem.

>

>If I misunderstood your message then forget my follow-up.

>

>Best regards,

>Carlo Melzi

>

You are quite right and I goofed. He can pass and have 30.5% in the

match, but if, in the -2 -2 game, he "really" has 33% when doubled

yet misjudges it as less than 30% and passes the double, he has

been victimized.
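The size of the swindle can be put in numbers. A hedged sketch in Python
(`DROP_EQUITY`, `take_equity`, and `best_action` are illustrative names;
30% is used for the drop equity rather than Snowie's 30.5%):

```python
# Match equity at -2:-2 when doubled, as a function of the true
# game-winning probability p. Taking plays the game out for the whole
# match; passing leaves the trailer's pre-Crawford equity.
DROP_EQUITY = 0.30  # assumed round figure; Snowie says ~30.5%

def take_equity(p):
    # At -2:-2 a take makes the game worth the match, so match equity = p.
    return p

def best_action(p):
    return "take" if take_equity(p) > DROP_EQUITY else "pass"

p = 0.33  # the bluffed player's true winning chances
print(best_action(p))                                          # take
print(f"cost of passing: {take_equity(p) - DROP_EQUITY:.1%}")  # 3.0%
```

So being bluffed out of a 33% take costs about 3% match-winning chances,
which is the "victimized" margin described above.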

Thanks for the correction.

dk

Jul 25, 2001, 8:07:18 PM

to

Paul Tanenbaum wrote:

> Douglas Zare <za...@math.columbia.edu> wrote in message news:<3B55F89D...@math.columbia.edu>...

> > Paul Tanenbaum wrote:

> > > Walter Trice <wa...@worldnet.att.net> wrote in message

> > > news:i1757.42256$C81.3...@bgtnsc04-news.ops.worldnet.att.net...

> > > > > >Against a perfect opponent, double at the first sign of a market

> > > > > > loser.

>

> > One implicit assumption is that there is no market-losing sequence of two

> > initial rolls. That's not a priori obvious, but no reasonable plays seem to

> > come close. What about 3-1 8/5 6/5 followed by 4-4 6/2(4)? I'd pass.

>

> If I were the opener, I'd play on as too good to double.

Snowie evaluations say double/pass, but it is close. Rollouts indicate that it is too good to

double/pass, a situation which never occurs under optimal play. This indicates that suboptimal checker

play increases the complexity of -2:-2.

> A better example to make your point would be opening 31,

> followed by opponent's 33, played {24/18, 24/21, 6/3}. I

> don't think that's too good.

That is still an easy take.

Still, the proposition under discussion is whether one should double with any market losers against a

perfect opponent, who would not make either of these responses to 3-1. Should you double if you have a

1/1296 market loser, but are more likely to err so that you get to pass next turn?

> > > > Second, by doubling before you lose your market you

> > > > ensure that your match winning probability is *at least*

> > > > equal to your chance of winning a 1 point match.

> > > > QED

> > >

> > > Right. In other words, if the opponent follows the optimal

> > > strategy, he always wins 2 points whenever he wins the game.

> > > Whereas, if you do not follow that strategy, some of your wins

> > > will be only one point, which is clearly sub-optimal.

> >

> > It's not immediately clear, since some of those 1-point wins could have been

> > turned around. It is important that the match equity table be correct.

>

> Of course. The match equity table determines the optimal

> doubling point, which in turn defines the 'market loser' concept.

>

> What I think you mean, is that the standard match equity

> table(s) be applicable for all players, which cannot be true. So

> to be precise, we need to construct a distinct table for every

> pair of players.

> No big deal, eh?

I'm not sure what you said in the summary of what you think I mean. What I mean is that one can take

advantage of someone whose match equity table is wrong. Your opponent might have perfect DMP checker

play, never lose their market, and still win fewer than 50% of the matches from -2:-2 because they don't

take when they should. If they take everything, then they will always win at least 50%, even if some of

the takes are technical errors. To be safe if you aren't confident of your match equity table (but have

perfect DMP checker play), always take, and your errors will be smaller than your opponent's.

I don't think that's obvious, or in the archives. In fact, many people seem to believe that always

taking is not semiperfect, that one can extract equity from someone's bad takes at -2:-2. (Assuming

perfect checker play, in some sense one can, but only by making worse errors on average. It's not worth

it to look for bad takes.)

What is the precise statement for imperfect play versus perfect play, and what is the proof? Do you

claim that against anyone who plays semiperfectly, the entry in your personal match equity table for

-2:-2 should be the same as for DMP? Be careful that many of the terms being used have multiple possible

definitions.

Douglas Zare
