
Dec 9, 1996, 3:00:00 AM

Would some experts like to comment on this -2 -2 game? Neither of us

doubled immediately, and it took some time for a doubling situation to

arise, as far as I can see.

I don't double immediately at -2 -2 as I want to exploit any error my

opponent might make by not doubling at this score when he gains an

advantage.

I'm X. I haven't re-played this match, and perhaps O should have doubled

before I did, but I wasn't happy doubling until I did.

Score is 3-3 in a 5 point match.

X: (4 1) 12-16 1-2

O: (4 1) 6-2 2-1

X: (2 4) bar-4 bar-2

O: (2 2) 6-4 6-4 24-22 22-20

X: (3 2) bar-2 17-20

O: (2 6) bar-23 24-18

X: (2 2) 16-18 18-20 19-21 19-21

O: (2 3) bar-23 13-10

X: (5 2) 12-17 12-14

O: (3 1) 10-7 8-7

X: (6 5) 14-20 17-22

O: (4 5) 13-8 13-9

X: (4 6) 12-18 18-22

O: (6 2) 13-7 13-11

X: (3 4) 2-5 5-9

O: (1 2) bar-24 11-9

X: (3 3) bar-3 2-5 19-22 12-15

O: (5 2) 7-5 8-3

X: doubles

O: accepts

[snip]

X won.

--

_ N : E : T : A : D : E : L : I : C : A

James Eibisch ('v') -- http://www.revolver.demon.co.uk --

Reading, U.K. (,_,) -- Now showing: Invaders 1978: --

======= a faithful version of Taito's original

Dec 9, 1996, 3:00:00 AM

James Eibisch wrote:

>

> Would some experts like to comment on this -2 -2 game? Neither of us

> doubled immediately, and it took some time for a doubling situation to

> arise, as far as I can see.

>

> I don't double immediately at -2 -2 as I want to exploit any error my

> opponent might make by not doubling at this score when he gains an

> advantage.

>

> I'm X. I haven't re-played this match, and perhaps O should have doubled

> before I did, but I wasn't happy doubling until I did.

A fascinating game at this score, James. I often wonder about similar

situations. I went through it with the aid of Jellyfish to see what was

going on.

First, I'll risk boring those who have been through the endless

discussions on this with a summary of the theory behind doubling at this

match score:

If the score reaches 1-away, 2-away (Crawford), the trailer's match-winning

chances are generally agreed to be 30%, assuming equal players.

Therefore in order to justify taking you need to have a 30% chance to

win the game (and there's no extra penalty for being gammoned).
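The 30% figure is just the standard risk/gain take-point formula applied to the match equities at this score. A minimal sketch (the `take_point` helper is written for this post, not from any library; the 30% Crawford figure is the one quoted above):

```python
# Risk/gain take-point arithmetic at 2-away, 2-away (3-3 in a 5-point match).

def take_point(me_drop, me_take_win, me_take_lose):
    """Minimum game-winning chance p justifying a take: solve
    p * me_take_win + (1 - p) * me_take_lose = me_drop."""
    return (me_drop - me_take_lose) / (me_take_win - me_take_lose)

# Dropping leaves 1-away, 2-away Crawford: 30% match equity for the trailer.
# Taking puts the cube on 2, so the game decides the match outright.
print(take_point(me_drop=0.30, me_take_win=1.0, me_take_lose=0.0))  # 0.3
```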

In the cases where either player doubles and the other takes, the cube

is dead. Therefore, in all of those cases it really doesn't matter who

doubled, since no one gets any value from redoubling.

Suppose X doubles and O passes. Assuming it's truly a drop, then X

"lost his market". This means X would have been better off having

doubled earlier when O still had a take, so X made a mistake by not

doubling. But, you may ask, suppose X had rolled poorly instead of

well; wouldn't he have been glad he didn't double? No, the theory goes,

because then O would double at a point where X still had a take, so it

wouldn't matter if X had doubled first!

So, technically, if you know your opponent will never lose his market,

then it's never correct to risk losing your market by even a little.

Therefore, you may as well double at your first opportunity and get it

over with.

However, this doesn't take into account the fact that quite often we

play actual human beings (as opposed to computers), and they don't

always judge the situation correctly. As James said, he wanted to give

his opponent a chance to make an error, and this is usually a good

idea! If you double immediately (as Jellyfish does, even as a huge

underdog), your opponent has zero chance to make an error. If you wait

til you are closer to his take point, at some risk of losing your

market, he may erroneously pass, or even if you do lose your market, he

may erroneously take, thus turning your error into a great play.

So let's see what happened in the match. In all cases I looked at what

I thought was the best possible sequence for the player on roll to see

if he could possibly lose his market, using Jellyfish's Level 7

evaluations.

> Score is 3-3 in a 5 point match.

>

> X: (4 1) 12-16 1-2

With X's back men split, can O possibly lose his market? According to

Jellyfish, if O rolls 44 and X fans, X still has 32% winning chances!

Therefore there's no reason to double here.

> O: (4 1) 6-2 2-1

O's best: 11, coming in with both checkers, hitting, and making his 5

point. O then has 30.3 winning chances, still a take.

> X: (2 4) bar-4 bar-2

Best: O 44 and X fans; X wins 32%.

> O: (2 2) 6-4 6-4 24-22 22-20

Best: X 11, O fans; O wins 29.7%.

> X: (3 2) bar-2 17-20

O 55, X fans: X wins 32%

> O: (2 6) bar-23 24-18

In all the above cases, both players seem right to wait since at best

they can lose their market by a minuscule amount; surely the chance of

an opponent error is better.

X 44, O fans: O wins 28.2%. This is the first significant market loser

possible. In fact, if O had fanned after X's actual 22, O would have

had 28.4%. So it's still pretty close to a take; and there's always the

chance that O would take.

> X: (2 2) 16-18 18-20 19-21 19-21

> O: (2 3) bar-23 13-10

OK, now here's the first situation where a double seems clearly correct.

According to JF, O's winning chances are only 35% before the roll, so

it's getting pretty close to a drop. X has a clearly superior blockade,

O's blot on the ace point is a serious drawback, and there are some

market losers. Escaping with 55 results in O having around 25% chances

(depending on his reply); 53, hitting, followed by a fan, results in 23%

chances. At this point, the best chance for O to make an error is to

double him NOW and hope he drops.

> X: (5 2) 12-17 12-14

> O: (3 1) 10-7 8-7

Now, with O having made his bar and leaving no shots, it's probably not

necessary to double anymore.

> X: (6 5) 14-20 17-22

> O: (4 5) 13-8 13-9

Now X has some shots again. If X hits and O fans, O's chances will only

be 23.8%.

> X: (4 6) 12-18 18-22

> O: (6 2) 13-7 13-11

Now, if X hits and O fans, O's chances are only 17%!

> X: (3 4) 2-5 5-9

> O: (1 2) bar-24 11-9

Again, we're talking serious market losers here, if X hits.

> X: (3 3) bar-3 2-5 19-22 12-15

BTW, Jellyfish likes shifting points with 21/24(2) about 9% better.

> O: (5 2) 7-5 8-3

> X: doubles

> O: accepts

>

Interestingly, Jellyfish liked O's "desperation" play of hitting twice

and leaving 3 home-board blots, and estimates O with 32% winning

chances, so a correct take at this score! (Not for money, of course,

because X wins tons of gammons.)

So: there were some early rolls where it seemed definitely right for

both players to wait, from a tactical point of view. Then there were a

few rolls where it seemed that X was definitely risking too much by

waiting. And after all that, the final position was a correct double and

a correct take!

Whew!

Ron

Dec 10, 1996, 3:00:00 AM

James Eibisch (jeibisch@revolver.*demon.co.uk) wrote:

: Would some experts like to comment on this -2 -2 game? Neither of us

: doubled immediately, and it took some time for a doubling situation to

: arise, as far as I can see.

hmmm? It's not an error to double immediately. I've seen 2 experts

do this.

: I don't double immediately at -2 -2 as I want to exploit any error my

: opponent might make by not doubling at this score when he gains an

: advantage.

I guess you mean you'd like the opportunity to drop if your opponent

doubles late? If you are clearly the stronger player, perhaps this is a

good strategy. However, if you are equal players, dropping leaves you

with a 30% match equity, and if your opponent misses his double by

a wide margin, perhaps you are gammoned to lose the match anyhow. If

you are the weaker player, there's not much to exploit. AND, in

attempting to "exploit" your opponent's error, you significantly

increase the chance that when you finally double, your opponent drops.

Or, maybe you had something else in mind here?

: I'm X. I haven't re-played this match, and perhaps O should have doubled

: before I did, but I wasn't happy doubling until I did.

: X: (4 1) 12-16 1-2

Caveat: Mine are distinctly "non expert views!"

O has a double here. For example (4 4) 6-2*/2 24-16*

followed by a dance looks like a market loser to me.

This may still be a money take, but 30% is the take

point for X at this match score.

: O: (4 1) 6-2* 2-1*

: X: (2 4) bar-4 bar-2

: O: (2 2) 6-4* 6-4 24-22 22-20

: X: (3 2) bar-2 17-20*

: O: (2 6) bar-23 24-18

X has a double here. You actually rolled the first

1/2 of a market loser in my view!! You must have had

some second thoughts when 22 popped out of the cup!

A dance by O would have completed the market loser

in my NON expert view. Indeed, if O fails to anchor

on his/her roll, it is still close to a market loser.

: X: (2 2) 16-18 18-20 19-21 19-21 Dynamite roll!

I'd venture that the "exploitor" has just become

the "exploitee". ;:)) wcb on FIBS

Dec 11, 1996, 3:00:00 AM

>James Eibisch wrote:

(snip)

>> I don't double immediately at -2 -2 as I want to exploit any error my

>> opponent might make by not doubling at this score when he gains an

>> advantage.

(snip)

In article <32ACE0...@best.com>, Ron Karr <ka...@best.com> wrote:

(snip)

>...technically, if you know your opponent will never lose his market,

>then it's never correct to risk losing your market by even a little.

>Therefore, you may as well double at your first opportunity and get it

>over with.

>However, this doesn't take into account the fact that quite often we

>play actual human beings (as opposed to computers), and they don't

>always judge the situation correctly. As James said, he wanted to give

>his opponent a chance to make an error, and this is usually a good

>idea! If you double immediately (as Jellyfish does, even as a huge

>underdog), your opponent has zero chance to make an error. If you wait

>til you are closer to his take point, at some risk of losing your

>market, he may erroneously pass, or even if you do lose your market, he

>may erroneously take, thus turning your error into a great play.

(snip game and commentary)

>> O: (2 3) bar-23 13-10

>OK, now here's the first situation where a double seems clearly correct.

>According to JF, O's winning chances are only 35% before the roll, so

>it's getting pretty close to a drop. X has a clearly superior blockade,

>O's blot on the ace point is a serious drawback, and there are some

>market losers. Escaping with 55 results in O having around 25% chances

>(depending on his reply); 53, hitting, followed by a fan, results in 23%

>chances. At this point, the best chance for O to make an error is to

>double him NOW and hope he drops.

>

(snip the remainder of game and commentary)

Congratulations to both James and Ron for using their heads (that

is, not playing like mindless robots) AND having the guts to question

the "of course it's right to double at your first opportunity" rule

when the score is -2,-2.

James said that by not doubling, you give your opponent a chance

to err by doubling too late (losing his/her market). Good point.

Ron says that by waiting until you are FAVORED and close to

your opponent's drop/take line, you give him/her the chance to err

and make the wrong cube decision. Another good point.

I have seen "proofs" that it is correct to double. I have yet

to see a "proof" that didn't have major flaws (IMHO, of course).

Count me among the skeptics. Ron's analysis of this game is the

best analysis that I have seen on the -2,-2 score. Admittedly it

is only a single game, and it certainly doesn't prove that waiting

is correct (and note that Ron didn't claim it was a proof, either).

Keep up the good work, guys.

Chuck

bo...@bigbang.astro.indiana.edu

c_ray on FIBS

p.s. My "mindless robot" slam was not pointed towards JF, TD-G, or

Loner. I consider them "mindful"!

Dec 12, 1996, 3:00:00 AM

Chuck Bower <bo...@bigbang.astro.indiana.edu> wrote in article

<58nh4c$9...@dismay.ucs.indiana.edu>...

> I have seen "proofs" that it is correct to double. I have yet

> to see a "proof" that didn't have major flaws (IMHO, of course).

> Count me among the skeptics. Ron's analysis of this game is the

> best analysis that I have seen on the -2,-2 score. Admittedly it

> is only a single game, and it certainly doesn't prove that waiting

> is correct (and note that Ron didn't claim it was a proof, either).

The reason the favorite doubles at -2, -2 is that his advantage

applies to the entire match if he doubles, whereas his advantage

is 70% diluted if he does not double. (70% being the chance of

winning the match from the Crawford game.)
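The dilution can be checked with a line of arithmetic. A sketch, assuming equal players and the 70%/30% Crawford figures quoted above (undoubled gammons, which would also win the match, are ignored for simplicity):

```python
# Favorite's match-winning chances at -2,-2 as a function of his
# game-winning chances p, with and without the double.

def doubled(p):
    # Cube on 2: whoever wins the game wins the match.
    return p

def undoubled(p, crawford=0.70):
    # Cube stays on 1: winning the game only yields a 70% Crawford lead;
    # losing it leaves 30%.
    return p * crawford + (1 - p) * (1 - crawford)

for p in (0.55, 0.60, 0.65):
    print(p, doubled(p), round(undoubled(p), 3))
# At p = 0.60: 0.60 doubled vs 0.54 undoubled -- the game advantage is
# worth far less when a win only reaches the Crawford game.
```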

Let's take Ron's two cases, and consider them carefully.

> If you wait til you are closer to his take point he may erroneously pass

An erroneous pass is, of course, good for you, since you turn a

sub-70% chance of winning the match into a 70% chance of winning

the match. This is a good thing.

> even if you do lose your market, he may erroneously take, thus turning

> your error into a great play.

But this error makes no difference, since an erroneous take is merely

a transposition to the same situation where you have doubled before

and he properly took, then got unlucky. All you accomplished here is

to get a break that exactly compensated for your error in waiting so

long to double.

So the error you are hoping for is for the opponent to pass when

he should not. The other error (taking when he should not) does not

improve your chances of winning the match compared to doubling

immediately.

It follows that a savvy opponent can make a hope-for-an-error strategy

moot by simply taking all doubles at -2, -2. Surprisingly, this "take-

everything" defense reduces the hope-for-an-error strategy to exactly

the winning percentage of the double-immediately strategy. The existence

of such a strategy for the underdog makes the hope-for-an-error strategy

seem dubious to me.

Of course you can come up with examples where the opponent dropped

when he should have taken. Heck, I'm sure I will make that mistake

fairly often. But will you have enough cases to compensate

for your market losers? I dunno...

I continue to regard the ideal strategy to be to double if you

have any market losers. In other words, if there is *any* chance you

will lose your market, then you should double.

Brian

Dec 13, 1996, 3:00:00 AM

Chuck Bower <bo...@bigbang.astro.indiana.edu> wrote in article

<58nh4c$9...@dismay.ucs.indiana.edu>...

> I have seen "proofs" that it is correct to double. I have yet

> to see a "proof" that didn't have major flaws (IMHO, of course).

> Count me among the skeptics. Ron's analysis of this game is the

> best analysis that I have seen on the -2,-2 score. Admittedly it

> is only a single game, and it certainly doesn't prove that waiting

> is correct (and note that Ron didn't claim it was a proof, either).

A proof was pointed out by Fredrik Dahl:

Make it a proposition: We play 2-pointers with the following strategy:

You don't double, but I double. If you win more than 50%, then it is only

because you are stronger.

All these 2-point-matches will become double match point, of course.

Alexander (acey_deucey@FIBS)

Dec 13, 1996, 3:00:00 AM

Brian Sheppard wrote:

>

> Let's take Ron's two cases, and consider them carefully.

>

> > If you wait til you are closer to his take point he may erroneously pass

>

> An erroneous pass is, of course, good for you, since you turn a

> sub-70% chance of winning the match into a 70% chance of winning

> the match. This is a good thing.

>

> > even if you do lose your market, he may erroneously take, thus turning

> > your error into a great play.

>

> But this error makes no difference, since an erroneous take is merely

> a transposition to the same situation where you have doubled before

> and he properly took, then got unlucky. All you accomplished here is

> to get a break that exactly compensated for your error in waiting so

> long to double.

>

> So the error you are hoping for is for the opponent to pass when

> he should not. The other error (taking when he should not) does not

> improve your chances of winning the match compared to doubling

> immediately.

>

Yes, Brian is absolutely right that if the opponent takes incorrectly

after you've waited too long to double, it isn't a gain, compared to

doubling immediately. What I should have said was "if he takes when he

should have dropped, you haven't lost anything by waiting." It's simply

something that makes "losing your market" not as costly as it would be

otherwise.

In fact, I think the other error, that O will drop erroneously, is also

not the big reason to delay doubling, although it contributes. The main

reason is that he may fail to double if HE has market losers. Then

there's a chance that I may be able to get out for 1 point in a

situation where I have less than 30% winning chances. I know it's

happened numerous times against weaker players. (In fact, if I'm

playing a weaker player, my take point is actually higher than 30%, and

his is lower, so it's going to be easier for him to lose his market, at

least in relatively non-skill positions.)

How do you know whether there's a chance any particular opponent will

wait too long to double? You don't for sure, but there's a pretty good

indication: if you fail to double when you have NO market losers (as

appeared to be the case in the first few moves of the game James

reported), you get a chance to see whether he doubles you! If he

doesn't, there's at least the possibility that he might wait too long.

Then you can decide whether to risk waiting when you have ANY market

losers.

And yes, I totally agree that if your opponent is going to double right

away, there's no gain in waiting yourself.

Ron

Dec 13, 1996, 3:00:00 AM

In article <32B16A...@mailszrz.zrz.tu-berlin.de>,

Alexander Nitschke <nits...@mailszrz.zrz.tu-berlin.de> wrote:

>

>A proof (that doubling at your first opportunity is correct at

>2-away, 2-away) was pointed out by Fredrik Dahl:

>

>Make it a proposition: We play 2-pointers with the following strategy:

>You don't double, but I double. If you win more than 50%, than only

>because you are stronger.

>All these 2-point-matches will become double match point, of course.

>

OK, last night I went to INDY to play in the club game and didn't

get home until 3:00 AM. Got up at 7:30 and put in a full day's work.

As a result, my mind is a bit slow at the moment, BUT

Would someone PLEASE explain how this is "proof" that it is correct

to double immediately at -2, -2? PLEASE, PLEASE, PLEASE!!

Dec 14, 1996, 3:00:00 AM

In article <01bbe83e$c3771bc0$3ac0...@polaris.mstone.com>,

Brian Sheppard <bri...@mstone.com> wrote:

>

(regarding the idea of NOT doubling until you are a substantial

favorite--near 70%--in this game. CRB)

(snip)

>So the error you are hoping for is for the opponent to pass when

>he should not.

>

>It follows that a savvy opponent can make a hope-for-an-error strategy

>moot by simply taking all doubles at -2, -2. Surprisingly, this "take-

>everything" defense reduces the hope-for-an-error strategy to exactly

>the winning percentage of the double-immediately strategy. The existence

>of such a strategy for the underdog makes the hope-for-an-error strategy

>seem dubious to me.

>

I agree that this strategy does reduce the situation to being

equivalent to double-take at the first opportunity. But if THAT strategy

(i.e. double at first opportunity) is wrong, then the above proposed

strategy is also wrong.

Interested r.g.bg readers, please take a look at the following

hypothetical problem and see if it sheds light on Brian's strategy:

Suppose an LV casino was feeling generous (ha, ha) and set up the

following game: You bet $1 on the outcome of one roll of a single

die. You have your choice before rolling of betting on 1-6 and

receiving $3 if either 1 or 6 comes up, OR you can bet 2-3-4-5 and

receive $1.50 if any of those numbers comes up. If one of your

chosen numbers does NOT come up, the house keeps your $1.

Suppose it's Christmastime (hey, it is!) and the casino is feeling

downright charitable (HO, HO, HO!) and they make the following small

change in the rule: after you roll the die, but before seeing

what number wins, they will tell you one number which DIDN'T

come up. Then, if you like, you can CHANGE your choice (from 1-6

to 2-3-4-5, or vice versa). Would a "savvy opponent" decline their

offer?
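For reference, both of the basic bets are exactly fair (equity 0). This can be checked with exact rational arithmetic; the `equity` helper below is a hypothetical function written for this post:

```python
# Exact equity of the two base bets, using Fraction to avoid float rounding.

from fractions import Fraction

def equity(winning_numbers, payout, stake=Fraction(1)):
    # Expected receipts over six equally likely faces, minus the $1 stake.
    return Fraction(len(winning_numbers), 6) * payout - stake

print(equity({1, 6}, Fraction(3)))           # 0: bet on 1-6, paid $3
print(equity({2, 3, 4, 5}, Fraction(3, 2)))  # 0: bet on 2-3-4-5, paid $1.50
```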

Dec 14, 1996, 3:00:00 AM

Chuck Bower (bo...@bigbang.astro.indiana.edu) wrote:

: Would someone PLEASE explain how this is "proof" that it is correct

: to double immediately at -2, -2. PLEASE, PLEASE, PLEASE!!

Let's try it this way:

Suppose you and I are playing a match, and we reach -2, -2. My strategy

is as follows:

When it is my turn to roll, I will examine all 1296 possible outcomes of

the next (I roll, you roll) sequence. If there is as much as one

possible outcome which would cause me to lose my market, I will double.

If there are no such outcomes, I will not double.
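Kit's rule can be sketched in a few lines. The helpers `play` (apply the best plays for a two-roll sequence) and `opponent_has_take` (evaluate the resulting position) are hypothetical stand-ins for what a real program like Jellyfish would supply:

```python
# Sketch of the "double iff any of the 1296 sequences loses the market" rule.
# `play` and `opponent_has_take` are hypothetical callables, not library code.

from itertools import product

ROLLS = [(a, b) for a in range(1, 7) for b in range(1, 7)]  # 36 ordered rolls

def should_double(position, play, opponent_has_take):
    for my_roll, opp_roll in product(ROLLS, ROLLS):  # 36 * 36 = 1296 sequences
        after = play(position, my_roll, opp_roll)
        if not opponent_has_take(after):
            return True   # at least one market loser exists: double now
    return False          # no sequence loses the market: safe to wait
```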

Now, suppose you use ANY strategy which is more conservative in doubling

than my strategy. Then, I claim that you are at a disadvantage. My

strategy guarantees that it is impossible for me to lose my market. Your

more conservative strategy allows the possibility of you losing your

market. Therefore, the only possible outcomes of the game are:

1) The cube is turned, accepted, and the game is played for the match.

2) You double, and I correctly pass (when you have lost your market).

If 2) happens, you wish you had doubled earlier. Otherwise, there is no

difference. Consequently, doubling immediately must be superior to a

strategy which risks losing one's market.

Kit

Dec 16, 1996, 3:00:00 AM

In article <58nh4c$9...@dismay.ucs.indiana.edu>, bo...@bigbang.astro.indiana.edu (Chuck Bower) writes:

|>

|> James said that by not doubling, you give your opponent a chance

|> to err by doubling too late (losing his/her market). Good point.

|>

|> Ron says that by waiting until you are FAVORED and close the

|> your opponents drop/take line, you give him/her the chance to error

|> and make the wrong cube decision. Another good point.

I fully agree with the two above points, let me add a third good point:

By waiting until you are close to your opponent's drop/take line you

give *yourself* the chance to err and lose your market.

And a fourth point:

By waiting to double, you must at every play consider the cube, whereas

if you double immediately, you can concentrate fully on the checker play.

In other words you should consider who is the most likely to make a cube error.

Speaking for myself, this kind of reasoning very often leads to early

doubles at -2, -2 :-)

--

stein....@fou.telenor.no

... signature funny quote (and more) at http://www.nta.no/brukere/stein

Dec 16, 1996, 3:00:00 AM

<58usil$c...@dismay.ucs.indiana.edu>...

This is a well-known problem (the "Monty Hall problem," named after

the "Let's Make A Deal" host). The answer to the problem depends on

the strategy used in selecting the number that the casino will tell you.

If you make no assumption about the distribution then you cannot

do better than to achieve your original equity (which was 0 no matter

what your choice was).

If you make different assumptions (for example, that the casino will

randomly choose a number to tell you) then you have a different answer.

But I do not see the relevance of this problem to the -2, -2 situation.

Can you clarify, please?

Brian

Dec 16, 1996, 3:00:00 AM

>Chuck Bower <bo...@bigbang.astro.indiana.edu> wrote in article

><58usil$c...@dismay.ucs.indiana.edu>...

>>

>>Suppose an LV casino was feeling generous (ha, ha) and set up the

>>following game: You bet $1 on the outcome of one roll of a single

>>die. You have your choice before rolling of betting on 1-6 and

>>receiving $3 if either 1 or 6 comes up, OR you can bet 2-3-4-5 and

>>receive $1.50 if any of those numbers comes up. If one of your

>>chosen numbers does NOT come up, the house keeps your $1.

>>Suppose it's Christmastime (hey, it is!) and the casino is feeling

>>downright charitable (HO, HO, HO!) and they make the following small

>>change in the rule: after you roll the die, but before seeing

>>what number wins, they will tell you one number which DIDN'T

>>come up. Then, if you like, you can CHANGE your choice (from 1-6

>>to 2-3-4-5, or vice versa). Would a "savvy opponent" decline their

>>offer?

In article <01bbeb61$d7fc1e80$3ac0...@polaris.mstone.com>,

Brian Sheppard <bri...@mstone.com> wrote:

>This is a well-known problem (the "Monty Hall problem," named after

>the "Let's Make A Deal" host). The answer to the problem depends on

>the strategy used in selecting the number that the casino will tell you.

>

>If you make no assumption about the distribution then you cannot

>do better than to achieve your original equity (which was 0 no matter

>what your choice was).

>

>If you make different assumptions (for example, that the casino will

>randomly choose a number to tell you) then you have a different answer.

>

>But I do not see the relevance of this problem to the -2, -2 situation.

>Can you clarify, please?

>

>Brian

Firstly, although I've heard of Monty Hall (host of TV's "Let's Make

a Deal"), I've never heard of the "Monty Hall PROBLEM". I thought the

above up on Saturday to try and illustrate a point (by analogy to the

-2, -2 doubling strategy) but apparently failed miserably.

Secondly, I don't understand what you (Brian) mean when you say:

"If you make no assumption about the distribution then you cannot

do better than to achieve your original equity". My feeling is that you

should ACCEPT the casino offer and then change your bet as follows:

Case 1: You originally chose 1-6. Do the following:

number casino says DIDN'T come up: your action:

1 change to 2-3-4-5

2 stay with 1-6

3 stay with 1-6

4 stay with 1-6

5 stay with 1-6

6 change to 2-3-4-5

Case 2: you originally chose 2-3-4-5. Do the following:

number casino says DIDN'T come up: your action:

1 stay with 2-3-4-5

2 change to 1-6

3 change to 1-6

4 change to 1-6

5 change to 1-6

6 stay with 2-3-4-5

In either case, your expected return is $1.50 so your equity is +$0.50.

(I agree with you that your equity is 0 if you aren't offered the

"Christmastime bonus".)

NOW, I explain why I made this analogy in the first place. Brian

proposed (please correct me if I'm wrong) that if a player tried to get

around the "double at first opportunity when playing a match which

reaches the -2,-2 score" rule by waiting until his/her lead was close to

the drop/take line, the opponent should always take. That way the opponent

would be exactly where he/she would have been if the offerer of the

double had actually doubled at first opportunity (and the receiver of the

double took then).

My contention is that the person has a decision to make as to whether

or not to take this cube (near the drop/take line) and s/he should try and

TAKE ADVANTAGE OF ALL OF THE INFORMATION PRESENTED. Decide whether the

position offers him/her better than a 30% chance in this game. If better,

then take, otherwise pass. This seems better than just blindly accepting.

My analogy was SUPPOSED to illustrate a case where the casino player

could either decline the casino's generous offer (in which case his/her

equity would remain 0), OR TAKE ADVANTAGE OF ALL THE INFORMATION PRESENTED

and up the player's equity to +$0.50.

Dec 16, 1996, 3:00:00 AM

Ron Karr <ka...@apple.com> wrote in article <32B150...@apple.com>...

> How do you know whether there's a chance any particular opponent will

> wait too long to double? You don't for sure, but there's a pretty good

> indication: if you fail to double when you have NO market losers (as

> appeared to be the case in the first few moves of the game James

> reported), you get a chance to see whether he doubles you! If he

> doesn't, there's at least the possibility that he might wait too long.

> Then you can decide whether to risk waiting when you have ANY market

> losers.

>

I agree that there is no point in doubling if you have no market

losers.

But the question is the other way: when you do have market losers,

when should you double? I say that if you always double, then you

can't lose anything.

How does your strategy differ?

Brian

Dec 17, 1996, 3:00:00 AM

bo...@bigbang.astro.indiana.edu (Chuck Bower) writes:

> Secondly, I don't undertand what you (Brian) mean when you say:

> "If you make no assumption about the distribution then you cannot

> do better than to achieve your original equity". My feeling is that you

> should ACCEPT the casino offer and then change your bet as follows:

>

> Case 1: You originally chose 1-6. Do the following:

>

> number casino says DIDN'T come up: your action:

>

> 1 change to 2-3-4-5

> 2 stay with 1-6

> 3 stay with 1-6

> 4 stay with 1-6

> 5 stay with 1-6

> 6 change to 2-3-4-5

>

> Case 2: you originally chose 2-3-4-5. Do the following:

>

> number casino says DIDN'T come up: your action:

>

> 1 stay with 2-3-4-5

> 2 change to 1-6

> 3 change to 1-6

> 4 change to 1-6

> 5 change to 1-6

> 6 stay with 2-3-4-5

>

> In either case, your expected return is $1.50 so your equity is +$0.50.

> (I agree with you that your equity is 0 if you aren't offered the

> "Christmastime bonus".)

As far as I can tell, you *are* making assumptions about the strategy

employed by the casino in choosing which non-roll to declare, though

it's not clear exactly what your assumption is. If we assume that the

casino's strategy is to choose at random between the five numbers not

rolled, then your strategy gives you an equity of $0.20 per game,

*not* $0.50. (Calculation left as an exercise for the reader :)
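The exercise can be done by exact enumeration. The sketch below assumes, as stated, that the casino reveals a non-rolled number uniformly at random, and applies Chuck's Case 1 rule (start on 1-6; switch to 2-3-4-5 only when the revealed number is a 1 or a 6):

```python
# Exact enumeration of the switching strategy under the uniform-reveal
# assumption: 6 rolls x 5 possible reveals, each pair equally likely.

from fractions import Fraction

def switching_equity():
    total = Fraction(0)
    for roll in range(1, 7):
        for revealed in set(range(1, 7)) - {roll}:
            # Case 1 rule: switch to 2-3-4-5 only if a 1 or 6 is ruled out.
            bet, payout = (({2, 3, 4, 5}, Fraction(3, 2))
                           if revealed in (1, 6) else ({1, 6}, Fraction(3)))
            net = (payout if roll in bet else Fraction(0)) - 1
            total += Fraction(1, 30) * net
    return total

print(switching_equity())  # 1/5, i.e. +$0.20 per game, not +$0.50
```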

Suppose instead that the casino used the following strategy: if a 1 or

6 is rolled, declare the other (6 or 1) as the number not rolled; if a

2, 3, 4 or 5 is rolled, declare any other 2-3-4-5 as the number not

rolled. Then your strategy loses every time, and your equity is -$1.

I think Brian is correct: if you are not allowed to make any

assumption about the casino's strategy, how can you improve your equity

above the original $0?

Back to the subject of -2 vs -2 matches:

> NOW, I explain why I made this analogy in the first place. Brian

> proposed (please correct me if I'm wrong) that if a player tried to get

> around the "double at first opportunity when playing a match which

> reaches the -2,-2 score" by waiting until his/her lead was close the

> drop take line, the opponent should always take. That way the opponent

> would be exactly to where he/she would have been if the offerer of the

> double had actually doubled at first opportunity (and the receiver of the

> double took then).

>

> My contention is that the person has a decision to make as to whether

> or not to take this cube (near the drop/take line) and s/he should try and

> TAKE ADVANTAGE OF ALL OF THE INFORMATION PRESENTED. Decide whether the

> position offers him/her better than a 30% chance in this game. If better,

> then take, otherwise pass. This seems better than just blindly accepting.

Yes, it is obvious that the player being doubled can do better than

just blindly accepting. I'll paraphrase what I understood as Brian's

original point: We agree that this sub-optimal strategy on the part of

the player being doubled effectively restores the situation to what it

would have been if the doubler had doubled at first chance instead of

waiting. So if the player being doubled does not just accept blindly,

but makes an optimal decision, then the doubler is no better off (but

could well be worse off) than if he had doubled at first chance. (Of

course, if the player being doubled employs a *different* sub-optimal

strategy and passes a take, then the doubler could have gained by

waiting.)

It seems to me this argument is essentially just spelling out the

accepted wisdom that losing your market is a Bad Thing. It does not

deal with the other side of the coin, that doubling early and then

seeing the game turn around is also a Bad Thing. What's special about

the match score -2 vs -2 is to do with the reduced effect of the

latter Bad Thing, because the cube is dead. But like Chuck, I fail to

be convinced by any of the "proofs" we have seen so far that doubling

immediately is never wrong.

Would someone care to state the exact conditions and the exact

"theorem" that is being proposed? I seem to remember seeing an

article (perhaps by Walter Trice?) in an early issue of Leading Edge

treating this question with some mathematical rigour.

-- Ole Hoegh Jensen (hoegh on fibs)

Dec 17, 1996, 3:00:00 AM

<594bnh$5...@dismay.ucs.indiana.edu>...

> Secondly, I don't understand what you (Brian) mean when you say:

> "If you make no assumption about the distribution then you cannot

> do better than to achieve your original equity". My feeling is that you

> should ACCEPT the casino offer and then change your bet as follows:

>

> Case 1: You originally chose 1-6. Do the following:

>

> number casino says DIDN'T come up: your action:

>

> 1 change to 2-3-4-5

> 2 stay with 1-6

> 3 stay with 1-6

> 4 stay with 1-6

> 5 stay with 1-6

> 6 change to 2-3-4-5

>

> Case 2: you originally chose 2-3-4-5. Do the following:

>

> number casino says DIDN'T come up: your action:

>

> 1 stay with 2-3-4-5

> 2 change to 1-6

> 3 change to 1-6

> 4 change to 1-6

> 5 change to 1-6

> 6 stay with 2-3-4-5

>

> In either case, your expected return is $1.50 so your equity is +$0.50.

> (I agree with you that your equity is 0 if you aren't offered the

> "Christmastime bonus".)


Not if I am running the casino! Let's suppose the casino follows

this strategy:

Number that came up    Number the casino tells you

        1                          6

        2                          5

        3                          4

        4                          3

        5                          2

        6                          1

Please verify that if the casino follows this strategy, then your

strategy loses on every single occasion. (Of course, if I were really

running the casino, then I would let you win sometimes :-))

This is why I said, "If you make no assumption about the distribution

then you cannot do better than to achieve your original equity".
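Brian's mirror strategy is easy to check by enumeration. This sketch (not from the thread) pits it against the switching strategy described above:

```python
# Brian's casino strategy: always announce the "mirror" number, 7 - roll.
# Against it, the switching strategy from upthread loses on every roll.

def announced_by_casino(roll):
    return 7 - roll          # 1<->6, 2<->5, 3<->4

def switched_bet(announced):
    return "2-3-4-5" if announced in (1, 6) else "1-6"

SETS = {"1-6": {1, 6}, "2-3-4-5": {2, 3, 4, 5}}

losses = sum(1 for roll in range(1, 7)
             if roll not in SETS[switched_bet(announced_by_casino(roll))])
print(losses)   # 6 -- every roll loses, so equity is -$1 per game
```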

> NOW, I explain why I made this analogy in the first place. Brian

> proposed (please correct me if I'm wrong) that if a player tried to get

> around the "double at first opportunity when playing a match which

> reaches the -2,-2 score" by waiting until his/her lead was close to the

> drop/take line, the opponent should always take. That way the opponent

> would be exactly where he/she would have been if the offerer of the

> double had actually doubled at first opportunity (and the receiver of the

> double took then).

I have proven that if the opponent simply takes every double, regardless

of how lopsided the position, then his equity is equal to what it would

have been had you doubled immediately.

> My contention is that the person has a decision to make as to whether

> or not to take this cube (near the drop/take line) and s/he should try and

> TAKE ADVANTAGE OF ALL OF THE INFORMATION PRESENTED. Decide whether the

> position offers him/her better than a 30% chance in this game. If better,

> then take, otherwise pass. This seems better than just blindly accepting.

I agree 100%. It must be better than blindly accepting. And since it

is your opponent that has the option, then you must have made a mistake

by not doubling earlier, right?

Brian

Dec 17, 1996, 3:00:00 AM

In article <yvtpw09...@iris.cl.cam.ac.uk>,

Ole Jensen <oh...@cl.cam.ac.uk> wrote:

(snip)

>I think Brian is correct: if you are not allowed to make any

>assumption about the casino's strategy, how can you improve your equity

>above the original $0?

>

(snip)

OOPS!!! Thanks to your and Brian's messages, I now realize an error

I have made in my proposed "casino game". In my mind, I was thinking

that the casino (in the Christmastime bonus option) is telling the

player a number which is on the SIDE of the die (i.e. neither up nor

down). Then I think my strategy of changing your choice is correct

(as is the equity; please check me on that).

As far as the problem which I actually STATED: If they pick a number

at random, then effectively you are playing with a five sided die.

If they throw out 1 or 6, your equity is:

you choose 1-6 1/5 X $3 = $0.60 return (equity -$0.40)

you choose 2-3-4-5 4/5 X $1.50 = $1.20 return (equity +$0.20)

So go with 2-3-4-5 after the bonus. If they throw out a 2,3,4, or 5

then things look like:

you choose 1-6 2/5 X $3 = $1.20 return (equity +$0.20)

you choose 2-3-4-5 3/5 X $1.50 = $0.90 return (equity -$0.10)

In this case go with 1-6 after the bonus. Either way you net +$0.20.

Of course the casino could be nasty (what, a casino?? surely not!)

and figure that some players will choose the above strategy. Then

they could foil your plan and tell you "1" when it is actually 6

(or vice versa). Likewise they could tell you that a 2, 3, 4, or 5 is

NOT up when one of the others is up. Then you switch to 1-6 and

lose. In these cases your equity is -$1 (you lose every time).

In that case your best strategy is to never change (as I believe Brian

recommended) and just get your net 0 equity. But, hey, I said

the casino was feeling "generous" and it was "Christmastime". Would

they do such a dastardly thing at Christmas?

>Would someone care to state the exact conditions and the exact

>"theorem" that is being proposed? I seem to remember seeing an

>article (perhaps by Walter Trice?) in an early issue of Leading Edge

>treating this question with some mathematical rigour.

>

>-- Ole Hoegh Jensen (hoegh on fibs)

Kit did this in a message this weekend. I am responding to that

message shortly. You can go find Kit's post or wait for my reply

(which will have his argument in it).

Dec 17, 1996, 3:00:00 AM

In article <kwoolseyE...@netcom.com>,

Kit Woolsey <kwoo...@netcom.com> wrote:

>

>Let's try it this way:

>

>Suppose you and I are playing a match, and we reach -2, -2. My strategy

>is as follows:

>

>When it is my turn to roll, I will examine all 1296 possible outcomes of

>the next (I roll, you roll) sequence. If there is as much as one

>possible outcome which would cause me to lose my market, I will double.

>If there are no such outcomes, I will not double.

>

>Now, suppose you use ANY strategy which is more conservative in doubling

>than my strategy. Then, I claim that you are at a disadvantage. My

>strategy guarantees that it is impossible for me to lose my market. Your

>more conservative strategy allows the possibility of you losing your

>market. Therefore, the only possible outcomes of the game are:

>

>1) The cube is turned, accepted, and the game is played for the match.

>2) You double, and I correctly pass (when you have lost your market).

>

>If 2) happens, you wish you had doubled earlier. Otherwise, there is no

>difference. Consequently, doubling immediately must be superior to a

>strategy which risks losing one's market.

>

>Kit
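Kit describes the rule only in prose; as a sketch it might look like the following. Everything except the rule itself is a made-up stand-in: the 30% take point, the pip-swing "evaluator" in step(), and the 0.005 scaling are illustrative assumptions, not real backgammon values.

```python
from itertools import product

TAKE_POINT = 0.30          # assumed last-take winning chance at this score

ROLLS = [(d1, d2) for d1 in range(1, 7) for d2 in range(1, 7)]  # 36 rolls each

def loses_market(p_after):
    # Market is lost when the opponent's chances fall below his take point,
    # i.e. he would have a correct pass if doubled then.
    return (1.0 - p_after) < TAKE_POINT

def step(p, my_roll, opp_roll):
    # Toy evaluator: shift the win probability by the pip swing of the
    # exchange (doubles count twice), scaled by an arbitrary 0.005.
    pips = lambda r: (r[0] + r[1]) * (2 if r[0] == r[1] else 1)
    return min(1.0, max(0.0, p + 0.005 * (pips(my_roll) - pips(opp_roll))))

def should_double(p_now, my_rolls, opp_rolls, evaluate):
    # Kit's rule: double iff at least one of the 1296 two-roll sequences
    # would lose your market.
    return any(loses_market(evaluate(p_now, r1, r2))
               for r1, r2 in product(my_rolls, opp_rolls))

print(should_double(0.65, ROLLS, ROLLS, step))  # True: 6-6 vs 1-2 swings past the pass point
print(should_double(0.50, ROLLS, ROLLS, step))  # False: no sequence reaches it
```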

Now we are getting somewhere. I believe what Kit has shown is

that IF one player follows the "never lose your market" strategy, the

OTHER player can never do better. However, it looks like if one

player does NOT choose the "never lose your market" strategy, his/her

opponent may be able to find a better strategy than "never lose your

market". Is the following an example of such? (Let's assume that

the players are of "equal" skill in checker play ability.)

Assume player B decides (erroneously?) that s/he should not

double until a 70% favorite, AND that s/he should drop with 45%

or less game winning chances. Player A could choose the "never

lose your market" strategy OR could try the following strategy--

wait until a 55% favorite (or higher) and then double player B out.

It looks to me like this strategy will beat the "never lose your

market" strategy in the long run, even though occasionally player A

will go from less than 55% to more than 70% in a single roll.

Does anyone see a flaw in my argument? (Ouch, don't blast the

internet all at once with your flames!)

Dec 17, 1996, 3:00:00 AM

bo...@bigbang.astro.indiana.edu (Chuck Bower) writes:

> In article <yvtpw09...@iris.cl.cam.ac.uk>,

> Ole Jensen <oh...@cl.cam.ac.uk> wrote:

>

> >Would someone care to state the exact conditions and the exact

> >"theorem" that is being proposed? I seem to remember seeing an

> >article (perhaps by Walter Trice?) in an early issue of Leading Edge

> >treating this question with some mathematical rigour.

>

> Kit did this in a message this weekend.

Here's what Kit said:

: When it is my turn to roll, I will examine all 1296 possible

: outcomes of the next (I roll, you roll) sequence. If there is as

: much as one possible outcome which would cause me to lose my market,

: I will double. If there are no such outcomes, I will not double.

:

: Now, suppose you use ANY strategy which is more conservative in

: doubling than my strategy. Then, I claim that you are at a

: disadvantage.

This cannot be the whole story. Suppose you are on roll with your

last four checkers on your ace-point, and your opponent has his last

two checkers on his ace-point. Then you have six market-losing rolls,

and so according to Kit's strategy you should double.

It seems the correct strategy needs to take into account whether after

any of your own 36 rolls the opponent will have a cash. If that is

the case there is something to be gained from not doubling, and this

at least should be weighed against the possible gains from doubling.
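Rough numbers can be put on Ole's position. The figures below are assumptions, not from the thread: only doubles (6/36) bear off all four checkers from the ace point, any roll clears the opponent's two, and ~70% is a commonly quoted match-winning chance for the Crawford leader at -1, -2:

```python
P_WIN_GAME = 6 / 36     # you need doubles on this roll
LEADER_MWC = 0.70       # assumed MWC for whoever wins this game 1-away

# Double now: the cube is dead at -2,-2, the opponent takes (he wins 5/6),
# and the game decides the match.
mwc_double = P_WIN_GAME

# Hold the cube: win and lead the Crawford game, or miss, get doubled in
# a hopeless spot, and correctly pass to trail -2 vs -1.
mwc_hold = P_WIN_GAME * LEADER_MWC + (1 - P_WIN_GAME) * (1 - LEADER_MWC)

print(round(mwc_double, 3), round(mwc_hold, 3))   # 0.167 vs 0.367
```

Under these assumptions, doubling costs you roughly 20% match-winning chances, which is exactly the "something to be gained from not doubling" that a bare market-loser count misses.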

I believe the theorem I was asking for did have a premise taking this

problem into account.

Dec 18, 1996, 3:00:00 AM

Ole Jensen (oh...@cl.cam.ac.uk) wrote:

: bo...@bigbang.astro.indiana.edu (Chuck Bower) writes:

I didn't claim that my proposed strategy is the "best" strategy.

Obviously in your above example it would be wrong to double. Note,

however, that to have gotten to this position in the first place somebody

had to have made a previous error of not doubling when they should. All

I said was that if I adopt this strategy and you adopt a more

conservative strategy then I will have the best of it.

Kit

Dec 18, 1996, 3:00:00 AM

Chuck Bower (bo...@bigbang.astro.indiana.edu) wrote:

: In article <kwoolseyE...@netcom.com>,

: Kit Woolsey <kwoo...@netcom.com> wrote:

: >

: >Let's try it this way:

: >

: >Suppose you and I are playing a match, and we reach -2, -2. My strategy

: >is as follows:

: >

: >When it is my turn to roll, I will examine all 1296 possible outcomes of

: >the next (I roll, you roll) sequence. If there is as much as one

: >possible outcome which would cause me to lose my market, I will double.

: >If there are no such outcomes, I will not double.

: >

: >Now, suppose you use ANY strategy which is more conservative in doubling

: >than my strategy. Then, I claim that you are at a disadvantage. My

Nothing wrong with your argument. As a practical matter I don't always

practice what I preach unless I know for a fact that my opponent also

understands the concepts in question. For example, suppose my opponent

gets the opening roll, rolls 2-1, and plays 13/11, 6/5. I suppose that if

I roll 4-4 and he rolls 6-6 I would lose my market (not even sure about

that). However I still don't double. The cost of not doubling now is

insignificant. However the gain from not doubling (if my opponent doesn't

understand the concept) is large. Let's say I roll a weak response such

as 6-2. Now it is very clear for him to double -- he has a few pretty

hefty market-losing sequences. If he doesn't do so, I have gained. Also,

if he fails to double I will now know that he doesn't understand the

concept. In that case I may be willing to take small chances of losing my

market later in the game, secure in the knowledge that my opponent may

make even larger errors by not doubling when he has market losing

sequences.

Kit

Dec 18, 1996, 3:00:00 AM

kwoo...@netcom.com (Kit Woolsey) writes:

> Ole Jensen (oh...@cl.cam.ac.uk) wrote:

>

> : Here's what Kit said:

>

> : : When it is my turn to roll, I will examine all 1296 possible

> : : outcomes of the next (I roll, you roll) sequence. If there is as

> : : much as one possible outcome which would cause me to lose my market,

> : : I will double. If there are no such outcomes, I will not double.

> : :

> : : Now, suppose you use ANY strategy which is more conservative in

> : : doubling than my strategy. Then, I claim that you are at a

> : : disadvantage.

>

> I didn't claim that my proposed strategy is the "best" strategy.

> Obviously in your above example it would be wrong to double. Note,

> however, that to have gotten to this position in the first place somebody

> had to have made a previous error of not doubling when they should. All

> I said was that if I adopt this strategy and you adopt a more

> conservative strategy then I will have the best of it.

I didn't mean to attribute any false claims to you, Kit, but merely to

point out why I don't think your argument constitutes a proof, in any

mathematical sense, that some particular doubling strategy is optimal

at -2 vs -2. I don't think anyone in this thread has attempted to

state a precise enough definition of that strategy for it to even make

sense to discuss mathematical proof.

The comments that you and others have made on *practical* play at this

match score are valid and useful, and the question of mathematical

proof of some theorem (that assumes perfect play on both sides) may be

only marginally relevant from a practical point of view. I still find

that question interesting, though -- hence my request for the theorem

I thought I remembered someone proving in Leading Edge.

-- Ole (hoegh on fibs)

Dec 18, 1996, 3:00:00 AM

Kit Woolsey <kwoo...@netcom.com> wrote in article

<kwoolseyE...@netcom.com>...

> Nothing wrong with your argument. As a practical matter I don't always

> practice what I preach unless I know for a fact that my opponent also

> understands the concepts in question. For example, suppose my opponent

> gets the opening roll, rolls 2-1, and plays 13/11, 6/5. I suppose that if

> I roll 4-4 and he rolls 6-6 I would lose my market (not even sure about

> that). However I still don't double. The cost of not doubling now is

> insignificant. However the gain from not doubling (if my opponent doesn't

> understand the concept) is large. Let's say I roll a weak response such

> as 6-2. Now it is very clear for him to double -- he has a few pretty

> hefty market-losing sequences. If he doesn't do so, I have gained. Also,

> if he fails to double I will now know that he doesn't understand the

> concept. In that case I may be willing to take small chances of losing my

> market later in the game, secure in the knowledge that my opponent may

> make even larger errors by not doubling when he has market losing

> sequences.

I can agree to this practical treatment of the situation (doubling is right

in theory, but if I am an underdog then test the opponent).

Now the problem is this: how do I teach this to my computer?

Brian

Dec 24, 1996, 3:00:00 AM

In article <kwoolseyE...@netcom.com> kwoo...@netcom.com (Kit Woolsey) writes:

>Chuck Bower (bo...@bigbang.astro.indiana.edu) wrote:

>

>: Would someone PLEASE explain how this is "proof" that it is correct

>: to double immediately at -2, -2. PLEASE, PLEASE, PLEASE!!

>

>

>Let's try it this way:

>

>Suppose you and I are playing a match, and we reach -2, -2. My strategy

>is as follows:

>

>When it is my turn to roll, I will examine all 1296 possible outcomes of

>the next (I roll, you roll) sequence. If there is as much as one

>possible outcome which would cause me to lose my market, I will double.

>If there are no such outcomes, I will not double.

>


>Now, suppose you use ANY strategy which is more conservative in doubling

>than my strategy. Then, I claim that you are at a disadvantage. My

>strategy guarantees that it is impossible for me to lose my market. Your

>more conservative strategy allows the possibility of you losing your

>market. Therefore, the only possible outcomes of the game are:

>

>1) The cube is turned, accepted, and the game is played for the match.

>2) You double, and I correctly pass (when you have lost your market).

>

>If 2) happens, you wish you had doubled earlier. Otherwise, there is no

>difference. Consequently, doubling immediately must be superior to a

>strategy which risks losing one's market.

>

>Kit


The last time I remember being in this discussion, I thought some of us

discussed a modified strategy that reduces to this against a perfect

opponent, and does slightly better against an opponent that makes

mistakes:

Look to see if you have any market losers among the 1296 possible 2-ply

outcomes. Also look to see if you have any rolls such that if you roll

them and your opponent doubles, you'll have a correct drop. (NOTE:

there will be no such rolls if your opponent was using either Kit's

strategy above or this strategy. These sequences will only exist

against an opponent that has already made a mistake!)

If the lost equity from not doubling on the market losing sequences

outweighs the lost equity from doubling on the sequences in which

otherwise you'd drop, then double.

Kit's strategy is optimal against a perfect opponent, and the proofs of

"double if you have any market losers" all assume a perfect opponent.

But not all opponents are perfect.

Basically, by using this strategy, we allow an imperfect opponent

slightly more opportunities to reach outcome "2)" above and lose equity

by losing their market. We get these outcomes at the expense of risking

losing our own market, but we're guaranteeing that our market losing

equity loss is less than the equity loss our opponent has by losing

their market (over a long enough sample size of course).

-michael j zehr
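The modified rule above might be sketched as follows. Each two-roll sequence is summarized as (probability, mwc if the cube is turned now, mwc if the cube is still centered afterwards); the evaluator producing those mwc numbers is hypothetical, and the figures in the examples are made up purely to exercise the rule:

```python
def modified_should_double(sequences):
    # Equity forfeited by waiting: on market-losing sequences the best you
    # get later is a cash, worth less than playing doubled for the match.
    gain_from_doubling = sum(p * (now - later)
                             for p, now, later in sequences if now > later)
    # Equity forfeited by doubling: on sequences where an imperfect opponent
    # could otherwise double you out of a correct drop, a dead cube hurts.
    cost_of_doubling = sum(p * (later - now)
                           for p, now, later in sequences if later > now)
    return gain_from_doubling > cost_of_doubling

# One market-losing sequence, no drop risk anywhere: double.
print(modified_should_double([(0.05, 0.80, 0.72), (0.95, 0.55, 0.55)]))

# Tiny market-losing gain swamped by sequences where you'd rather be able
# to drop: wait.
print(modified_should_double([(0.05, 0.80, 0.78),
                              (0.25, 0.15, 0.30),
                              (0.70, 0.55, 0.55)]))
```

Against a perfect opponent the cost term is always zero (no sequence leaves you with a correct drop), and the rule collapses back to Kit's "double with any market loser".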
