
Morton's Theorem provided by Morton & Discussed by Jalib & Caro


Ken Churilla

Oct 19, 1998
Several people have asked what Morton's Theorem is.
The following post was made by Abdul Jalib about it. This
post is followed by one from Mike Caro who also reviews
the theorem. I saved both posts because they were pretty
insightful by a couple of recognized experts. I planned on
putting them on my website, but hadn't gotten around to it.

On 20 Jul 1998 14:53:12 -0700, in rec.gambling.poker Abdul Jalib
<AbdulJ_...@PosEV.com> wrote:

Andy was a close friend and I'll miss him dearly.

Before I knew Andy well, I let him start renting a room from me. I gave
him what I felt was a significantly below-market price, as I knew he was
brilliant, and felt he would more than repay me in terms of poker
knowledge. I considered him still a bit green in terms of experience, but
his drive to discover more about poker strategy through logical and
mathematical analyses was pure gold in my eyes.

It wasn't long before I was repaid with what has become known as Morton's
Theorem. Andy, JP, a couple of others, and I had met to discuss poker
strategy. Andy was trying to make the case that you should let all the
fishy callers call along, rather than driving them out, since they are
making mistakes by the fundamental theorem of poker. Although I couldn't
do much more than wave my hands frantically, I replied that the
fundamental theorem of poker usually doesn't apply in multiway pots, that
you usually want your opponents to fold correctly from their perspective
when you hold something like top pair top kicker versus a multiway field.
(It would be safer to say "often" rather than "usually".)

The next day, a table in my apartment sprouted papers with
mathematical scribbles, with Andy working furiously in the midst of them.
After a while, he came to me and guided me through redoing his calculations
from scratch, as a double check. We worked on this and discussed it for
some hours, and then Andy generously typed it up for rec.gambling.poker.
His article follows at the end of this article...

Of course, while profit was my initial motivation for letting Andy move in
for cheap, we quickly became good friends. We discussed philosophy over
beers. I turned him onto the TV show "Xena", which became an obsession to
him. (Later, one of his ex-girlfriends from college appeared on Xena's
sister show Hercules as a half-woman, half-horse, much to his delight, and
in real life she is Kevin "Hercules" Sorbo's girlfriend.) He brought home
a shiny new motorcycle from poker winnings and proudly showed it to me; if
only he hadn't been such a good poker player or the cards had fallen
differently or almost anything had been different...

Andy had a PhD in Chemistry, but went from grad school to the cardrooms.
After he had been playing professionally for a couple of years, he applied
for a job at a chemical research firm, got offered the job, and then got
cold feet about the 9 to 5 thing and refused the offer! The company then
offered him a position as a consultant, which he accepted. The company
obviously felt it would be more than repaid by Andy's brilliance.

"Morton's Theorem" follows... (BTW, Sklansky says he understood
this when he wrote _Theory of Poker_, but had simplified things for the
average Joe.)

Subject: Going Too Far & Implicit Collusion
Date: 03 Apr 1997 00:00:00 GMT
From: Andy Morton <and...@ix.netcom.com>
Organization: Netcom
Newsgroups: rec.gambling.poker


Implicit Collusion and Going too Far


I usually enjoy reading Mike Caro's Card Player column. One from last June
made a big impression on me. In it he says:

_The real low-limit secret for today_. The most important thing
i can teach you about playing the lower limits is that you
usually should *not* raise from early positions, no matter what
you have... because all of those theories of thinning the field
and driving out opponents who might draw out on you don't hold
true in these smaller games [where] you're usually surrounded
by players who often call with nearly hopeless hands.... Which
is better, playing against a few strong and semistrong players
with possibly a small advantage for double stakes, or playing
against a whole herd of players, mostly weak, for single stakes?
Clearly, when you're not likely to win the pot outright by
chasing everyone out, you want to play against weak opponents,
and the more the merrier. So, why raise? There, I've just
described one of the costliest mistakes in low-limit poker. The
mistake is raising when many potential callers remain behind
you, thus chasing away your profit. Don't do that.

Until recently, this made a lot of sense to me. After all, the
Fundamental Theorem of Poker states (roughly) that when your opponents
make mistakes, you gain, and when they play correctly, you lose. In
holdem, if all of those calling stations in the low-limit games want to
chase me with their 5 out draws to make trips or 2 pair when I flop top
pair best kicker, and they don't have the pot odds to correctly do so,
that sounds like a good situation for me.

Yet, it seems like these players are drawing out so often that something
must be wrong. Hang around the mid-limits, holdem or stud, for any length
of time and you're sure to hear players complain that the lower limit
games can't be beat. You can't fight the huge number of callers, they
say. You can't protect your hand once the pot has grown so big, they say.

At first, I thought these players were wrong. They just don't understand
the increased variance of playing in such situations, I told myself. In
one sense, these players are right, of course. The large number of
calling stations combined with a raise or two early in a hand make the
pots in these games very large relative to the bet size. This has the
effect of reducing the magnitude of the errors made by each individual
caller at each individual decision. Heck, the pot might get so big from
all that calling that the callers _ought_ to chase. For lack of a better
term, I call this behavior on the fishes' part _schooling_. Still,
tight-aggressive players are on average wading into these pots with better
than average hands, and in holdem when they flop top pair best kicker, for
example, they should be taking the best of it against each of these
long-shot draws (like second pair random kicker). In holdem, the schooling
phenomenon increases the variance of the player who flops top pair holding
AK, but probably also _increases_ his expectation in the long run, I
thought, relative to a game where these players are correctly folding
their weak draws.
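
[As a quick illustration of the schooling effect: a 5-out draw with one
card to come needs pot odds of about (46 - 5)/5 = 8.2 to 1, so once the
pot offers more than roughly 8.2 big bets for a one-big-bet call, that
chase stops being a mistake at all.]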

Thinking this way, I was delighted to follow Caro's advice, and not try to
run players with weak draws out of the pots where I thought I held the
best hand on the flop or turn. This is contrary to a lot of advice from
other poker strategists, as Caro points out, and I found myself
(successfully, I think) trying to convince some of my poker playing
buddies of Caro's point of view in a discussion last week.

Well, some more thinking, rereading some old r.g.p. posts (thank you,
dejanews), a long discussion with Abdul Jalib, and a little algebra have
changed my mind: I think Caro's advice is dead wrong (at least in many
situations) and I think I can convince you of this, if you'll follow me
for a bit longer.

What I'm going to tell you is that if you bet the best hand with more
cards to come against two or more opponents, you will often make more
money if some of them fold, *even if they are folding correctly, and would
be making a mistake to call your bet.* Put another way, *you want your
opponents to fold correctly, because their mistaken chasing you will cost
you money in the long run.* I found this result very surprising to say
the least. I've never seen it described correctly in any book or article,
although at least a few posts to this newsgroup have concerned closely
related topics.

I'm no poker authority but I think this concept has got to lead to changes
in strategy in situations where players are chasing too much (and yes,
Virginia, this happens not only in the 3-6 games, but also in the higher
limits from time to time. Curiously, I have several friends who play very
well who often complain that they can't beat 20-40 games when they get
loose like this, or at least don't do as well in these games as they do in
tighter games. hmmm....). Let's look at a specific example.


Suppose in holdem you hold AdKc and the flop is Ks9h3h, giving you top
pair best kicker. When the betting on the flop is complete you have two
opponents remaining, one of whom you know has the nut flush draw (say
AhTh, giving him 9 outs) and one of whom you believe holds second pair
random kicker (say Qc9c, 4 outs), leaving you with all the remaining cards
in the deck as your outs. The turn card is an apparent blank (say the 6d)
and we'll say the pot size at that point is P, expressed in big bets.

When you bet the turn, player A, holding the flush draw, is sure to call
and is almost certainly getting the correct pot odds to call your bet.
Once player A calls, player B must decide whether to call or fold. To
figure out which action player B should choose, calculate his expectation
in each case. This depends on the number of cards among the remaining 46
that will give him the best hand, and the size of the pot when he is
deciding:

E(player B|folding) = 0

E(player B|calling) = 4/46 * (P+2) - 42/46 * (1)

Player B doesn't win or lose anything by folding. When calling, he wins
the pot 4/46 of the time, and loses one big bet the remainder of the time.
Setting these two expectations equal to each other and solving for P lets
us determine the potsize at which he is indifferent to calling or folding:


E(player B|folding) = E(player B|calling) => P'_B = 8.5 Big bets

When the pot is larger than this, player B should chase you; otherwise,
it's in B's best interest to fold. This calculation is familiar to many
rec.gamblers, of course.

To figure out which action on player B's part _you_ would prefer,
calculate your expectation the same way:

E(you|B folds) = 37/46 * (P+2)

E(you|B calls) = 33/46 * (P+3)

Your expectation depends in each case on the size of the pot (ie, the pot
odds B is getting when considering his call). Setting these two equal lets
us calculate the potsize P where you are indifferent to whether B calls or
folds:

E(you|B calls) = E(you|B folds) => P'_you = 6.25 Big bets.

When the pot is smaller than this, you profit when player B is chasing,
but when the pot is larger than this, your expectation is higher when B
folds instead of chasing.
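
[A minimal Python sketch that rederives the two indifference pot sizes
above from the stated outs; the function names and the use of the standard
fractions module are illustrative only, not part of the original post:]

from fractions import Fraction

def pot_where_B_is_indifferent(outs_B=4, unseen=46):
    # B calls one big bet into a pot of P plus your bet and A's call (P + 2):
    #   (outs_B/unseen) * (P + 2) - ((unseen - outs_B)/unseen) * 1 = 0
    return Fraction(unseen - outs_B, outs_B) - 2

def pot_where_you_are_indifferent(outs_A=9, outs_B=4, unseen=46):
    # If B folds, you win unless A hits:      (unseen - outs_A)/unseen * (P + 2)
    # If B calls, you win unless A or B hits: (unseen - outs_A - outs_B)/unseen * (P + 3)
    win_fold = unseen - outs_A
    win_call = unseen - outs_A - outs_B
    return Fraction(3 * win_call - 2 * win_fold, win_fold - win_call)

print(pot_where_B_is_indifferent())       # 17/2 -> P'_B   = 8.5 big bets
print(pot_where_you_are_indifferent())    # 25/4 -> P'_you = 6.25 big bets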

This is very surprising. There's a range of pot sizes (in this case
between 6.25 and 8.5 big bets when the turn card falls) where it's correct
for B to fold, and you make more money when he does so than when he
incorrectly chases. You can see this graphically below:

                                  |
          B SHOULD FOLD           |  B SHOULD CALL
                                  |
                                  v
                         |
   YOU WANT B TO CALL    |  YOU WANT B TO FOLD
                         |
                         v
+---+---+---+---+---+---+---+---+---+---> POT SIZE, P, in big bets
0   1   2   3   4   5   6   7   8   9
                         XXXXXXXXXX
                             ^
                     PARADOXICAL REGION

The range of pot sizes marked with the X's is where you want your opponent
to fold correctly, because you lose expectation when he calls incorrectly.


This is an apparent violation of the Fundamental Theorem of Poker, which
results from the fact that the pot is not heads up but multiway. (While
Sklansky states in Theory of Poker that the FToP does not apply in certain
multiway situations, it would probably be better to say that it in general
does not apply to multiway situations.) In essence what is happening is
that by calling when P is in this middle region, player B is paying too
high a price for his weak draw (he will win the pot too infrequently to
pay for all his calls trying to suck out), but you are no longer the sole
beneficiary of that high price -- player A is now taking B's money those
times that A makes his flush draw. Compared to the case where you are
heads up with player B, you still stand the risk of losing the whole pot,
but are no longer getting 100% of the compensation from B's loose calls.

These sorts of situations come up all the time in Hold'em, both on the
flop and on the turn. It's the existence of this middle region of pot
sizes, where you want at least some of your opponents to fold correctly,
that explains the standard poker strategy of thinning the field as much as
possible when you think you hold the best hand. Even players with
incorrect draws cost you money when they call your bets, because part of
their calls end up in the stacks of other players drawing against you.
This is why Caro's advice now seems wrong to me, in general. Those weak
calling stations are costing you money when they make the mistake of
calling too much. In practice, when you flop a best but vulnerable hand,
the pot size is rarely smaller than this middle region, where you actually
want your opponents to call. Normally, the pot size is such that you want
them to fold even if they would be wise to do so. In loose games, the pot
size will often be at the high side of the scale, where you would love for
them to fold, but they have odds to call and their fishy calls become
correct.

This brings up another interesting point. In our three-handed example,
both you and player B are losing money when B chases you incorrectly (both
your and his expectations would be higher if he folded). This implies
that player A is benefitting from his call, since poker is a zero-sum game
(neglecting rake, etc). In fact, player A is benefitting _more_ from B's
call than the magnitude of B's mistake in calling (since you are also
losing expectation due to B's call).

Because you are losing expectation from B's call, it follows that the
_aggregate_ of all other players (ie, A and B) must be gaining from B's
call. In other words, if A and B were to meet in the parking lot after
the game and split their profits, they would have been colluding against
you.
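
[A concrete check, using the numbers above with the pot at P = 7 big bets,
inside the paradoxical region: B's call costs him 42/46 - 4/46*(7+2) = 6/46
of a big bet, your expectation drops by 37/46*(7+2) - 33/46*(7+3) = 3/46,
and A's expectation rises by 9/46*(7+3) - 9/46*(7+2) = 9/46 -- exactly the
sum of B's loss and yours, as a zero-sum game requires, and larger than
B's mistake on its own.]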

I don't really know Roy Hashimoto or Lee Jones, but I suspect that this
situation might be what Roy had in mind when he first described what he
calls "implicit collusion" in games where there are many calling stations:
one fish makes a play which reduces his overall expectation and all fish
benefit by more than the magnitude of the first fish's mistake. That's
collusion, just as if a player reraises with the worst hand to trap a
third player for more bets when the first player's buddy has the nuts. Of
course no one realizes there's collusion going on in these situations, so
the collusion is implicit. (I'd sure like to hear from Roy or Lee on this
point, because I think there's a significant difference between what I've
called 'schooling' and what I've called 'implicit collusion', and that the
two concepts are often confused with each other, but I'd hate to further
confuse the issue by misappropriating someone else's label for this
phenomenon.)

There was an interesting thread on this group last year started by Mason
Malmuth called 'Going Too Far,' about the appropriate strategy changes in
a game where many players are calling too loosely not only before the flop
but also on the later streets. I suspect that the phenomenon described
here (where both the leader and the chasers are giving up expectation to
the player who is drawing to a very strong hand) lies behind the correct
response to his discussion in that thread. One strategy change he
mentions is that you'd like your starting hand to be suited in games like
these. In light of what I've presented here I can not only understand
this strategy change, but can see others as well. If this has made sense
to you and you can think of other strategy changes resulting from these
ideas, let's hear them.

Finally, having criticized something by one of the famous poker authors,
Abdul is encouraging me to go for broke <g>: It seems pretty clear that
Sklansky also missed this idea, at least when he was writing Winning
Poker, the precursor to Theory of Poker. First, he mentions that the
Fundamental Theorem applies to all two-way and nearly all multiway pots.
While I haven't proven it, it seems likely that nearly all multiway pots
will contain some sort of region of implicit collusion where the leader
would prefer that players fold correctly, ie where the Fundamental Theorem
breaks down.

Later, in the chapter "Win the Big Pots Right Away," Sklansky makes his
ignorance of this concept explicit. Discussing a multiway seven stud hand
in which your hand is almost certainly best on fourth street, he writes:

You must ask yourself whether an opponent would be correct to
take [the odds you are giving him] knowing what you had. If so,
you would rather have that opponent fold. If not -- that is if
the odds against your opponent's making a winning hand are
greater than the pot odds he's getting -- then you would rather
have him call. In this case, instead of winning the pot right
away, you're willing to take the tiny risk that your opponent
will outdraw you and try to win at least one more bet. ...you
would not want to put in a raise to drive people out. (p. 62)

Slowplaying is certainly correct in some cases, but your 'druthers' in a
multiway pot can never be decided so simply as by asking whether each of
your individual opponents has the right pot odds to chase you.

[End quoted article from Andy Morton. Goodbye Andy.]

====== Mike Caro's post ==========================


On Tue, 21 Jul 1998 00:43:58 GMT, in rec.gambling.poker ca...@caro.com
(Mike Caro) wrote:

Abdul --

I don't know whether I ever had the pleasure of meeting Andy Morton
personally, although I'm told he sometimes played at Hollywood Park
while I was there.

He did, however, e-mail me (a copy appears at the end of this post.)
He said that he agreed with me about 90 percent of the time. Your post
must be in reference to the other 10 percent. :-)

I, too, am saddened by his death, and the passages you quoted show
much brilliance that -- if expanded over the years -- would have been
welcomed by the ever-evolving family of poker analysts. It is simply one of
the most interesting and unexpectedly thoughtful pieces of poker thinking
I've read recently. Thank you for sharing it.

I will have to study the words more closely, but he says that he did
agree with me at first, then, later, he didn't. One way or the other, he
was bound to be right, and in this case, I believe it was the former.

There is nothing wrong with his argument. There are, of course, ways
that a player can do the unprofitable thing and the benefit of this is
directed toward a player other than yourself. I point this out in
discussing draw poker (which is easy to understand). If you hold a pair of
kings and four opponents are drawing to flushes of different suits, then
the PRIMARY beneficiary (in rare cases the ONLY beneficiary) of the extra
players is the person drawing to the BEST flush. The rest SHOULD BE
getting good pot odds, but they aren't, if you KNOW what the other hands
are. Still, their problem isn't with YOU and your pair of kings; it's with
simultaneously made flushes. If the weakest flush connects, YOU don't care
how many of the others make a flush (unless you make a "miracle" full
house or better, in which case you hope they ALL connect). But the weakest
flush, having connected, DOES care.

The same is true of two small pair. If a lot of players are drawing to
beat the hand, it's a favorite against EACH opponent individually, but is
a loser against ALL of them collectively.

In part, I've made the same point that Andy made in what will seem to be a
dissimilar way. I may have even made it on this forum years ago. Suppose
there were no draw, just five-card poker, where you play what you get, and
almost everyone in the world is dealt in from an infinite deck. There are
over five billion contestants. You are dealt K-Q-J-10-9, all spades. You
are more than a 600,000 to 1 favorite not to lose (win or tie) against any
opponent you could randomly choose. And still you'd need to throw that
nine away and try for a royal flush, if you could draw. But unfortunately,
you're stuck with the hand -- and the loss -- because there IS no draw. As
I'm sure you understand, the pot goes to just one player, and that means
the odds of your hand holding up become disproportionately greater with
the addition of each opponent. So, if I told you in advance that you would
be dealt a king-high straight flush and let you choose the number of
opponents, the ideal number can be calculated for maximum profit.
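
[Caro's "more than 600,000 to 1" figure checks out; a two-line Python
verification, counting only the four royal flushes as hands that beat
K-Q-J-10-9 suited:]

from math import comb

total = comb(52, 5)             # 2,598,960 possible five-card hands
beats = 4                       # only the four royal flushes beat K-Q-J-10-9 suited
print((total - beats) / beats)  # about 649,739 to 1 against losing to one random hand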

There is possibly a simple way to resolve the larger argument about
whether you want many weak callers, but not to everyone's
satisfaction. And I'll leave it to others to actually do it. Start
working down the list of best hold 'em hands: A-A, K-K, Q-Q, J-J, A-K
suited, and so forth. Go as far as you want. You'll need to write a simple
match-up program to do this. Create a computer simulation to run each hand
against nine random opponents. And, then, for each hand, examine the nine
opponents and toss out the weakest five. Then, for each random match-up,
also run the selected target hand against just the four remaining
opponents.

Then measure in which case the target hand did better, based on the
actual share of the pots won versus the "fair share." You'll have to
select your own method, but I'm open to anything reasonable.
Naturally, there is bias against the four-opponent trial, because
there is presumably some dead money to be considered, and with four
opponents, it is "divided" fewer ways. Also, this doesn't adequately
take into consideration the fact that some players will pass along the
way, and that they may pass either correctly or incorrectly at THOSE
stages, too. We're just trying to keep it simple to see if there's an
obvious truth. We can make reasonable adjustments now, and later (if
necessary). And we can set up elaborate simulations that actually play
through the hands (although that will lead to elaborate and understandable
arguments about how the chosen strategy skewed the results).
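
[A rough Python sketch of the match-up simulation Caro outlines above. It
assumes the third-party treys hand evaluator is available; the preflop
ranking used to toss out the five weakest opponents is a crude stand-in of
my own, and the sketch compares only showdown pot shares, with no betting,
folding, or dead money modeled:]

import random
from treys import Card, Deck, Evaluator

def preflop_strength(hole):
    # Crude preflop ranking: pairs first, then high cards, suitedness as a tiebreak.
    r_hi, r_lo = sorted((Card.get_rank_int(c) for c in hole), reverse=True)
    suited = Card.get_suit_int(hole[0]) == Card.get_suit_int(hole[1])
    return (r_hi == r_lo, r_hi, r_lo, suited)

def pot_share(target, opponents, board, evaluator):
    # Target's share of a unit pot at showdown; ties split the pot evenly.
    scores = [evaluator.evaluate(board, hand) for hand in [target] + opponents]
    best = min(scores)                     # in treys, a lower score is a stronger hand
    winners = [s == best for s in scores]
    return 1.0 / sum(winners) if winners[0] else 0.0

def simulate(target_cards=("As", "Ks"), trials=20000, seed=1):
    random.seed(seed)
    evaluator = Evaluator()
    share_vs_9 = share_vs_4 = 0.0
    for _ in range(trials):
        target = [Card.new(c) for c in target_cards]
        deck = Deck()
        deck.cards = [c for c in deck.cards if c not in target]
        opponents = [deck.draw(2) for _ in range(9)]
        board = deck.draw(5)
        # Full field: the target hand against all nine random opponents.
        share_vs_9 += pot_share(target, opponents, board, evaluator)
        # Reduced field: toss out the five weakest opponents, keep the four strongest.
        strongest_4 = sorted(opponents, key=preflop_strength, reverse=True)[:4]
        share_vs_4 += pot_share(target, strongest_4, board, evaluator)
    print("vs 9 random opponents: pot share %.3f (fair share 0.100)" % (share_vs_9 / trials))
    print("vs 4 strongest of 9:   pot share %.3f (fair share 0.200)" % (share_vs_4 / trials))

simulate()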

Here's what I think one will eventually discover by pursuing this
analysis to its conclusion (assuming that's possible):

1. There are specific situations in which you'd rather have fewer than
many opponents.

2. There are some situations in which there is a "perfect" number of
opponents. More hurts profit; fewer hurts profit. (I have written and
spoken at length about this, so I don't wish to be pinned TOO firmly to
the quote Andy cites from my column. I use different pieces of poker
theory to illustrate different points at different times.)

3. In MOST situations, you will increase your profit if your field of
opponents is larger, holding weaker average hands, than you will if your
field of opponents is smaller, holding stronger average hands.

4. The money has to go SOMEWHERE. Therefore (excluding house rakes for
convenience), all money lost by poor play ends up in opposing stacks.

5. Sometimes, certain hands, especially the best speculative hands,
get a greater benefit from the money lost on poor play than do other
hands.

6. Andy's concept of "Implicit Collusion" is counterbalanced to some
degree by "Implicit Shared Profit." What I mean is that you don't
usually enjoy a perfect knowledge, when you enter a pot, about WHICH
hand is the most likely to be punished disproportionately by the weak
entrants. But you do know that the weak entrants themselves will
eventually be punished.

I could go on, but I won't. Abdul, I am very grateful that you
published Andy's post. And I am honored that he thought enough of my
research to probe deeper. It is, overall, solid reasoning, and I'm not
sure -- if he and I had the chance to sit down and talk -- that we would
disagree much.

Also, I would like to acknowledge that I have read many of your posts and
often find them brilliant, as well. I appreciate your contributions to
this forum.

Straight Flushes,
Mike Caro

Text of e-mail from Andy Morton to Mike Caro, November 17, 1997 (I
don't remember the exact reason for his e-mail. I think it was in
regard to something I wrote on r.g.p. about not being satiric anymore,
because some people were misunderstanding.)

mike

i suspect you'll get a lot of mailed responses to this post

For my part, it was totally clear what you intended.

I enjoy ~90% of your wordsmithery, both on the net and in your column.

This seems like a very good percentage, to me.

I probably like slightly less than this percentage of the actual
content of your writings. Again, this is a respectable percentage,
imo, and some of the things you write seem quite important to me.
Recent examples include your two "mission" columns this year, aimed at
getting players to study their opponents more effectively.


I can understand if you choose to tone down the humor in your posts,
but don't think that just because 7 people misunderstood you that they
form any sort of representative sample.

andy morton

On 20 Jul 1998 14:53:12 -0700, Abdul Jalib <AbdulJ_...@PosEV.com>
wrote:

>Andy was a close friend and I'll miss him dearly.

[Body of the message and Andy's excellent article snipped in the
interest of brevity. Please see Abdul's previous post.]

======= End of Caro's post =================
