
A Bet you can't refuse (hypothetical)


MDuchin

Jul 10, 1998
A finance teacher once gave us this problem.
You're walking down the street when someone comes up and makes this
proposition. You can bet only the money you have on you (no checks or credit
cards) and from the flip of a fair coin, you get twice your money back if it
comes up heads and lose what you bet if it's tails. He tells you he will
continue doing this for as long as you want.
You have $100. What is your betting strategy to maximize your winnings?
Certainly a positive EV but if you bet it all, you will likely end up with
nothing after a few rounds.
I've been thinking about this lately. I believe the problem somewhat simulates
playing pot limit/no-limit poker with three players left and you believe you
have a 50% chance of winning. How much do you bet?
Interested in people's comments.
Mike

Barbara Yoon

Jul 10, 1998
MDuchin:
> ...finance teacher...gave us this problem. ...flip of a fair coin, you get
> twice your money back if it comes up heads and lose what you bet if
> it's tails. ...continue doing this for as long as you want. You have $100.
> What is your betting strategy to maximize your winnings? Certainly a
> positive EV but if you bet it all, you will likely end up with nothing after
> a few rounds. ...somewhat simulates playing pot limit/no-limit poker...

Not quite clear what you mean here by "you get twice your money back,"
but if it's 2-to-1 payoff odds, then I think the answer comes from the old
"Kelly criterion" for bankroll building -- bet a quarter of your money every time...


Petemoss

Jul 10, 1998
My understanding is that if you bet a fraction of your BR equal to
your % advantage, you will maximize the *rate of growth* of your
wealth under Kelly's criterion. But you will also have a small chance
of tapping out. So to ensure that you will maximize your (total
dollar) winnings, you should bet as little as possible: one cent per
flip and, hopefully, you can earn forever.



Regards,

Petemoss
*kindly remove the numbers for my real address*

Tom Weideman

Jul 10, 1998
MDuchin wrote:
>
> A finance teacher once gave us this problem.
> You're walking down the street when someone comes up and makes this
> proposition. You can bet only the money you have on you (no checks or credit
> cards) and from the flip of a fair coin, you get twice your money back if it
> comes up heads and lose what you bet if it's tails. He tells you he will
> continue doing this for as long as you want.
> You have $100. What is your betting strategy to maximize your winnings?

Since you can play as long as you want, your strategy will never include
quitting so long as there is more money to be won, so I'll assume what
you are really looking for is the betting strategy which affords you the
highest probability of taking ALL of this guy's money. This strategy
involves betting as little as possible (one penny, I guess) on each
round.

You may not be happy with your hourly rate of return using this
strategy, though. If this is the case, then betting according to the
Kelly criterion (or something in between this and the minimum risk
strategy) may be the answer you are looking for.

> I've been thinking about this lately. I believe the problem somewhat
> simulates playing pot limit/no-limit poker with three players left and you
> believe you have a 50% chance of winning. How much do you bet?

If by this you mean you should try to decrease your variance when you
are certain you have an edge but you have a limited bankroll, I agree...
with the same condition above that you don't reduce your action so much
that it becomes no longer worth your time. At some point you have to
ask yourself the question: "Would I rather make x dollars per hour with
a probability p of going bust, or would I rather make y dollars per hour
(y>x) with a probability q (q>p) of going bust?"  This old quandary is
related to the Kelly criterion mentioned above (which quantifies one
possible "reasonable" balance of these numbers), which I think may be
addressed in the r.g.* FAQ (or at the very least can be found in many
probability texts), so I'll take it no further.


Tom Weideman

Lee Jones

Jul 10, 1998
In article <35A5BC...@dcn.davis.ca.us>,
Tom Weideman <zugz...@dcn.davis.ca.us> wrote:

[You're offered prop bet paying 2:1 on a coin flip, play as long as you
like, you have $100. What's your betting strategy?]

[Physics-man, he say]


>You may not be happy with your hourly rate of return using this
>strategy, though. If this is the case, then betting according to the
>Kelly criterion (or something in between this and the minimum risk
>strategy) may be the answer you are looking for.

I think Tom has it, but just to complete the lecture (he's the prof, I'm
just a grad student): In two flips, you'll lose one bet and win two, for
a net win of one bet over two trials. So you win .5 of a bet each trial,
thus you have a 50% edge (the blackjack players in the crowd are *drooling*
at this point).

Kelly criterion (the Gambling 101 version) says you bet the percentage
of your bankroll equal to your advantage on that event, which maximizes
log(growth) of your bankroll.

So, you bet $50. If you win, you have $200 (remember, you get paid 2:1),
and on your next bet, you bet $100. If your first bet loses, you now
bet $25 (a reverse Martingale, actually), continually halving your bet
until you win. Each time, you bet 50% of the money in your hand.

You are likely to win money at a truly astonishing rate.

Tom, Abdul, how'd I do?

Regards, Lee

--
Lee Jones | "I won't tell anybody (no) I want you to dance naked."
le...@sgi.com | -John Mellencamp
650-933-3356 |
http://reality.sgi.com/leej_engr

Bryant Butler

Jul 10, 1998
Don't forget to take Damon Runyon's advice and examine the coin first!

Barry Paul

Jul 10, 1998
On 10 Jul 1998 07:38:25 GMT, le...@diver.engr.sgi.com (Lee Jones)
wrote:

>In article <35A5BC...@dcn.davis.ca.us>,
>Tom Weideman <zugz...@dcn.davis.ca.us> wrote:
>
>[You're offered prop bet paying 2:1 on a coin flip, play as long as you
> like, you have $100. What's your betting strategy?]
>>>
>
>[Physics-man, he say]
>>You may not be happy with your hourly rate of return using this
>>strategy, though. If this is the case, then betting according to the
>>Kelly criterion (or something in between this and the minimum risk
>>strategy) may be the answer you are looking for.
>
>I think Tom has it, but just to complete the lecture (he's the prof, I'm
>just a grad student): In two flips, you'll lose one bet and win two, for
>a net win of one bet over two trials. So you win .5 of a bet each trial,
>thus you have a 50% edge (the blackjack players in the crowd are *drooling*
>at this point).
>
>Kelly criterion (the Gambling 101 version) says you bet the percentage
>of your bankroll equal to your advantage on that event, which maximizes
>log(growth) of your bankroll.
>
>So, you bet $50. If you win, you have $200 (remember, you get paid 2:1),
>and on your next bet, you bet $100. If your first bet loses, you now
>bet $25 (a reverse Martingale, actually), continually halving your bet
>until you win. Each time, you bet 50% of the money in your hand.
>
>You are likely to win money at a truly astonishing rate.
>
>Tom, Abdul, how'd I do?

This thread (or at least the part I've seen) needs a little
clarification.

If your betting units are infinitely divisible then the suggested
betting pattern is fine. You cannot go bust and will, with probability
1, eventually enjoy a streak that will make your winnings arbitrarily
large. In other words, pick any number as your cash in point and the
probability is 1 that you will reach your goal.

If however there is a lower limit on your bet, say one cent, then you
should imitate your favorite casino and bet one cent at a time.

--------------------------------------------------
Barry Paul
mailto:bp...@hoflink.com
http://hoflink.com/~bpaul
Great Neck, NY
--------------------------------------------------

Stephen H. Landrum

Jul 10, 1998
Lee Jones wrote:
> Kelly criterion (the Gambling 101 version) says you bet the percentage
> of your bankroll equal to your advantage on that event, which maximizes
> log(growth) of your bbankroll.

No it does not. Betting the fraction equal to your advantage is
an approximation that is close when the advantage is small and the
win rate is near 50/50.

> So, you bet $50. If you win, you have $200 (remember, you get paid 2:1),
> and on your next bet, you bet $100. If your first bet loses, you now
> bet $25 (a reverse Martingale, actually), continually halving your bet
> until you win. Each time, you bet 50% of the money in your hand.
>
> You are likely to win money at a truly astonishing rate.
>
> Tom, Abdul, how'd I do?

You should bet 25% of your bankroll.

When betting 50%, consider what happens when you win one and lose
one (since the chance of winning is exactly .5, this is exactly
the case you want to maximize the outcome for, since in the long
run you'll win 50% and lose 50% of your bets). Start $100, bet
$50, win, you now have $200. Bet $100, lose you now have $100.

Betting 50% will give you a rocky ride with a long run expectation
of ending up at your original amount (assuming that you can
subdivide your money infinitely). Betting more than 50% will give
you an ever diminishing bankroll. Betting less than 50% will give
you an ever increasing bankroll.

To maximize growth, you want to maximize the function
x = ((1-b)^.5) * ((1+2b)^.5) where b is the fraction of bankroll
bet, which is the multiplying effect that a single bet is going to
have on your bankroll. For positive x, (you can't bet more than
your bankroll, so x can't go negative), maximizing x*x is the same
as maximizing x, so you want to maximize (1-b)(1+2b) or
-2b^2 + b + 1. Local maxima and minima are found at the zeros
of the derivative, so solve -4b + 1 = 0, and you find that the maximum
occurs at b = .25 (there are a lot of different routes that lead
to the same result).
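
A quick numerical check (a Python sketch, not part of the original
post) -- scanning candidate fractions peaks right at b = 0.25:

  # Scan betting fractions for the 2-to-1 coin flip and report the one
  # that maximizes the per-flip growth factor sqrt((1-b)*(1+2b)).
  def growth_factor(b):
      # Geometric mean of the win and loss multipliers (each prob 1/2).
      return ((1 - b) * (1 + 2 * b)) ** 0.5

  best = max((b / 1000 for b in range(1000)), key=growth_factor)
  print(best, growth_factor(best))  # -> 0.25 1.0606...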
--
"Stephen H. Landrum" <slan...@pacbell.net>

mbj...@hs.co.slc.ut.us

Jul 10, 1998
In article <6o44f4$rhi$1...@winter.news.erols.com>,
"Barbara Yoon" <by...@erols.com> wrote:
> MDuchin:
> > ...finance teacher...gave us this problem. ...flip of a fair coin, you get
> > twice your money back if it comes up heads and lose what you bet if
> > it's tails. ...continue doing this for as long as you want. You have $100.
> > What is your betting strategy to maximize your winnings? Certainly a
> > positive EV but if you bet it all, you will likely end up with nothing after
> > a few rounds. ...somewhat simulates playing pot limit/no-limit poker...
>
> Not quite clear what you mean here by "you get twice your money back,"
> but if it's 2-to-1 payoff odds, then I think the answer comes from the old
> "Kelly criterion" for bankroll building -- bet a quarter of your money
> every time...

Could somebody please post the "Kelly criterion"? I have heard it mentioned
many times, and all I know is that it pertains somewhat to maximizing bankroll
and minimizing bankruptcy (or at least computing odds of bankruptcy).

To solve the problem, let's assume that we want to make our bankroll grow as
quickly as possible (so betting a penny at a time is not the best option),
and to do this let's say we decide to bet a percentage of our bankroll every
time, call this percentage "p", 0 < p < 1. Because a win multiplies our
bankroll by "1+2p" and a loss multiplies our bankroll by "1-p", we really
need to throw out addition-oriented EV and look at multiplication-oriented
EV. (Ignore the 50% advantage and look at expected "factor of
multiplication"). Clearly, in the long run, after "2N" flips of the coin, we
will have won and lost about "N" times. Therefore, our final bankroll will
be "(1-p)^N * (1+2p)^N" times as large as our original bankroll. To maximize
this, we will need to maximize "(1-p)*(1+2p)", which is "1+p-2p^2". We set
the derivative to zero to get "1-4p = 0", or "p=1/4". Therefore, the best
percentage to bet is 1/4. Every loss will result in your bankroll being
multiplied by 3/4, and every win will yield a factor of 3/2. So every two
flips (in the long run) will multiply your bankroll by 9/8, so after "2N"
flips, you should be close to (9/8)^N times your original bankroll.
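
A short simulation (a Python sketch, not part of the original post) of
the quarter-bankroll strategy lands close to that (9/8)^N growth:

  import random

  random.seed(1)
  bankroll = 100.0
  flips = 10000
  for _ in range(flips):
      bet = bankroll / 4            # wager a quarter of the bankroll
      if random.random() < 0.5:
          bankroll += 2 * bet       # heads pays 2-to-1
      else:
          bankroll -= bet           # tails loses the bet
  # Average per-flip growth factor; should be near sqrt(9/8) ~ 1.0607.
  print((bankroll / 100.0) ** (1.0 / flips))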

I've actually been thinking about this type of problem in the last couple
weeks because I read a story (in "Intelligent Gambler"???) about a sic-bo
game where the casino erroneously set the payout odds for a 4 or a 17 at 80:1
when it should have been 60:1 (actual odds are 71:1). (Note: Sic-bo is a
betting game where the dealer throws three dice and the players can bet on
all sorts of outcomes, all of which have bad odds for the player except in
this case. For a 4 or 17 to be rolled, the dice must read 1-1-2 or 5-6-6,
which only happens in 1/72 rolls, so a casino giving 80:1 payouts is making a
mistake. By the way, I didn't even know the game existed until this story
and I've never seen it in a casino. Either the game is not popular, I am
unobservant, or both.) Anyway, some gamblers made huge amounts of money and
the casino lost a couple hundred grand (?) after a full day where EVERY
gambler at the table was making bets exclusively on 4 and 17; the casino
finally figured out its mistake the next day (after checking the dice, the
table, etc. for gremlins). I thought it might be easy to go bust waiting for a 1/36
(if you bet on both) chance to happen (dry spell could take all your money,
even if it was a +EV bet).

Wanting to simplify things a bit, I "invented" a roulette game where a stupid
casino made a roulette wheel with 5 slots: 2 red and 3 black, and accepted
even odds on both. (BET BLACK, BABY!). Let's say you bet a percentage "p" of
your bankroll on every spin. This means that you end up with a factor 1-p or
1+p if you lose or win, respectively. In the long run you will win 3/5 of
the time and lose 2/5 of the time. Therefore, after "5N" spins, your
bankroll will have been multiplied by a factor of "(1+p)^3N * (1-p)^2N". To
maximize this, you need to maximize "(1+p)^3 * (1-p)^2", which is pretty
ugly, and this type of problem gets horribly ugly if your chance of winning
is something like 137/251. So how do you do this type of problem when the
probabilities aren't nice like 1/2 or 3/5? Is this where the "Kelly
criterion" comes in?

By the way, in the roulette problem above, trial and error gave me a "p"
value of 1/5 of my bankroll, but I don't know how to get this value easily. A
strange thing is this: if you bet 1/2 of your bankroll every time in the
roulette situation, your bankroll, in 5 spins, gets multiplied by a factor of
(3/2) * (3/2) * (3/2) * (1/2) * (1/2) = 27/32 !!! This means that you will go
bust in the long run in this great game if you keep betting half your money!

I guess in situations like this, finding the best value of "p" means
maximizing some ugly "(glop1)^q1 * (glop2)^q2 * (glop3)^q3 * - - - *
(glopN)^qN" equation, which might be made easier using logarithms, but that
might get even uglier. Here, there are N possible outcomes, the "glop"s are
the multiplicative factors of each outcome and are functions of "p" (for
example, if outcome #3 gets only half of your bet back, the factor would be
"1 - p/2"), and the "q"s are the probability of each outcome.

Finally (this is a long post), if you play a fair even-money betting game,
like coin flipping w/o the 2:1 payout, it will *always* bankrupt you in the
long run if you bet a constant percentage of your bankroll, because (1+p) *
(1-p) is less than 1 for any positive percentage of your bankroll. This means
that in games where the odds are even (or especially against you) the
practice of betting bigger as you win just maximizes your chances of going
home a loser. You always hear gambling stories of someone who rode $100 and
a "rush" (let's not start that discussion again) to get up to $300,000 at a
craps table, only to have it washed away in a few quick huge bets. Because
some gamblers cannot control themselves when winning big, they always give
the money back to the house, because it seems the only thing that will make
them stop making larger and larger bets would be if they ended up bankrupting
the casino, and you'd need to have at LEAST six sigmas on your side of luck
to pull that off. Of course, if you set a limit on your bankroll, like
$2000, and promise to go home when that limit is reached, I don't think
you'll have a problem. You will, however, have a problem finding an even
odds game in a casino unless you like blackjack, poker, or video poker.

Matthew Bjorge

(I don't ramble on when I talk, honest.)


Barbara Yoon

Jul 10, 1998
Lee Jones:

>> Kelly criterion (the Gambling 101 version) says you bet the percentage
>> of your bankroll equal to your advantage on that event, which maximizes
>> log(growth) of your bankroll.

Stephen H. Landrum:
> No it does not. Betting the fraction equal to your advantage is an
> approximation that is close when the advantage is small and the
> win rate is near 50/50.


Hmmm..."an approximation that is close when the advantage is small"...
You seem to imply here that it is not as "close" when the advantage is not
as "small" -- please explain...


Barbara Yoon

Jul 10, 1998
Mike Duchin:

>>> ...flip of a fair coin, you get twice your money back if it comes up
>>> heads and lose what you bet if it's tails. ...continue doing this for
>>> as long as you want. What is your betting strategy to maximize
>>> your winnings? Certainly a positive EV but if you bet it all, you
>>> will likely end up with nothing after a few rounds.

B.Y.:
>> Not quite clear what you mean here by "you get twice your money
>> back," but if it's 2-to-1 payoff odds, then...from the old "Kelly criterion"
>> for bankroll building -- bet a quarter of your money every time...

Matthew Bjorge:
> Could somebody please post the "Kelly criterion"? I have heard it
> mentioned many times, and all I know is that it pertains somewhat to
> maximizing bankroll and minimizing bankruptcy (or at least computing
> odds of bankruptcy). To solve the problem, let's assume that we want
> to make our bankroll grow as quickly as possible (so betting a penny
> at a time is not the best option), and to do this let's say we decide to
> bet a percentage of our bankroll every time...

You ask about the Kelly criterion, and then you proceed to give (though I
haven't gone through your whole post with a fine-tooth comb) a pretty good
explanation of it yourself...

> ...best percentage to bet [Mike Duchin's proposition] is 1/4.
> ...roulette wheel with 5 slots: 2 red and 3 black...even odds...need to
> maximize "(1+p)^3 * (1-p)^2"... Is this where the "Kelly criterion" comes in?
> ...trial and error gave me a "p" value of 1/5 of my bankroll, but I don't know
> how to get this value easily. ...might be made easier using logarithms...

Yes...and with "even odds," p = expected gain (see Stephen H. Landrum post)...

> ...you will go bust in the long run in this [positive expectation] game if you keep
> betting half your money! ...[even with no house advantage] the practice of

Stephen H. Landrum

Jul 10, 1998

Since advantage can exceed 100%, it's obvious that it doesn't
apply for those cases.

In the particular case at hand, 50% advantage is a large advantage,
and betting 50% of your bankroll is nowhere near the kelly number,
which is 25% of your bankroll.

In the case of a coin flip wager, the kelly number is derived from
maximizing (1-b)(1+xb) where x is the multiplier on payoff for win.
The advantage of this bet is (x-1)/2. The appropriate fraction of
the bankroll to bet is (x-1)/2x. When the advantage is small, x
is close to one, and (x-1)/2 is close to (x-1)/2x.

Barbara Yoon

Jul 10, 1998
Lee Jones:
>>>> Kelly criterion says you bet the percentage of your bankroll equal
>>>> to your advantage on that event, which maximizes log(growth) of
>>>> your bankroll.

Stephen H. Landrum:
>>> No it does not. Betting the fraction equal to your advantage is an
>>> approximation that is close when the advantage is small and the
>>> win rate is near 50/50.

B.Y.:
>> You seem to imply here that it is not as "close" when the advantage is
>> not as "small" -- please explain...

> Since advantage can exceed 100%, it's obvious that it doesn't apply
> for those cases. In the particular case at hand, 50% advantage is a
> large advantage, and betting 50% of your bankroll is nowhere near the
> kelly number, which is 25% of your bankroll.


But what about betting even-money on something known to produce
80% winners as compared to alternatively only 51% winners -- are you
suggesting that the Kelly number is a "closer approximation" (to what?)
in the latter "smaller advantage" case... Or perhaps I am misinterpreting?!


Stephen H. Landrum

Jul 10, 1998
Mark Rafn wrote:
> I think Stephen is saying that the rule Lee used, "bet a percentage of
> your bankroll equal to your advantage" is a poor approximation of the
> Kelly number in these cases.
>
> He is NOT saying that the Kelly criterion is a poor approximation of
> optimizing long-run returns.

Exactly - sorry if my responses didn't make it clear. I knew what
I meant, so I was completely puzzled by BY's confusion.

Stephen H. Landrum

Jul 10, 1998
Barbara Yoon wrote:
> But what about betting even-money on something known to produce
> 80% winners as compared to alternatively only 51% winners --

Here's a more generalized derivation of the Kelly number for
betting on a wager with two possible outcomes (a lot of betting
fits this form nicely). You win with probability p (losing with
probability 1-p), and when you win you win x times your wager.

The growth for a single bet b (where b is defined as the
portion of your bankroll that you are betting) is:
(1-b)^(1-p) * (1+xb)^p

Advantage for the bet is xp + p - 1.

Finding maxima for a function f(x)>0 is the same as finding
maxima for log(f(x)), so we want to find the maximum for
(1-p)log(1-b) + (p)log(1+xb), 0<b<1. Taking the derivative,
we get -(1-p)/(1-b) + xp/(1+xb). Solving for 0, we get
b = (xp+p-1)/x. When p is .5, this becomes (.5x-.5)/x or
(x-1)/2x matching the derivation in my earlier post.

When x = 1, advantage is equal to the Kelly number. So, what I
said earlier was overly restrictive. Betting your advantage is
close to the Kelly number when the payout is close to even money.
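
That closed form drops straight into code; a minimal sketch (mine),
assuming the thread's two-outcome wager:

  def kelly_fraction(p, x):
      # p: win probability; x: payoff multiple on a win ("x-to-1" odds).
      # Returns the fraction of bankroll to bet; 0 if the wager is -EV.
      return max(0.0, (x * p + p - 1) / x)

  print(kelly_fraction(0.5, 2))   # the 2-to-1 coin flip: 0.25
  print(kelly_fraction(0.8, 1))   # 80% winners at even money: 0.6
  print(kelly_fraction(0.51, 1))  # 51% winners at even money: ~0.02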

Barbara Yoon

Jul 11, 1998
Lee Jones:
>> Kelly criterion says you bet the percentage of your bankroll equal to your
>> advantage on that event, which maximizes log(growth) of your bankroll.

Stephen H. Landrum:
>> No it does not. Betting the fraction equal to your advantage is an
>> approximation that is close when the advantage is small and the
>> win rate is near 50/50.

B.Y.:
>> You seem to imply here that it is not as "close" when the advantage is
>> not as "small" -- please explain...

Stephen Landrum:
>> Since advantage can exceed 100%, it's obvious that it doesn't apply
>> for those cases. In the particular case at hand, 50% advantage is a
>> large advantage, and betting 50% of your bankroll is nowhere near the
>> kelly number, which is 25% of your bankroll.

B.Y.:
>> But what about betting even-money on something known to produce
>> 80% winners as compared to alternatively only 51% winners -- are you
>> suggesting that the Kelly number is a "closer approximation" (to what?)
>> in the latter "smaller advantage" case... Or perhaps I am misinterpreting?!

Mark Rafn:
> I think Stephen is saying that the rule Lee used, "bet a percentage of your
> bankroll equal to your advantage" is a poor approximation of the Kelly
> number in these cases. He is NOT saying that the Kelly criterion is a poor
> approximation of optimizing long-run returns.

Stephen H. Landrum:
> Exactly - sorry if my responses didn't make it clear. I knew what I meant,
> so I was completely puzzled by BY's confusion.


The bets would have to be at EVEN-MONEY, and then Lee Jones' rule
would be EXACTLY the "Kelly criterion" -- right?! What confuses me here
is Stephen's characterization as "an approximation that is close when the
advantage is small and the win rate is near 50/50"... The validity of Lee's
rule hinges on the even-money pay-off odds, but I just don't get what it is
that Stephen is saying about the relevance of "advantage" and "win rate"...


T.P.

Jul 11, 1998
But how does all this relate to no-limit hold'em (which the original poster
thought it was similar to)?
I don't think it has much to do with no-limit, since your decisions in
betting for the coin flips will not affect any other decision by your
opponent. We are assuming that you will get the same odds every time you
flip. In poker, either no-limit or limit, your opponents will adjust to
your style of play, at least to some degree. Thus they have a decision
process of their own, which your coin-flip game does not.
So, I think the question is interesting, and the answers have been fun to
read, but as far as relevance to poker, or no-limit specifically, I think
the relevance is very low, if not zero.


jac...@xmission.com

Jul 11, 1998
bp...@hoflink.com (Barry Paul) writes:

> [Lee Jones had suggested betting 50% of the money in your hand on
> each flip -- snip]

> This thread (or at least the part I've seen) needs a little
> clarification.
>
> If your betting units are infinitely divisible then the suggested
> betting pattern is fine. You cannot go bust and will, with probability
> 1, eventually enjoy a streak that will make your winnings arbitrarily
> large. In other words, pick any number as your cash in point and the
> probability is 1 that you will reach your goal.
>
> If however there is a lower limit on your bet, say one cent, then you

> should imitate your favorite casino and bet one cent at a time.

Assuming the one cent limit, are you claiming that betting one cent
will be optimal even if you have $10,000 in your pocket? Sorry, but
that's just silly.

Log optimal betting does *NOT* require that the bankroll be infinitely
divisible. The consequence of a lower limit is that if your bankroll
drops to a certain level, the player will be compelled to stop betting
even though they have some money left. This point is reached when a
bet of one cent will have negative utility because you are overbetting
your bankroll. This crossover occurs when the bet is exactly 2X the
Kelly optimal bet.

Conclusion: Bet half your bankroll each time. If your bankroll drops
to one cent, then put that penny back in your pocket and walk away.
If you start with $100, your chances of walking away with only a penny
are very small.

Caveat: this assumes you want to use Kelly principles to guide your
betting. Other measures of utility will yield different results.
Some believe that Kelly is the "one true gambling religion", but I've
never seen a convincing proof that any utility function is inherently
better than any other.
--
Steve Jacobs (jac...@xmission.com) \ Do you play Video Poker? Try VP Freebie
"Expectation isn't everything..." \ http://www.conjelco.com/vpfreebie.html

Stephen H. Landrum

Jul 11, 1998
T.P. wrote:
>
> But how does all this relate to no-limit hold'em (which the original poster
> wrote he thought it was similar)?

It relates to how much of your bankroll you should be willing
to put on the table given a particular lineup of players and
how you expect to do in a given session. You need a very good
estimate of the possible distribution of outcomes to apply it
though.

> I don't think it has much to do with no-limit, since your decisions in
> betting for the coin flips will not affect any other decision by your
> opponent. We are assuming that you will get the same odds every time you
> flip. In poker, either no-limit or limit, your opponents will adjust to
> your style of play, at least to some degree.

If you were given different odds on every coin flip, you could
still work out the math for each one. Kelly numbers work fine
when the odds change; you just get new numbers.

> Thus they have a decision process of their own, which your
> coin-flip game does not.

As your opponents adapt (which you better be aware of), you
should be adjusting as well.

> So, I think the question is interesting, and the answers have been fun to
> read, but as far as relevancy to poker, or no-limit specifically, I think
> the relevancy is very low, if not zero.

It can be relevant to the play of the hands in no limit -
basically just because you have a theoretical edge doesn't mean
all of your chips should go into the center.

It really doesn't apply at all to limit play unless your bankroll
is so short that you really shouldn't be playing the game anyway.

jac...@xmission.com

Jul 11, 1998
"Stephen H. Landrum" <slan...@pacbell.net> writes:

> Barbara Yoon wrote:
> > But what about betting even-money on something known to produce
> > 80% winners as compared to alternatively only 51% winners --
>

> Here's a more generalized derivation of the Kelly number for
> betting on a wager with two possible outcomes (a lot of betting
> fits this form nicely). You win with probability p (losing with
> probability 1-p), and when you win you win x times your wager.
>
> The growth for a single bet b (where b is defined as the
> portion of your bankroll that you are betting) is:
> (1-b)^(1-p) * (1+xb)^p
>
> Advantage for the bet is xp + p - 1.
>
> Finding maxima for a function f(x)>0 is the same as finding
> maxima for log(f(x)), so we want to find the maximum for
> (1-p)log(1-b) + (p)log(1+xb), 0<b<1. Taking the derivative,
> we get -(1-p)/(1-b) + xp/(1+xb). Solving for 0, we get
> b = (xp+p-1)/x. When p is .5, this becomes (.5x-.5)/x or
> (x-1)/2x matching the derivation in my earlier post.

In "The Mathematics of Gambling", Thorp presents this as:

f* = e/A,

where f* is the optimal betting fraction, e is expectation, and A
is the payoff for a unit bet. Steve's derivation here is similar
to Thorp's.

Factoring in a minimum bet size of one penny, the Kelly optimal strategy
is to bet 1/4 of your bankroll. If your bankroll drops to four cents, then
betting one penny will still give optimal expected utility. If you only
have three cents left, betting a penny will not be Kelly optimal but it
will give positive utility. With a bankroll of two cents, betting a penny
will yield a "fair" bet -- you neither expect to increase or decrease your
utility by taking the bet. Those who err on the side of caution would
walk away with two cents at this point :-)
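
The crossover is easy to see by evaluating the expected change in
log(bankroll) directly; a small sketch (mine, using the thread's 2-to-1
coin flip):

  import math

  def exp_log_growth(b):
      # Expected change in log(bankroll) per flip when betting the
      # fraction b of the bankroll on the 2-to-1 coin proposition.
      return 0.5 * math.log(1 + 2 * b) + 0.5 * math.log(1 - b)

  print(exp_log_growth(0.25))  # Kelly optimal: ~ +0.0589 per flip
  print(exp_log_growth(0.50))  # 2X Kelly: ~ 0 -- the "fair" crossover
  print(exp_log_growth(0.60))  # past the crossover: negative utility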

Zagie

Jul 12, 1998
In article <35A6214A...@pacbell.net>, "Stephen H. Landrum"
<slan...@pacbell.net> writes:

>Betting 50% will give you a rocky ride with a long run expectation
>of ending up at your original amount (assuming that you can
>subdivide your money infinitely). Betting more than 50% will give
>you an ever diminishing bankroll. Betting less than 50% will give
>you an ever increasing bankroll.

Wow! I hope you have your asbestos suit on. I'm going to stop reading this
thread, because I've gotten all I can out of it and I know that the rest of the
thread will be flaming your analysis.

Zag

who thinks that tails, then heads, is just as likely a combination as heads,
then tails.
who also thinks that you'd have a hard time making any bet in this game
a negative-EV bet, even though it is pretty easy to bottom out if you try.


----------------------------------
ZagNet Consulting
http://www.zag.net


Stephen H. Landrum

Jul 12, 1998
Zagie wrote:
>
> In article <35A6214A...@pacbell.net>, "Stephen H. Landrum"
> <slan...@pacbell.net> writes:
>
> >Betting 50% will give you a rocky ride with a long run expectation
> >of ending up at your original amount (assuming that you can
> >subdivide your money infinitely). Betting more than 50% will give
> >you an ever diminishing bankroll. Betting less than 50% will give
> >you an ever increasing bankroll.
>
> Wow! I hope you have your asbestos suit on. I'm going to stop reading this
> thread, because I've gotten all I can out of it and I know that the rest of the
> thread will be flaming your analysis.

You lose that bet.

> Zag
> who thinks that tails, then heads, is just as likely a combination as heads,
> then tails.
> who also thinks that you'd have a hard time making any bet in this game
> a negative-EV bet, even though it is pretty easy to bottom out if you try.

Then you didn't understand the point of my post at all.

Since you are making bets proportional to your bankroll, the
expected effect of all of the outcomes is the multiplication of
all of the possible outcomes, where the outcomes are expressed as
multipliers on the bankroll. A common way of talking about this
is that you are calculating expectation on log(bankroll) instead
of directly on bankroll.

As for tails/heads vs. head/tails, if you bet $50, lose, bet $25,
then win, you also end up at $100.

As long as the number of wins and losses is equal, then betting 1/2
of your bankroll on this proposition repeatedly has the expectation
of leaving you exactly where you started.

Note, this is completely different than computing the expectation
of the average of the bets (by summing instead of multiplying).
It's true that every bet is made with positive expectation, and
the expectation after any series of bets is positive, but when
overbetting your bankroll the positive expectation comes because
you've averaged a few lucky series of wins with an astronomical
income with a lot of series that end up losing income.
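
A simulation makes the skew concrete; in this sketch (not from the
post; the 75% fraction and run counts are arbitrary), the mean is large
while the typical outcome is nearly broke:

  import random, statistics

  random.seed(7)

  def run(fraction, flips=20):
      bankroll = 100.0
      for _ in range(flips):
          bet = bankroll * fraction
          bankroll += 2 * bet if random.random() < 0.5 else -bet
      return bankroll

  results = [run(0.75) for _ in range(100000)]  # overbet: 75% per flip
  print(statistics.mean(results))    # tens of thousands -- rare runs
  print(statistics.median(results))  # under a dollar -- typical case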

Stuart Resnick

Jul 12, 1998
[re: betting on a coin flip, with wins paying 2 to 1]

Stephen H. Landrum wrote:
> > >Betting 50% will give you a rocky ride with a long run expectation
> > >of ending up at your original amount (assuming that you can
> > >subdivide your money infinitely).
>

> As long as the number of wins and losses is equal, then betting 1/2
> of your bankroll on this proposition repeatedly has the expectation
> of leaving you exactly where you started.
>

> It's true that every bet is made with positive expectation, and
> the expectation after any series of bets is positive, but when
> overbetting your bankroll the positive expectation comes because
> you've averaged a few lucky series of wins with an astronomical
> income with a lot of series that end up losing income.

You're not averaging "a few" lucky series. For any given number of
flips, your positive expectation comes from averaging those lucky
possible results that win large income with *an equal number* of unlucky
results that lose smaller income. For an even # of flips, you *could*
say that ending up where you started is a more likely result than any
specific $amount gain or loss (i.e., coming out even is more likely than
winning exactly $200 & also more likely than losing exactly $50, etc).
You could also say that your chance of winning something is equal to
your chance of losing something, which may be useful information. But
it's *not* the same as saying your expectation is to end up where you
started. A favorable result is equally likely as an unfavorable one, but
even a slight favorable result yields a large reward, relative to losses
on slightly unfavorable results. Likewise for moderately favorable
relative to moderately unfavorable, etc.

Stuart
sres...@slip.net
http://www.slip.net/~sresnick/mypage.shtml

Stephen H. Landrum

Jul 12, 1998
Stuart Resnick wrote:

> [re: betting on a coin flip, with wins paying 2 to 1]

> Stephen H. Landrum wrote:
> > It's true that every bet is made with positive expectation, and
> > the expectation after any series of bets is positive, but when
> > overbetting your bankroll the positive expectation comes because
> > you've averaged a few lucky series of wins with an astronomical
> > income with a lot of series that end up losing income.

> You're not averaging "a few" lucky series. For any given number of
> flips, your positive expectation comes from averaging those lucky
> possible results that win large income with *an equal number* of unlucky
> results that lose smaller income. For an even # of flips, you *could*
> say that ending up where you started is a more likely result than any
> specific $amount gain or loss (i.e., coming out even is more likely than
> winning exactly $200 & also more likely than losing exactly $50, etc).

I was no longer talking about betting 1/2 of your bankroll;
by overbetting here I meant betting more than 1/2 of your bankroll.

Sorry, I didn't reread it to make sure my thoughts were complete.

In this case, after a number of bets the likelihood that you still
have more than you started with gets very small. At the extreme,
say you bet 100% of your bankroll - now after say 10 bets, you
either have almost $6 million or you are broke. The EV for this
situation is positive ($5666), but 99.9% of the time your outcome
is to be broke.
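
The arithmetic behind those figures, as a quick check (a sketch, not
from the post):

  # Bet 100% of a $100 bankroll ten times at 2-to-1: survive all ten
  # flips with probability (1/2)**10 and end with 100 * 3**10 dollars.
  p_survive = 0.5 ** 10                       # 1/1024
  final = 100 * 3 ** 10                       # $5,904,900
  ev = p_survive * (final - 100) - (1 - p_survive) * 100
  print(final, ev)                            # 5904900, ~5666.50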

> You could also say that your chance of winning something is equal to
> your chance of losing something, which may be useful information. But
> it's *not* the same as saying your expectation is to end up where you
> started. A favorable result is equally likely as an unfavorable one, but
> even a slight favorable result yields a large reward, relative to losses
> on slightly unfavorable results. Likewise for moderately favorable
> relative to moderately unfavorable, etc.

When betting 50% of your bankroll on the particular proposition,
the expected change of log(bankroll) is zero. It's equally
likely to double as it is to halve. This is an important concept
in investing or wagering where preserving a bankroll is important.

Tom Weideman

Jul 12, 1998
Stephen H. Landrum wrote:

> Then you didn't understand the point of my post at all.
>
> Since you are making bets proportional to your bankroll, the
> expected effect of all of the outcomes is the multiplication of
> all of the possible outcomes, where the outcomes are expressed as
> multipliers on the bankroll. A common way of talking about this
> is that you are calculating expectation on log(bankroll) instead
> of directly on bankroll.
>
> As for tails/heads vs. head/tails, if you bet $50, lose, bet $25,
> then win, you also end up at $100.
>

> As long as the number of wins and losses is equal, then betting 1/2
> of your bankroll on this proposition repeatedly has the expectation
> of leaving you exactly where you started.
>

> Note, this is completely different than computing the expectation
> of the average of the bets (by summing instead of multiplying).

> It's true that every bet is made with positive expectation, and
> the expectation after any series of bets is positive, but when
> overbetting your bankroll the positive expectation comes because
> you've averaged a few lucky series of wins with an astronomical
> income with a lot of series that end up losing income.

I was hoping this thread would die out so I could use this little quirk
as an April Fool's prank next year (I really plan well ahead!), but
alas, SL has let it out of the bag. It seems strange that you can be
assured of getting busted when every single bet has positive expectation
(My April Fool's joke was going to have some "Money Management"
rhetoric... darn you, SL!!).

Anyway, to see it most clearly, assume you employ the strategy of
betting it all and then "letting it ride" each time you win. It should
be clear that if your probability of winning in each game is less than
unity, then you will eventually go bust. Although it is less clear for
the above case where you bet 1/2 your bankroll each time, it works the
same way. What is interesting is that this geometric betting scheme has
only three results (assuming your bankroll is infinitely divisible),
depending on the (fixed) fraction of bankroll you wager each time, your
probability of winning, and odds offered: 1. Eventually win all of the
"house's" infinite bankroll with 100% certainty; 2. Eventually go bust
with 100% certainty; 3. Have a long-term expectation of staying exactly
even.
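
These three regimes correspond exactly to the sign of the expected
change in log(bankroll); a small sketch (mine) classifying the 2-to-1
coin flip:

  import math

  def regime(f, p, v):
      # f: fraction bet; p: win probability; v: "v-to-1" payoff odds.
      g = p * math.log(1 + v * f) + (1 - p) * math.log(1 - f)
      if abs(g) < 1e-12:
          return "3: long-term expectation of staying exactly even"
      return "1: grows without bound" if g > 0 else "2: goes bust"

  print(regime(0.25, 0.5, 2))  # Kelly fraction -> grows
  print(regime(0.50, 0.5, 2))  # -> stays even
  print(regime(0.75, 0.5, 2))  # -> goes bust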

If you bet the same amount each time (no matter how big this bet is)
rather than increase or decrease according to bankroll size, then the
results are different. Assuming you are playing with an edge, you may
or may not go bust, and you can compute the probabilities that either
you or the infinite-bankrolled "house" will be busted in the end (if you
are playing a fair game or at a disadvantage, then you are assured of
being busted).

In other words, alla doze ``money management'' G00R00's aren't so rong
after all!1!!11! <grinz>


Tom Weideman

sres...@slip.net

Jul 13, 1998
slan...@pacbell.net wrote:
> [re: a "coin-flip" proposition, with a win paying 2 to 1]

> When betting 50% of your bankroll on the particular proposition,
> the expected change of log(bankroll) is zero. It's equally
> likely to double as it is to halve. This is an important concept
> in investing or wagering where preserving a bankroll is important.

Please explain why this concept is important. Simple-minded fellow that I am,
I'm concerned with the expected change of my bankroll, as well as the variance
(i.e., a big chance at a small win may have a different personal value vs a
small chance at a big win). But I don't see why I should be concerned with the
change of my log(bankroll). When I go to spend my winnings, I'll be paying
"prices", not "log(prices)".

In other words, I'd jump at the opportunity of an equal likelihood of doubling
or halving my investment. Why is the fact that this translates into a zero
expected change of log(bankroll) significant to me?

As discussed before, in this proposition, the 50%-of-bankroll strategy
produces an equal chance of winning something as of losing something. Is this
related to the zero expected change of log(bankroll)? In any case, it's not a
terribly useful statistic, since we're more than a little concerned with the
actual value of these wins/losses.

Stuart
sres...@slip.net
http://www.slip.net/~sresnick/mypage.shtml

Joesmallie

Jul 13, 1998
Please correct me if I am wrong. I used to use the Kelly Criterion for
playing the horses. If I remember correctly it was "edge divided by odds." We
then made an adjustment whereby we used a 1/2 Kelly. When you say Kelly
Criterion as applied to a poker situation are you doing the same thing? Could
you please define Kelly Criterion? Is there a web site that has a thorough
explanation? I am less than a stellar mathematician. When people start
posting formulas and such about how they derived certain calculations I become
blurry-eyed. Would love to be able to follow them better, but they leave me in
the dust. I have math questions all the time but hesitate to bring them up here
and bog down the poker newsgroup. Would love to have a math tutor. Thank you.


Alan Bostick

Jul 13, 1998
In article <6o5fqg$hub$1...@nnrp1.dejanews.com>, mbj...@hs.co.slc.ut.us wrote:

> Could somebody please post the "Kelly criterion"? I have heard it mentioned
> many times, and all I know is that it pertains somewhat to maximizing bankroll
> and minimizing bankruptcy (or at least computing odds of bankruptcy).

The "Kelly criterion" is discussed in the document "Frequently Asked
Questions about Kelly Betting", which can be found at
<http://www.lds.co.uk/tomt/frbjkelly.html>. This has been posted on
occasion to Usenet (typically to rec.gambling.blackjack).

Kelly betting is more applicable to gambling and risk-taking situations
where one's return and the degree of risk one is taking are known
quantities (e.g. in a blackjack game, using a card-counting strategy).
The risk and return in poker is more difficult to quantify, because
it depends on so many intangible factors (e.g. the mood of the other
players).

--
| Many that live deserve death. And some that die
Alan Bostick | deserve life. Can you give it to them? Then do
mailto:abos...@netcom.com | not be too eager to deal out death in judgment.
news:alt.grelb | J. R. R. Tolkien
http://www.alumni.caltech.edu/~abostick

Alan Bostick

Jul 13, 1998
In article <35A785BB...@pacbell.net>,
"Stephen H. Landrum" <slan...@pacbell.net> wrote:

> T.P. wrote:

> > So, I think the question is interesting, and the answers have been fun to
> > read, but as far as relevancy to poker, or no-limit specifically, I think
> > the relevancy is very low, if not zero.
>
> It can be relevant to the play of the hands in no limit -
> basically just because you have a theoretical edge doesn't mean
> all of your chips should go into the center.

True . . . with the caveat that the chips in front of you probably ought
not to be your total no-limit bankroll, nor even your stake for that
particular session.

My opinion is that Kelly-type thinking is useful to decide how much
money you should bring to a game, perhaps even how much to buy in for.
It becomes less important for the play of individual hands. (If one
isn't prepared to risk one's entire stack and lose, one shouldn't be
playing no-limit.)

On the other hand, in support of your point, Steve, Doyle Brunson writes
in ACCORDING TO DOYLE of a hand in his youth where he was playing against
a drunk obviously on a flush draw. Brunson had the best of it and went
all-in against the drunk . . . whose flush got there. Brunson was broke
for the evening, and his buddies got to relieve the drunk of his chips.
Brunson writes something to the effect that if you're reasonably sure of
breaking an opponent sooner or later, you might as well wait until your
edge is big rather than risking everything when your edge is small.

Stephen H. Landrum

Jul 13, 1998
Joesmallie wrote:
>
> Please correct me if I am wrong. I used to use the Kelly Criterion for
> playing the horses. If I remember correctly it was "edge divided by odds."

That's a fair restatement for a two outcome (win,lose) wager.

> We
> then made an adjustment whereby we used a 1/2 Kelly.

Not a bad idea if you are really concerned about preserving
bankroll. Playing the horses is also complicated by the fact that
your wager can move the odds.

> When you say Kelly
> Criterion as applied to a poker situation are you doing the same thing?

Not really - as Tad said, it doesn't often really apply to poker.
I was pointing out that a strong no-limit player might want to
take it into consideration when deciding how much money to put
at risk in certain situations.

> Could
> you please define Kelly Criterion.

When making bankroll proportional wagers with positive
expectation, Kelly optimizes the rate at which your bankroll
is expected to grow.

> Is there a web site that has a thorough
> explaination.

I don't know. Perhaps someone else knows and can post a pointer
to it.

Stephen H. Landrum

Jul 13, 1998
sres...@slip.net wrote:
> As discussed before, in this proposition, the 50%-of-bankroll strategy
> produces an equal chance of winning something as of losing something. Is this
> related to the zero expected change of log(bankroll)? In any case, it's not a
> terribly useful statistic, since we're more than a little concerned with the
> actual value of these wins/losses.

Would you bet every dime you had on this wager? Would you take
out as many loans as you could to do it? If not, why not? After
all, you're concerned with the actual value of these wins/losses,
and since wins are twice losses, it should be a good thing, right?
And if you were willing to bet it all, why not do that repeatedly?

When you bet 1/2 of your bankroll repeatedly on this wager, you
will sometimes come out ahead, and sometimes come out behind.

If you bet less than 1/2 of your bankroll and do it repeatedly,
then you are guaranteed over the long term to come out ahead.
The "optimum" number for bankroll growth on this wager is to bet
1/4 of your bankroll. If you do this, after a relatively short
while you'll usually be far ahead of where you would have been
betting 1/2 of your bankroll. The dollar expectation is higher
betting 1/2 of your bankroll, but that average is skewed by
unlikely events where you make astronomical amounts.

Most people are concerned with what's actually likely to happen,
and really don't want to depend on the luck of the lottery to
determine their fate.
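
A side-by-side simulation (a sketch of mine; run counts are arbitrary)
shows how much better the typical outcome is at 1/4 than at 1/2:

  import random, statistics

  random.seed(3)

  def run(fraction, flips=100):
      bankroll = 100.0
      for _ in range(flips):
          bet = bankroll * fraction
          bankroll += 2 * bet if random.random() < 0.5 else -bet
      return bankroll

  for f in (0.25, 0.50):
      print(f, statistics.median(run(f) for _ in range(20000)))
  # Typical output: 0.25 -> roughly $36,000; 0.5 -> roughly $100.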

Barbara Yoon

Jul 13, 1998
Stuart Resnick [on getting 2-to-1 odds on coin flips]:

> I don't see why I should be concerned with the change of my
> log(bankroll). When I go to spend my winnings, I'll be paying
> "prices", not "log(prices)".

The logarithms are just a way to make the math easier, keeping all
"greater-than" relationships the same -- more log("money") is also
more just plain "MONEY"...

> I'd jump at the opportunity of an equal likelihood of doubling or
> halving my investment. Why is the fact that this translates into a
> zero expected change of log(bankroll) significant to me?

Maybe if getting the 2-to-1 odds for a LIMITED number of coin flips
-- if only ONE flip, then what the heck, bet it all -- but with unlimited
flips, go with betting the Kelly 25% each time...

> ...log(bankroll)? ...not a terribly useful statistic, since we're more
> than a little concerned with the actual value of these wins/losses.


Think of the "crazy millionaire" standing there giving all comers the
2-to-1 odds all day long, but on the very first flip, YOU bet your whole
bankroll (odds in your favor -- right?!), and LOST, and now all you can
do is stand around and watch all the other players delightedly getting
the 2-to-1 all the rest of the day -- and with that kind of action available,
no way anybody there is gonna lend you any of their money!


Tom Weideman

Jul 13, 1998
Joesmallie wrote:
>
> Please correct me if I am wrong. I used to use the Kelly Criterion for

I haven't checked the r.g.* FAQ, so this may be redundant, but since you
asked, I'll give as simplified a version as I can. I'll only show
details of the easy math, while glossing over the calculus, so non-math
types can (hopefully) follow along.

Suppose you are playing a game in which you have some sort of edge.
This edge can consist of any combination of probability and odds such
that each bet has a positive expectation. Now you ask yourself, "What
fraction of my bankroll should I wager on this game?" If your answer
is, "The amount that if I keep repeating this same strategy over a long
period of time, my bankroll increases at its maximum rate.", then your
answer is to bet "Kelly".

To see how we can find such an answer, we should look at it from the
beginning. Suppose you bet a fraction f (0<f<1) of your current
bankroll B, and you are getting odds of v-to-1. If you win the bet, you
get back your original bet, plus v times the amount of your bet (which
was f*B). If you lose, you lose the bet amount:

Bet size = f*B, odds = v-to-1, amount won = v*f*B, amount lost = f*B

If you win, your new bankroll B' is going to equal your old bankroll
plus your winnings:

Win: B' = B + v*f*B = (1+v*f)*B

If you lose, your new bankroll will equal your old bankroll minus your
losses:

Lose: B' = B - f*B = (1-f)*B

Let's take a short timeout to make sure these funny looking equations
make sense by looking at an example (I'm a huge fan of concrete
examples).

Say you have $100, and you bet $20, getting 2-to-1. For the sake of
future considerations, we'll assume the game is a fair coin flip (but
the need for the probabilities does not come into play for awhile yet).
If you win this bet, you win $40, raising your bankroll to $140. The
equation above shows this correctly:

B = $100, f = 1/5 = 0.2, v = 2: B' = (1+v*f)*B = (1+2*0.2)*($100) = $140

The "Lose" equation works in a similar manner.

Now, suppose you play the game multiple times. Assuming no "pushes"
(which I will be assuming throughout), then the "new" bankroll (which we
have called B') after a game is played becomes the "old" bankroll for
the next game to be played. We are assuming the game and the odds
offered don't change, and that we do not choose to change our strategy
(the fraction of bankroll to be wagered), so the equations given remain
the same for each game, with the previous game's B' becoming the next
game's B. Your result after winning twice in a row would be:

B after two wins = (1+v*f)*[B after one win] =
(1+v*f)*[(1+v*f)*(starting B)]
B after two wins = [(1+v*f)^2]*(starting B)

A check: Suppose in the example above you won again, employing the same
strategy. Your new bankroll was $140, so you wagered 1/5 of it ($28)
and won at 2-to-1, for a total win of $56. Now your new bankroll is:
$140+$56 = $196. Plugging into the equation gives the same result:

B after two wins = [(1+v*f)^2]*(starting B) = [(1+2*0.2)^2]*($100)
B after two wins = [(1.4)^2]*($100) = [1.96]*($100) = $196

The two losses results work out the same way. What about a win and a
loss, you ask? You just multiply the starting bankroll by the "lose
factor" (1-f), and the "win factor" (1+v*f), and the result is the new
bankroll. Note that it doesn't even matter if you lost first or won
first. Back to our example:

You won the first game and lost the second: $100 + $40 = $140 after
first game, then lost $28 (you wagered 1/5 of it) in the second game for
a total of $140-$28 = $112. If you lost first, you have $100 - $20 =
$80 after the first game, and then you wager 1/5th of $80 (=$16) at
2-to-1 in the second game and win, for a total of $80 + 2*($16) = $112.
Same amount as if you win first. [BTW, this less than obvious fact may
affect retirement planning for those of you thinking about Roth IRA's
vs. traditional ones. I won't digress any further on this topic.]

Using the equation gives the same result:

B after 1 win and 1 loss = (1+v*f)*(1-f)*(starting B)
B after 1 win and 1 loss = (1+2*0.2)*(1-0.2)*($100) = (1.4)*(0.8)*$100 = $112

Okay, so lets say we have won "w" times out of n total games. This
means we have lost n-w times (since we assumed no pushes). To find the
new bankroll, we need to multiply the starting bankroll by (1+v*f) a
total of w times, and multiply it by (1-f) a total of n-w times. In
other words, our "new bankroll equation" has gotten much more
complicated:

B' = [(1+v*f)^w]*[(1-f)^(n-w)]*B

The first factor in brackets is just the "win factor" multiplied by
itself w times, and the second is the "lose factor" multiplied by itself
n-w times. The actual dollar values of the two "B's" in this equation
are not important, so we will instead look at just the factor
multiplying B, as this is what determines the bankroll's growth (or
lack thereof):

B'/B = [(1+v*f)^w]*[(1-f)^(n-w)]

This gives us the factor by which the bankroll has changed from the
beginning (after n games). We want to look at the bankroll's growth
rate PER GAME, so if we call the average-per-game-factor "y", then after
n games, the bankroll has grown by a factor of y^n. The
average-per-game-factor is then found to be:

B'/B = y^n

y = (B'/B)^(1/n) = {[(1+v*f)^w]*[(1-f)^(n-w)]}^(1/n)
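
(The same arithmetic as a small Python sketch, building on the sketch
above -- the function names are mine, chosen only for illustration:)

    def growth_factor(f, v, w, n):
        # B'/B after w wins and n-w losses, betting fraction f at v-to-1
        return (1 + v * f) ** w * (1 - f) ** (n - w)

    def per_game_factor(f, v, w, n):
        # y = (B'/B)^(1/n), the average multiplicative growth per game
        return growth_factor(f, v, w, n) ** (1.0 / n)

    print(growth_factor(0.2, 2, 1, 2))    # 1.12  ($100 -> $112)
    print(per_game_factor(0.2, 2, 1, 2))  # ~1.058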

Again we return to our example. After 2 games where we won 1 and lost
1, we have gone from $100 to $112. This is an increase by a factor of
1.12 over 2 games. On average, this is an increase PER GAME of a factor
of sqrt(1.12) = 1.058. In other words, if in the game described we win
as many times as we lose (consistent with our assumption way back when
that the probability of winning is 1/2), then on average we will
increase our bankroll each game by a factor of 1.058. Now here's the
important part:

*** If we employ a different strategy (risk a different fraction "f" of
the bankroll), then this factor will also change. We seek to find the f
for which this factor is a MAXIMUM. ***

Now finding the value of f for which this function peaks is no small
matter. It involves calculus. If this intimidates you, I invite you to
jump down to below the second set of "*'s" to see the answer. I include
the calculus for the math.weenies that may find it interesting...

********

The value of f for which y(f) is a maximum is the same value for which
ln[y(f)] is a maximum, so we can equivalently seek to maximize:

z(f) = ln[y(f)] = (1/n)*[w*ln(1+v*f) + (n-w)*ln(1-f)]

The derivative is:

dz/df = (w/n)*v/(1+v*f) - [1-w/n]/(1-f)

Setting this equal to zero and solving for f (writing p for w/n, and
clearing the denominators):

p*v*(1-f) = (1-p)*(1+v*f)
p*v - p*v*f = 1 - p + v*f - p*v*f
p*(v+1) - 1 = v*f

which gives:

********

f = [p*(v+1)-1]/v, where p = w/n.

Note that in the long run, the fraction of games you win (w/n)
approaches the probability of winning a single game, so p =
probability of winning.

Okay, this is our answer. Given that your chance of winning is p, and
that you are receiving v-to-1 odds on your bet, then the fraction of
your bankroll that you should wager to maximize your rate of bankroll
growth is the f given above. This value can also be plugged back in
above to find out what the maximum growth rate actually comes out to be.

Let's try it for the game we've been using as an example. We have a
probability of p=0.5 of winning, and odds of v=2, so the fraction of our
bankroll we should risk each time is:

f = [p*(v+1)-1]/v = [0.5*(2+1)-1]/2 = 1/4

y = [(1+v*f)^p]*[(1-f)^(1-p)] = 1.061 (recall p=w/n)

The bankroll increases an average of 6.1% over its preceding amount
every game. [Using the "rule of 72" familiar to bankers and investors,
this means the bankroll will double roughly every 12 games.]

Most people like to remember the Kelly criterion using a mnemonic that
goes something like: "Bet the fraction of your bankroll that equals your
percentage advantage." It should be understood that this ONLY applies
to bets with even-money odds (v=1). Note that with v=1, f comes out to
be equal to 2p-1, which is exactly your percentage advantage. The
origin of this mnemonic is probably blackjack, which would explain
why the even-money assumption is made. A better, more general mnemonic
would be:

"Bet the fraction of your bankroll equal to your percentage advantage
divided by the 'to-1' odds."

For example, if you have a 10% advantage and you are getting 5-to-2
odds, then the fraction of your bankroll to bet is (0.10)/2.5 = 0.04 =
4%.
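
(A quick Python sanity check of both the exact formula and the
mnemonic; the function name is mine:)

    def kelly_fraction(p, v):
        # f = [p*(v+1) - 1]/v: win probability p, payoff odds v-to-1
        return (p * (v + 1) - 1) / v

    print(kelly_fraction(0.5, 2))  # 0.25 -- the 2-to-1 coin-flip game

    # a 10% advantage means p*(v+1) - 1 = 0.10; at 5-to-2 odds (v = 2.5)
    # that requires p = 1.10/3.5, and edge/odds predicts 0.10/2.5 = 0.04:
    print(kelly_fraction(1.10 / 3.5, 2.5))  # 0.04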

If you plowed through this whole post, my compliments. It's possible
that the only people willing (able?) to follow it all the way through
are the people who already understand all this, which means I was
drawing dead as I wrote it. I hate when that happens.


Tom Weideman

Barbara Yoon

unread,
Jul 13, 1998, 3:00:00 AM7/13/98
to
Tom Weideman:
> ...simplified...details of the easy math, while glossing over the calculus,
> so non-math types can (hopefully) follow along. << thorough explanation,
> examples >> "Bet the fraction of your bankroll equal to your percentage
> advantage divided by the 'to-1' odds." ...if you have a 10% advantage

> and you are getting 5-to-2 odds, then the fraction of your bankroll to bet
> is (0.10)/2.5 = 0.04 = 4%. If you plowed through this whole post, my
> compliments. It's possible that the only people willing (able?) to follow it
> all the way through are the people who already understand all this, which
> means I was drawing dead as I wrote it. I hate when that happens.


No matter what...nice job there, Tom...


Dave Horwitz

unread,
Jul 13, 1998, 3:00:00 AM7/13/98
to

Tom Weideman wrote in message <35AA9...@dcn.davis.ca.us>...

>I haven't checked the r.g.* FAQ, so this may be redundant, but since you
>asked, I'll give as simplified a version as I can. I'll only show
>details of the easy math, while glossing over the calculus, so non-math
>types can (hopefully) follow along.
>
>Suppose you are playing a game in which you have some sort of edge.
>This edge can consist of any combination of probability and odds such
>that each bet has a positive expectation. Now you ask yourself, "What


You failed, Tom... I quit reading here and skipped to the bottom. As
soon as I read "combination of probability and odds such that ... positive
expectation" I assumed that the following wasn't going to include any
descriptive words like "small", "medium", "large", and "*huge*". I'm
absolutely positive that it was very concise and well written but the
scroll bar on my window was less than an inch long. That pretty much
did it. Next time you want to address us "non-math" types, use this as
your criterion: Scroll bar no less than 1" and an explanation that you'd
give at a party where there were more than 2 couples dancing.

>
>If you plowed through this whole post, my compliments. It's possible
>that the only people willing (able?) to follow it all the way through
>are the people who already understand all this, which means I was
>drawing dead as I wrote it. I hate when that happens.


ding!

-Quick (don't get discouraged, you're still a hero)

Barbara Yoon

unread,
Jul 13, 1998, 3:00:00 AM7/13/98
to
Mike Caro:
> BTW, I didn't mean to quote Tom's entire post in my response.


Horror!! You stand self-accused of the Internet's newest contribution
to the list of Cardinal Sins -- WASTING BANDWIDTH!!


Mike Caro

unread,
Jul 14, 1998, 3:00:00 AM7/14/98
to
Tom --

That was excellent. I read it all the way through, because, well, I
needed a break from my workday tedium. Your post entertained and
relaxed me, which may give you an idea just how sick I really am.
Don't tell.

Straight Flushes,
Mike Caro

Mike Caro

unread,
Jul 14, 1998, 3:00:00 AM7/14/98
to

Mike Caro

unread,
Jul 14, 1998, 3:00:00 AM7/14/98
to
Barbara --

I also think the overreaction to this particular sin is sometimes
amusing. But, in this case, I really didn't mean to quote the whole
message, and I wanted to head off the onrush of spears, which I felt
were about to be launched. I'm very sensitive and fragile in this
regard.

Straight Flushes,
Mike Caro

On Mon, 13 Jul 1998 22:25:59 -0400, "Barbara Yoon" <by...@erols.com>
wrote:

>Mike Caro:


>> BTW, I didn't mean to quote Tom's entire post in my response.
>
>

James P. Massar

unread,
Jul 14, 1998, 3:00:00 AM7/14/98
to
Tom Weideman <zugz...@dcn.davis.ca.us> wrote:

>Most people like to remember the Kelly criterion using a mnemonic that
>goes something like: "Bet the fraction of your bankroll that equals your
>percentage advantage." It should be understood that this ONLY applies
>to bets with even-money odds (v=1). Note that with v=1, f comes out to
>be equal to 2p-1, which is exactly your percentage advantage. The
>origin of this mnemonic is probably blackjack, which would explain
>why the even-money assumption is made.

A blackjack hand is not always paid at even money (double-downs, BJ's,
surrenders, etc).

Blackjack people often use "Bet your bankroll times your advantage
on a given round divided by the variance of a hand of blackjack".


Jeffrey L. Woods

unread,
Jul 14, 1998, 3:00:00 AM7/14/98
to
MDuchin wrote:
>
> proposition. You can bet only the money you have on you (no checks
> or credit cards) and from the flip of a fair coin, you get twice your
> money back if it comes up heads and lose what you bet if it's tails.
> He tells you he will continue doing this for as long as you want.
> You have $100. What is your betting strategy to maximize your
> winnings?

This is a no-brainer. I'll bet a small, fixed percentage of my stack
every bet. I'm sure you could mathematically analyze what std. dev.
you'd be willing to accept in exchange for spending less time, but I'd
probably bet 5% of my stack on each proposition, meaning I'd have to
lose a BUNCH of times in a row (far more than 20) to go broke, yet
getting 2-to-1 on a coin flip, in the LONG RUN it is EXTREMELY likely
that after a few hundred bets I'd be betting hundreds or thousands, and
could continue virtually indefinitely if I lowered that percentage. To help
ensure you reach that LONG run, lower your earlier percentage bet. It'll
take longer, but your "bankroll" will be safer.

The key here is that the guy will make this same -ev (for him) bet as
long as you like. I don't think it applies to poker in the manner you
suggested because you're not likely to be a 2:1 favorite EVERY hand you
play.

Zagie

unread,
Jul 14, 1998, 3:00:00 AM7/14/98
to
In article <35A95706...@pacbell.net>, "Stephen H. Landrum"
<slan...@pacbell.net> writes:

>As for tails/heads vs. head/tails, if you bet $50, lose, bet $25,
>then win, you also end up at $100.

I see your point in looking at this in pairs of flips, but complete the set:

Starting with $100, always betting half of bankroll:

HH: $100 -> $200 -> $400
HT: $100 -> $200 -> $100
TH: $100 -> $050 -> $100
TT: $100 -> $050 -> $025

Add them up and divide by 4, and you have an average ending bankroll of
$625/4, or $156.25 -- a 56.25% average gain.

Or, you could look at them in groups of three tosses:

HHH: $100 -> $200 -> $400 -> $800
HHT: $100 -> $200 -> $400 -> $200
HTH: $100 -> $200 -> $100 -> $200
HTT: $100 -> $200 -> $100 -> $050
THH: $100 -> $050 -> $100 -> $200
THT: $100 -> $050 -> $100 -> $050
TTH: $100 -> $050 -> $025 -> $050
TTT: $100 -> $050 -> $025 -> $012.5

for an average ending bankroll of $195.3125. For four flips your average can
be calculated by squaring the average result for two flips (i.e. $100 *
(1.5625)^2 ), etc.

I really don't think that your bankroll will stay the same, betting that way. The
incomplete analysis is why people are able to sell roulette "Strategies." It
is true that most of these strategies *usually* win money, but the rare times
that they lose, they lose more money than they won in all the wins combined.
You can *guarantee* a win when you sell these, because again, they usually win.
If someone is so unlucky as to lose $10230 the first time, you can easily
refund him his $5. The other 1000 people who paid you $5 won on their first
time, so your guarantee is covered. And you'll notice that the casino won
money overall.

Just for curiosity, let's do the same analysis but you always bet 3/4 of your
bankroll:

HH: $100 -> $250 -> $625
HT: $100 -> $250 -> $062.5
TH: $100 -> $025 -> $062.5
TT: $100 -> $025 -> $006.25

Average ending bankroll is $756.25/4 = $189.0625.

Your bankroll still continues to grow, but not as fast. This is, I assume, the
analysis that led to the Kelly percentage in the first place.
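
(These tables generalize readily. A hedged Python sketch -- all names
mine -- that averages over every equally likely H/T sequence reproduces
the numbers above:)

    from itertools import product

    def average_final(start, f, v, flips):
        # enumerate all 2^flips equally likely sequences of H and T
        total = 0.0
        for seq in product("HT", repeat=flips):
            b = start
            for toss in seq:
                b *= (1 + v * f) if toss == "H" else (1 - f)
            total += b
        return total / 2 ** flips

    print(average_final(100, 0.50, 2, 2))  # 156.25
    print(average_final(100, 0.50, 2, 3))  # 195.3125
    print(average_final(100, 0.75, 2, 2))  # 189.0625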

I'm prepared to bet money that there is not any finite betting strategy that
does not have a positive expectation. Even betting everything every time, you
will end up with all of the other guy's money often enough to cancel all the
times that you lose all of yours. It is the inverse of the roulette strategy,
below.

Zag

Zag's hot roulette strategy: Only play even money bets. Always bet $10 more
than the total you've lost so far. Quit as soon as you are ahead ($10) or
after ten losses in a row (ouch). Feel free to send me $5 if you win money the
first time you do this.

Stephen H. Landrum

unread,
Jul 14, 1998, 3:00:00 AM7/14/98
to
Zagie wrote:
> I'm prepared to bet money that there is not any finite betting strategy that
> does not have a positive expectation.

Again you COMPLETELY missed the point.

The issue is not whether the bets have positive expectation, but
whether the typical result will be positive, and how much to bet
to maximize the typical result. The arithmetic average is
meaningless to most people if most outcomes leave you broke and
a very few leave you fabulously wealthy, unless you also happen
to like being broke.

Risk of ruin is a serious problem because now you have nothing
with which to take advantage of the many +EV opportunities that
are out there.

> Even betting everything every time, you
> will end up with all of the other guy's money often enough to cancel all the
> times that you lose all of yours. It is the inverse of the roulette strategy,
> below.

But most of the time you'll be flat busted. You'll end up with
all of his money EVERY time if you choose a better proportion of
your bankroll to bet. And the optimum fraction of your bankroll
will bankrupt your opponent on average in the shortest amount of
time.

> Zag's hot roulette strategy: Only play even money bets. Always bet $10 more
> than the total you've lost so far. Quit as soon as you are ahead ($10) or
> after ten losses in a row (ouch).

This is a different case because people want to avoid catastrophic
failures. Still, if you desperately needed $10 more than you had,
this method has a very high likelihood of getting you that $10.

Joesmallie

unread,
Jul 14, 1998, 3:00:00 AM7/14/98
to
>From: Tom Weideman

>Suppose you are playing a game in which you have some sort of edge.
>"What fraction of my bankroll should I wager on this game?"
>... your answer is to bet "Kelly".
>To see how we can find such an answer, .............

Thank you, Tom. Simply marvelous.


Barbara Yoon

unread,
Jul 14, 1998, 3:00:00 AM7/14/98
to
Zagie:
> ...pairs of flips...complete the set:
> Starting with $100...betting half of bankroll:

> HH: $100 -> $200 -> $400
> HT: $100 -> $200 -> $100
> TH: $100 -> $050 -> $100
> TT: $100 -> $050 -> $025
> Add them up and divide by 4...156.25...
> ...same analysis but...bet 3/4 of your bankroll:

> HH: $100 -> $250 -> $625
> HT: $100 -> $250 -> $062.5
> TH: $100 -> $025 -> $062.5
> TT: $100 -> $025 -> $006.25
> Average ending bankroll is...$189.0625.

> your bankroll still continues to grow, but not as fast. This is, I assume,
> the analysis that led to the Kelly percentage in the first place.

I don't get it...how 189.0625 "not as fast" as 156.25?!

> I'm prepared to bet money that there is not any finite betting strategy

> that does not have a positive expectation. Even betting everything


> every time, you will end up with all of the other guy's money often
> enough to cancel all the times that you lose all of yours.

Of course, getting 2-to-1 odds on coin flips always will "have a positive
expectation" -- and "betting everything every time" will be the most such
"positive" of all... But in a race between you and me to "end up with all
of the other guy's money," YOU are going to end up broke virtually every
time, while I advantageously continue playing 25% of my money time
after time for as long as it takes until I win it all... See?!


Winner777

unread,
Jul 15, 1998, 3:00:00 AM7/15/98
to
P = Prob event is successful
R = the return that you get from $1 bet + your bet back, in this case you win
$2 and get $1 back for a total of $3.

The standard expectation equation is to always pretend that you are betting $1.

(P) (R) - 1

If you divide the product of the expectation equation by the odds to 1 you come
up with the correct mathematical bet.

(.5) (3) - 1 = .5 tells us that we have a 50% edge. In this case the odds to
1 are even so we should bet 50% of our BR every flip.

If you had a proposition where you only won 1 in 8 events and were receiving
odds of 10-1, your bet size has to become a lot smaller.

(.125) (11) - 1 = .375

the odds to 1 in this situation are 7 to 1.

therefore .375/7 = .0536

Ed Hill

Barbara Yoon

unread,
Jul 15, 1998, 3:00:00 AM7/15/98
to
Winner777:
> P = Prob event is successful. R = the return that you get from

> $1 bet + your bet back, in this case you win $2 and get $1 back
> for a total of $3. ...expectation equation...(P) (R) - 1. ...divide the
> product of the expectation equation by the odds to 1 [to get] the
> correct mathematical bet. (.5) (3) - 1 = .5...a 50% edge. ...the

> odds to 1 are even so we should bet 50% of our BR every flip.


Just a little slip there, Ed..."the odds to 1" are NOT "even" here, but
they are 2-to-1, and so dividing the 50% by 2, we bet 25% every flip...
OK?!


Stephen H. Landrum

unread,
Jul 15, 1998, 3:00:00 AM7/15/98
to
Winner777 wrote:
>
> P = Prob event is successful
> R = the return that you get from $1 bet + your bet back, in this case you win
> $2 and get $1 back for a total of $3.
>
> The standard expectation equation is to always pretend that you are betting $1.
>
> (P) (R) - 1
>
> If you divide the product of the expectation equation by the odds to 1 you come
> up with the correct mathematical bet.

You divide by the payoff odds when you win, not the odds against winning.

> (.5) (3) - 1 = .5 tells us that we have a 50% edge. In this case the odds to


> 1 are even so we should bet 50% of our BR every flip.

You divided by the wrong odds, the payoff is 2 to 1, so you should
bet 25% of your bankroll.

> If you had a proposition where you only won 1 in 8 events and were recieving
> odds of 10-1 your bet size has to become a lot smaller.
>
> (.125) (11) - 1 = .375
>
> the odds to 1 in this situation are 7 to 1.
>
> therefore .375/7 = .0536

Again, you divided by the wrong odds. The payoff is 10:1, so you
should bet .0375 of your bankroll.

Joesmallie

unread,
Jul 15, 1998, 3:00:00 AM7/15/98
to
>From: winn...@aol.com (Winner777)

>(.5) (3) - 1 = .5 tells us that we have a 50% edge. In this case the odds
>to
>1 are even so we should bet 50% of our BR every flip.
>
>

What happens if you lose three bets in a row?


Zagie

unread,
Jul 16, 1998, 3:00:00 AM7/16/98
to
In article <6ogr65$t0d$1...@winter.news.erols.com>, "Barbara Yoon"
<by...@erols.com> writes:

>I don't get it...how 189.0625 "not as fast" as 156.25?!

Because I compared to the wrong number?

>
>> I'm prepared to bet money that there is not any finite betting strategy
>> that does not have a positive expectation. Even betting everything
>> every time, you will end up with all of the other guy's money often
>> enough to cancel all the times that you lose all of yours.
>
>Of course, getting 2-to-1 odds on coin flips always will "have a positive
>expectation" -- and "betting everything every time" will be the most such
>"positive" of all... But in a race between you and me to "end up with all
>of the other guy's money," YOU are going to end up broke virtually every
>time, while I advantageously continue playing 25% of my money time
>after time for as long as it takes until I win it all... See?!

Absolutely. I agree completely. I was just trying to refute Stephen Landrum's
assertion that, if you bet more than 50% of your bankroll each time, your
bankroll would steadily go down. My claim was that any percent would tend to
increase your bankroll. The higher percents would have a higher risk of
busting out, but that risk is tiny until you get well above 50%. Even at 75%,
you have to lose seven in a row, less than a 1% chance (plus I'll add another
1% chance for the times when you win some but lose even more). Again, I'm not
advocating this approach, I'm just pointing out that your bankroll does NOT
tend to go down, as Stephen claimed.


In article <35AB826F...@pacbell.net>, "Stephen H. Landrum"
<slan...@pacbell.net> writes:


>Zagie wrote:
>> I'm prepared to bet money that there is not any finite betting strategy
>that
>> does not have a positive expectation.
>

>Again you COMPLETELY missed the point.
>

Well, sure, if you keep changing your point, it's easy to say that I've missed
it. The bulk of my note was correcting your *point* that your bankroll will
tend to stay even if you always bet 50% of your current bankroll, and that it
will tend to go down if you bet more. These are clearly wrong, as I have
shown.

I do agree that betting all of your bankroll each time is a bad idea, even if
it does have a positive EV. I never tried to advocate it, though it certainly
seems that I did when you take my comments out of context as you did.

Regards,
Zag

Stephen H. Landrum

unread,
Jul 16, 1998, 3:00:00 AM7/16/98
to
Zagie wrote:
> Absolutely. I agree completely. I was just trying to refute Stephen Landrum's
> assertion that, if you bet more than 50% of your bankroll each time, your
> bankroll would steadily go down.

Well, with probability one, it will if you continue betting this
proportion of your bankroll indefinitely.

You will also have positive expectation.

The positive expectation comes from the vanishingly small fraction of
times that your bankroll grows without bound.

Funny things happen in the long run.

> My claim was that any percent would tend to
> increase your bankroll. The higher percents would have a higher risk of
> busting out, but that risk is tiny until you get well above 50%. Even at 75%,
> you have to lose seven in a row, less than a 1% chance (plus I'll add another
> 1% chance for the times when you win some but lose even more).

No, you don't have to lose 7 in a row; alternating wins and losses
also take your bankroll toward zero rather rapidly when betting 75% of
your bankroll.

Start $100, bet 75, win, you have $250. Bet 187.50, lose, you
now have $62.50.

You lose 37.5% of your bankroll for every pair of win and loss.

You only come out ahead when you have greater than a 3:2 ratio
of wins to losses, even though every single bet is made with
positive expectation.
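
(Checking that pair arithmetic in Python -- a sketch only:)

    import math

    f, v = 0.75, 2
    pair = (1 + v * f) * (1 - f)  # one win plus one loss, in either order
    print(pair)                   # 0.625 -> down 37.5% per win/loss pair

    # the break-even wins-per-loss ratio r solves (1+v*f)^r * (1-f) = 1
    r = -math.log(1 - f) / math.log(1 + v * f)
    print(r)                      # ~1.51 -- just over the 3:2 quoted above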

> Again, I'm not
> advocating this approach, I'm just pointing out that your bankroll does NOT
> tend to go down, as Stephen claimed.

The bankroll does tend to go down, because the win/loss ratio tends
toward its expected value.

With probability 1 you will eventually go bankrupt if there is a
point at which you cannot subdivide your money further.

I know that it's counterintuitive that it's both +EV and that the
eventual outcome is guaranteed bankruptcy.

Erik Reuter

unread,
Jul 16, 1998, 3:00:00 AM7/16/98
to

> I know that it's counterintuitive that it's both +EV and that the
> eventual outcome is guaranteed bankruptcy.

I think the part that is counterintuitive for many people is the part
about continuing to bet indefinitely. This sort of thing doesn't happen in
real life :-) unless you are immortal and playing against some insane
person who has an infinite amount of money!

Last time Kelly's criterion came up I came up with this intuitive (at
least for me) way of phrasing it:

Suppose there is a gambling game with probability p of winning each time
and which gives the player an edge. If millions of people were to play
this game exactly N times each (each person starting with $100) and each
person bet the Kelly fraction, then you could tally up how many people
finished with each possible amount of money ($100, $120.32, $90.27, etc.).
Assume that N * p is an integer. Then one of these possible amounts will
have more people finishing with it than any other amount; call this amount
A. Betting any fraction other than the Kelly fraction would result in a
smaller value of A. In other words, betting the Kelly fraction maximizes
the most likely outcome for your final bankroll after N plays if N * p is
an integer. (If N * p is not an integer, then Kelly's criterion does NOT
necessarily maximize the most likely final bankroll)
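
(Erik's thought experiment can be checked without millions of people:
the final bankroll depends only on the number of wins w, which is
binomially distributed, so it's enough to find the modal w. A Python
sketch -- N = 20 and the fractions tried are my own choices:)

    from math import comb

    def most_likely_final(start, f, v, p, n):
        # probability of exactly w wins in n independent plays
        probs = [comb(n, w) * p**w * (1 - p)**(n - w) for w in range(n + 1)]
        w_mode = probs.index(max(probs))  # the most likely win count
        return start * (1 + v * f) ** w_mode * (1 - f) ** (n - w_mode)

    # the 2-to-1 coin flip, N = 20 plays (N*p = 10 is an integer):
    for f in (0.15, 0.25, 0.35):
        print(f, round(most_likely_final(100, f, 2, 0.5, 20), 2))
    # peaks at the Kelly fraction f = 0.25 (~$325, vs. ~$271 either side)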

--
Erik Reuter, e-re...@uiuc.edu

Larry Stone

unread,
Jul 16, 1998, 3:00:00 AM7/16/98
to
In article <e-reuter-160...@skywalker.ccsm.uiuc.edu>,
Erik Reuter <e-re...@uiuc.edu> wrote:

>I think the part that is counterintuitive for many people is the part
>about continuing to bet indefinitely. This sort of thing doens't happen in
>real life :-) unless you are immortal and playing against some insane
>person who has an infinite amount of money!

Infinity does funny things to things. We all know that a martingale
betting system (doubling up after a loss) is a loser in the long run. A
series of negative expectation bets must sum to a negative expectation.

Yet, counterintuitively, if you have an infinite bankroll, can find a table
with an upper bet limit of infinity (in other words, no maximum), and can
live long enough, you will make money with probability 1.0.

So how can these counterintuitive concepts be true? I don't claim this to
be any sort of rigorous proof, but I think it's because infinity is really
just a concept and doesn't exist in the real world. Using the martingale
example, while taken to infinity you must win, for any finite number of
trials there's a non-zero probability that you don't win. Infinity is a
great concept, but there's no guarantee that we'll get there anytime soon.

My martingale example is really just the opposite of what Steve is trying
to explain. Mine is negative expectation/guaranteed winner; his is
positive expectation/guaranteed loser.
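
(A Python sketch of the finite martingale makes that concrete; the base
bet and ten-loss cap match Zag's version, and the function name is mine:)

    def martingale_ev(p, base=10.0, max_losses=10):
        # double after each loss; the first win nets +base, then you quit
        ev, prob_still_losing = 0.0, 1.0
        for _ in range(max_losses):
            ev += prob_still_losing * p * base     # reach this round and win
            prob_still_losing *= (1 - p)
        total_lost = base * (2 ** max_losses - 1)  # $10,230 after ten losses
        ev -= prob_still_losing * total_lost
        return ev, 1 - prob_still_losing           # (EV, chance of winning)

    print(martingale_ev(0.5))      # (0.0, ~0.999): fair game, EV exactly zero
    print(martingale_ev(18 / 38))  # (~-6.7, ~0.998): roulette, negative EV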

--
-- Larry Stone
lst...@wwa.com
http://www.wwa.com/~lstone/

jac...@xmission.com

unread,
Jul 17, 1998, 3:00:00 AM7/17/98
to
"Stephen H. Landrum" <slan...@pacbell.net> writes:

> Zagie wrote:
> > Absolutely. I agree completely. I was just trying to refute Stephen Landrum's
> > assertion that, if you bet more than 50% of your bankroll each time, your
> > bankroll would steadily go down.
>
> Well, with probability one, it will if you continue betting this
> proportion of your bankroll indefinitely.

I've watched this for a while, and although I believe that Mr. Landrum
is mostly correct, I'm going to play devil's advocate.

I think the two of you are arguing different aspects of the same
problem, with neither seeing the other's point. Both sides do have
a point, and I believe that both points deserve to be acknowledged.

> You will also have positive expectation.
>
> The positive expectation comes from the infinitesimal number of
> times that your bankroll grows without bound.
>
> Funny things happen in the long run.

Ah, but you can see some interesting things about this paradox in the
short run, if you view effects on a large number of independent players.

Imagine a zillion players who start with $100 and bet half their
bankrolls each time. After the first bet, about half of the players
will have $200, while the other half will have only $50. As a group,
they now have an average of $125, after only a single bet. After the
dust has settled from a second bet, we find that 1/4 of the players
have only $25, 1/2 have $100, and 1/4 now have $400. On average, the
players have $156.25. After a third bet, we find:

1/8 have $12.50
3/8 have $50
3/8 have $200
1/8 have $800

For an average of $195.31.

So, as a group the players are winning a lot of money. They've nearly
doubled their bankrolls (on average) after only three bets. There
is also a balance in the numbers of winners and losers. Half
the players are ahead, and half the players are behind. For every
player who has multiplied his/her bankroll by 8, there is a player
who has divided their bankroll by 8. That balance is what causes
the log(bankroll) to remain unchanged for the group, in spite of the
fact that the group as a whole is winning vast sums of money. All
it really means is that the geometric mean of the bankrolls stays
constant. That's what happens when the expected log(bankroll) is
zero.

However, the fact that the geometric mean is constant cannot really
be construed as evidence that nobody is winning any money.
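
(Steve's zillion-player picture, sketched in Python with exact binomial
weights standing in for the zillion players -- function name mine:)

    from math import comb, log, exp

    def group_means(start, f, v, n, p=0.5):
        # arithmetic and geometric mean bankroll across the group after n bets
        arith, mean_log = 0.0, 0.0
        for w in range(n + 1):
            prob = comb(n, w) * p**w * (1 - p)**(n - w)
            b = start * (1 + v * f) ** w * (1 - f) ** (n - w)
            arith += prob * b
            mean_log += prob * log(b)
        return arith, exp(mean_log)

    for n in (1, 2, 3):
        print(n, group_means(100, 0.5, 2, n))
    # arithmetic mean: 125, 156.25, 195.31...; geometric mean: 100 every time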

> > My claim was that any percent would tend to
> > increase your bankroll. The higher percents would have a higher risk of
> > busting out, but that risk is tiny until you get well above 50%. Even at 75%,
> > you have to lose seven in a row, less than a 1% chance (plus I'll add another
> > 1% chance for the times when you win some but lose even more).
>
> No, you don't have to lose 7 in a row; alternating wins and losses
> also take your bankroll toward zero rather rapidly when betting 75% of
> your bankroll.
>
> Start $100, bet 75, win, you have $250. Bet 187.50, lose, you
> now have $62.50.
>
> You lose 37.5% of your bankroll for every pair of win and loss.

True, but you are focusing only on what happens to the median player,
and this often leads to misunderstandings by the Kelly believers. In
the case of the 50% bettor, their bankroll stays the same if they
win & lose an equal number of bets. This hides the fact that those
who win more than half their bets will come out *way* ahead of those
that lose more than half their bets.

Overbetting doesn't turn the *group* into losing bettors, it merely
tends to distribute the winnings unevenly.

> You only come out ahead when you have greater than a 3:2 ratio
> of wins to losses, even though every single bet is made with
> positive expectation.
>
> > Again, I'm not
> > advocating this approach, I'm just pointing out that your bankroll does NOT
> > tend to go down, as Stephen claimed.
>
> The bankroll does tend to go down, because the win/loss ratio tends
> toward its expected value.

Not true. The entire bankroll of the group will always grow if the players
have an edge. The group bankroll will always grow faster if the group
bets a larger fraction of their bankroll. It is the skewed distribution
of funds that causes the geometric mean (and thus the Kelly utility) to
decrease.

All utility functions have their little quirks. With log(bankroll) utility,
if a single player loses everything, then the geometric mean will drop to
zero even if every other player is fabulously wealthy. With linear utility,
the arithmetic mean can be large if one player is Bill Gates and all the
other players are flat broke.

> With probability 1 you will eventually go bankrupt if there is a
> point at which you cannot subdivide your money further.

So? If you only bet 1.9 times the Kelly optimum, and only play as
a hobby, then you will have about a 50% chance of having massive
wealth after only a few bets.

> I know that it's counterintuitive that it's both +EV and that the
> eventual outcome is guaranteed bankruptcy.

It is equally counterintuitive (but true) that the sum total bankroll
of the overbetting players will *always* exceed the sum total of any
group that bets less. It is a tradeoff, and this argument is IMO
so focused on the extreme cases that interesting effects in the middle
ground are being ignored.

This is a religious battle. You can conjure similar tastes-great/less-filling
arguments with *any* two utility functions. If you look at the arguments
closely, you'll find that they ultimately come down to this:

1) If you use utility function A to determine optimal betting as well
as final results, then utility function B will not fare as well.
It doesn't matter which A/B you pick.

2) If you use utility function B to determine optimal betting as well
as final results, then utility function A will not fare as well.

What Kelly betting *really* does is optimize the outcome for log(bankroll)
optimized bets. Nothing more, nothing less. Kelly betting is the absolute
best way to maximize Kelly's measure of utility. This sounds really
impressive, until you realize that betting based on ANY utility function
is the absolute best way to maximize the utility for that function. This
isn't amazing, it is merely tautological.

Gamblers seem to go through stages:

1) Expectation is everything.

2) Expectation isn't everything (or else you'd "bet it all" every time
you tripped over a favorable situation).

3) Kelly is everything.

A lot of "enlightened" gamblers stop there, having trading in their
EV bigotry for a newfound log(bankroll) bigotry.

The final steps:

4) Kelly isn't everything.

5) Utility functions cannot be compared. No utility function is "better"
than any other; they are simply different ways of measuring the goodness
of your overall outcome.

If you want to give up a little Kelly utility in order to have your mean
bankroll grow faster, go ahead and do it -- it will really piss off the
Kelly purists :-)
--
Steve Jacobs (jac...@xmission.com) \ Do you play Video Poker? Try VP Freebie
"Expectation isn't everything..." \ http://www.conjelco.com/vpfreebie.html

zugz...@dcn.davis.ca.us

unread,
Jul 17, 1998, 3:00:00 AM7/17/98
to
In article <jacobspvf...@xmission.com>,
jac...@xmission.com wrote:

<very nice explanation of utility and difference between linear vs. geometric
maximization of growth rate snipped>

Nice job, Steve. That should end the feud.


Tom Weideman

-----== Posted via Deja News, The Leader in Internet Discussion ==-----
http://www.dejanews.com/rg_mkgrp.xp Create Your Own Free Member Forum

William Chen

unread,
Jul 19, 1998, 3:00:00 AM7/19/98
to
In article <35AE458B...@pacbell.net>,

"Stephen H. Landrum" <slan...@pacbell.net> wrote:

>Zagie wrote:
>> Absolutely. I agree completely. I was just trying to refute Stephen
Landrum's
>> assertion that, if you bet more than 50% of your bankroll each time, your
>> bankroll would steadily go down.
>
>Well, with probability one, it will if you continue betting this
>proportion of your bankroll indefinitely.
>

>You will also have positive expectation.
>
>The positive expectation comes from the vanishingly small fraction of
>times that your bankroll grows without bound.
>
>Funny things happen in the long run.

Actually this isn't really all that surprising (sorry about coming in late to
the discussion). It's the difference between expected return, which is an
average, and *median* return. Suppose everyone in California puts up 20K for a
lottery, and the winner gets a trillion dollars. So basically over time the
median return goes to zero but the expectation increases, because the maximum
return grows exponentially while the probability of achieving the max return
decreases exponentially, but at a slower rate.

So there are stupid betting strategies that almost ensure that you lose all of
your money even though you have positive expectation on all of your bets; we
shouldn't really be surprised by this, since our adversary has an infinite
bankroll. Too bad there aren't equally clever ways of betting with a negative
expectation to realize growth of your bankroll--it seems that there ought to
be, given the stupid betting strategies, but that's not the way it works. So
wait a minute, does this mean that money management is more important than we
give it credit for?

I actually believe so. What we are trying to accomplish ultimately is an
exponential growth of our bankrolls.

I believe that some players go broke by "going up the ladder" faster than
their current bankroll justifies. Even though they may have +EV in all of
the games they play--they may have unrealistic expectations of future
performance based on past performance. A bad session at 100-200 may wipe out
all of last year's play at 6-12. I am repeatedly surprised when a player who
I consider to be a better poker player than I am tells me that they are near
broke or running bad and is looking for a temporary prop job or soft low-limit
game to build their bankrolls. (I think this type of thing is really tough to
admit and most of the players wouldn't say it unless it were true).

Conversely, it may not make sense to play at a game where your rate of return
is low compared to other games/activities even if the players are essentially
giving away money and the risk is small. Maybe this will be the case for me
at my regular game (6-12 stud at the oaks). In most activities you can
experience some exponential growth for a while then the game changes--but
different avenues may open up. For billionaires the only game big enough may
be the stock market to get a good percentage return on their bankroll.
Suppose we suspend disbelief and assume Mike Caro was the best poker player on
the planet and furthermore had a significant 1BB advantage over anyone in the
world. If his bankroll were 10 billion (1/5 of Bill Gates's) well--there just
aren't too many 1million-2million hold'em games.

So what's a reasonable timeframe for a bankroll double at poker?


Bill

Mike Caro

unread,
Jul 19, 1998, 3:00:00 AM7/19/98
to
On Sun, 19 Jul 1998 11:46:05 GMT, William Chen <w_c...@ix.netcom.com>
wrote:

>Suppose we suspend disbelief and assume Mike Caro was the best poker player on
>the planet and furthermore had a significant 1BB advantage over anyone in the
>world.

Bill --

I can't believe you said that. One would not need to "suspend
disbelief" to assume this to be true. That one big bet per hour is, in
fact, the approximate margin of superiority that I enjoy over the
next-best player in the world.

As I've pointed out many times, this margin of superiority is not
calculated using error-prone statistical sampling. Rather it is based
on an actual estimate.

Straight Flushes,
Mike Caro
