
Tournament coin toss question


Patti Beadles

Dec 29, 2001, 5:47:56 AM
It's the first hand of a no-limit tournament in which there is a
reasonable amount of play, and you feel that you have a skill edge.
You have the opportunity [1] to take a coin toss to either double
up or bust out.

Do you believe it's mathematically correct, incorrect, or neutral to
play the coin toss hand? Why?

-Patti

[1] Yes, I know there are no perfect coin tosses, and you can't know
your exact percentages in any case, yada yada yada. Think of it
as the poker equivalent of a theoretical frictionless surface.
--
Patti Beadles |
pat...@gammon.com |
http://www.gammon.com/ |
or just yell, "Hey, Patti!" | Quisque comoedus est.

Jd00123

Dec 29, 2001, 6:36:38 AM
> From: Patti Beadles

>It's the first hand of a no-limit tournament in which there is a
>reasonable amount of play, and you feel that you have a skill edge.
>You have the opportunity [1] to take a coin toss to either double
>up or bust out.
>
>Do you believe it's mathematically correct, incorrect, or neutral to
>play the coin toss hand? Why?
>

If you believe you have a skill edge, why should you take a 50-50 and very
possibly knock yourself out? Give your skill a chance to work.

Barbara Yoon

Dec 29, 2001, 9:25:30 AM
Patti Beadles:

> It's the first hand of a no-limit tournament in which there is a reasonable
> amount of play, and you feel that you have a skill edge. You have the
> opportunity to take a coin toss to either double up or bust out.

> Do you believe it's mathematically correct, incorrect, or neutral to
> play the coin toss hand? Why?

It would be mathematically wrong for you to take such a coin-toss
(even if 'equal skill,' without your "skill edge"), because in tournaments
(other than 'winner-take-all,' that is), each additional chip in your stack is
worth less to you (in terms of expected prize money) than the previous
chip, and so the chips that you would be risking are more valuable to you
than the equal number of chips that you could gain...OK?!

Gregory Raymer

Dec 29, 2001, 10:07:15 AM
Barbara is correct, and not. It is true that the chip you win is worth
slightly less than the chip you lose. However, early in a tourney, this
factor is so negligible that it can be ignored. When you get into or near
the money, this factor becomes quite significant and needs to be factored
into your decisions.

Later, Greg Raymer (FossilMan)

Barbara Yoon <by...@erols.com> wrote in message
news:a0kjs1$5a1$1...@bob.news.rcn.net...

Gregory Raymer

Dec 29, 2001, 10:09:39 AM
Mathematically, it's pretty much neutral. Of course, when we're doing a
math analysis, we assume all players are equal. Since we're told you're
better than equal, this play becomes a small mistake.

I think the real deciding factor is how much, if any, does your edge over
the field increase if you immediately become chip leader. In other words,
if you were 15% better before the first hand was dealt, will you become 25%
better with the biggest stack? If you will remain at 15%, then I would fold
this coin-toss hand.

Later, Greg Raymer (FossilMan)
Patti Beadles <pat...@rahul.net> wrote in message
news:a0k70s$fue$1...@samba.rahul.net...

Freddy Flares

Dec 29, 2001, 10:26:42 AM
One law of physics states that matter can neither be created nor destroyed.
So if the winner of the hand wins less in value than the loser loses, who is
benefiting from this?

Hang on, I think I see it now: the rest of the field benefits because they
move closer to being in the money.

Barbara Yoon

Dec 29, 2001, 10:28:40 AM
Freddy Flares:

E U R E K A ...!!

bobf

Dec 29, 2001, 10:33:27 AM
Does anyone know of a rigorous mathematical proof of the commonly-held
belief that each tournament chip won is worth slightly less than each one
lost? It seems sensible intuitively and qualitatively, but I've never seen
it proven quantitatively.

"Patti Beadles" <pat...@rahul.net> wrote in message
news:a0k70s$fue$1...@samba.rahul.net...

Lee Munzer

Dec 29, 2001, 12:34:43 PM

"Gregory Raymer" ...

> Mathematically, it's pretty much neutral. Of course, when we're doing a
> math analysis, we assume all players are equal. Since we're told you're
> better than equal, this play becomes a small mistake.
>
> I think the real deciding factor is how much, if any, does your edge over
> the field increase if you immediately become chip leader. In other words,
> if you were 15% better before the first hand was dealt, will you become
> 25% better with the biggest stack? If you will remain at 15%, then I
> would fold this coin-toss hand.
>
> Later, Greg Raymer (FossilMan)

My answer will go against the grain and does not consider
favorable/unfavorable blind and ante structures in specific events ("a
reasonable amount of play" is fine for this exercise).

I understand Barbara's mathematical analogy and statement on relative chip
value (I'll even throw in the all-in value that will diminish if you were to
accept the coin flip). That said, Patti specified no-limit, thus if I were
playing a no-rebuy, five-hour NL HE event, I'd accept the outcome of
the toss. Alas, while I believe I understand the pros and cons, I can't
quantify the pro side. The chip leader's fringe benefits (mainly moving
players off hands) are enough for me. Besides, I figure I rate to win 54% of
the flips by calling tails, right?

Lee

Reraise1

Dec 29, 2001, 2:40:16 PM
I understand the YOONER...and I'm not sure I would risk all my chips on a
50/50-ish proposition on the first hand of a tournament...however...since this
is no-limit, I'm not sure that the argument about the worth of chips is
relevant at this point of the event in relation to chip worth at the end of
the event (i.e., a chop of the event monies)...Like they say, in order to win
you have to win with AK and beat AK (with a lesser hand)...in terms of
skill...let's look at what happens if you did take the gamble and won...and
now used your "skill," as Patti alludes...those chips, in terms of tournament
play, are now worth a lot more towards winning the event, or accumulating
enough chips to pose a threat to the majority of players with fewer chips,
than not having them...you can bet second pairs strongly with them, and if
you lose...who cares...you're trying to increase your position in order to
move up...you're playing poker...math is math, but no-limit poker is also
poker...
The question is more of a tournament strategy question...and the answer there
would be...the math is not strongly in your favor to risk all your chips on
basically a 50/50 proposition...you can't get more chips if you lose...you're
out...

JP Massar

Dec 29, 2001, 3:08:03 PM
On Sat, 29 Dec 2001 07:33:27 -0800, "bobf"
<robertf...@email.msn.com> wrote:

>Does anyone know of a rigorous mathematical proof of the commonly-held
>belief that each tournament chip won is worth slightly less than each one
>lost? It seems sensible intuitively and qualitatively, but I've never seen
>it proven quantitatively.
>

All the chips, when possessed by the collection of players, are worth
the prize pool.

All the chips, when possessed by a single player, are only worth first
prize, which is usually around 40% of the prize pool.

Since you start with 1/N of the chips and 1/N of the equity (N = # of
players entering the tournament), all else being equal, you cannot
arrive at the limit of all the chips being worth only a fraction of
the prize pool without the value of additional chips being worth less
than value of the chips you currently possess.

QED

Patti Beadles

Dec 29, 2001, 3:36:00 PM
In article <e$eFMpIkBHA.1232@cpimsnntpa03>,
bobf <robertf...@email.msn.com> wrote:
>Does anyone know of a rigorous mathematical proof of the commonly-held
>belief that each tournament chip won is worth slightly less than each one
>lost? It seems sensible intuitively and qualitatively, but I've never seen
>it proven quantitatively.

Pick your favorite tournament equity formula and do the math for
different stack sizes, and you'll see it.
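
Patti's suggestion is easy to try. Here is a minimal sketch of one common
choice of formula, the Malmuth-Harville model (often called ICM); the
four-player stacks and the 50/30/20 payout schedule are illustrative numbers,
not from the thread:

```python
def icm_equity(stacks, payouts):
    """Prize-money expectation for each stack under the Malmuth-Harville
    (ICM) model: P(player i finishes 1st) = stack_i / total chips; then
    remove that player and recurse on the remaining payout places."""
    def expect(idx, pays):
        if not pays:
            return {i: 0.0 for i in idx}
        total = float(sum(stacks[i] for i in idx))
        result = {i: 0.0 for i in idx}
        for i in idx:
            p_first = stacks[i] / total
            rest = tuple(j for j in idx if j != i)
            result[i] += p_first * pays[0]
            for j, v in expect(rest, pays[1:]).items():
                result[j] += p_first * v
        return result
    eq = expect(tuple(range(len(stacks))), tuple(payouts))
    return [eq[i] for i in range(len(stacks))]

# Four equal stacks, $1,000 pool paid 50/30/20: each is worth $250.
print(round(icm_equity([1000, 1000, 1000, 1000], [500, 300, 200])[0], 2))

# Double through (one player busts): your equity rises to ~$383, not $500.
print(round(icm_equity([2000, 1000, 1000], [500, 300, 200])[0], 2))
```

Doubling up here buys much less than double the equity, which is Barbara's
point; with a large field and deep stacks the haircut is far smaller, which
is Greg's.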

Or here's a fairly easy "proof"-- You buy in to a $100 tourney and get
T100 in chips. There are 100 players, and first prize is $4000.

Clearly, at the start of the tournament, each chip in your stack has a
value of $1. At the end of the tournament, each chip in your stack is
worth $.40.

-Patti
--
Patti Beadles |
pat...@gammon.com |
http://www.gammon.com/ | The crazy chick with
or just yell, "Hey, Patti!" | the purple hair.

bobf

Dec 29, 2001, 4:44:23 PM
Thank you to JPMassar and Patti Beadles for responding to my question in
this thread. I still seem to be missing something though. Say you double
up on the first hand of Patti's 40 person, $100 buy-in tournament. It seems
to me that your equity has gone from $100 to $200--ie, that each aditional
chip is worth exactly $1. Why is this not the case? All else being equal,
your equity must be exactly twice that of each of the 38 other remaining
players, mustn't it?


"Patti Beadles" <pat...@rahul.net> wrote in message
news:a0k70s$fue$1...@samba.rahul.net...

Randy Hudson

Dec 29, 2001, 6:52:18 PM
In article <eJrEI6LkBHA.1236@cpimsnntpa03>,
bobf <robertf...@email.msn.com> wrote:

> Say you double up on the first hand of Patti's 100-person, $100 buy-in
> tournament. It seems to me that your equity has gone from $100 to
> $200 -- i.e., that each additional chip is worth exactly $1.

Since at the end the prize money will not be paid based on how many chips
you have, but rather based on when you busted out, the marginal value of
chips is a declining function of stack size.

Furthermore, Patti stipulated that your expectation was above-average; that
is, your starting equity exceeded $100.

So doubling your stack less than doubles your tournament equity.

Yet, that doesn't make it wrong to go for it, if there are other
opportunities for positive-equity play. Suppose you see a single-table
satellite forming with a field even weaker than your current tournament.
That creates a safety net; if you bust out of this tournament, you can jump
into that satellite with good positive expectation. In that case, losing
all your chips isn't losing all your equity.

If your own time is valuable, perhaps busting out immediately and freeing up
an expected couple hours of your time is worth a fair amount. Again, while
doubling up doesn't double your current equity, this safety net means that
busting out doesn't lose all your equity either.

Similarly, if this is a rebuy tournament, and you view your marginal equity
from doubling up as worth more than the cost of a rebuy, it's correct to
gamble and toss the coin - heads, you double up, and gain more than a rebuy
in equity; tails, bust, rebuy, and have about the same equity as before
(your share of chips is slightly smaller than before, but your rebuy makes
the prize pool bigger; overall, your equity after the rebuy will be a tiny
bit greater than before losing a rebuy stack in a place-paying tournament
unless they are charging an entry fee on rebuys or equivalently raking a
percentage of the prize pool.)

--
Randy Hudson <i...@panix.com>



Rob Oldfield

Dec 29, 2001, 7:50:08 PM
...surely 54.2%? Please try and get your facts correct.


Lee Munzer wrote in message ...

Pete Watts

Dec 29, 2001, 9:47:47 PM
A great pleasure to see this topic discussed, even though the fewer
people know the correct answer, the better it is for me. However, I
didn't really know the correct answer myself -- while it is true that
your equity from doubling up is not quite doubled, I had not clicked
that this difference is too marginal to have practical relevance, and
that the call is effectively neutral. Having just bombed out fifth in a
top-four-prized tourney lasting four hours, I was strongly reminded of
a principle that I had already guessed, namely that last out before
the money is the worst possible result, and that first out is actually
the very best possible result other than being in the money. The
reason is that the first out can promptly go off and play in a different
game and make money there, whereas the last player off the money has
put in hours, or even days, for nothing. This assumes of course that
you're a good player who generally does win and that there is another
game to go to, and also ignores the fact that you might *enjoy*
playing those hours, whereas going out first makes you feel a chump
and that you've wasted your money. Of course, last off the money is deeply
frustrating, and perhaps therefore SECOND-last off the money is better
for not spoiling your day so much!

This is interesting because the general received wisdom appears to be
that you should play tighter in tourneys, or at the very least try to
avoid all-ins. The reason is that even if you have a 1/2 chance of
doubling your equity with the all-in, if you survive and do the same
thing later your chances of surviving are now only .5*.5 or .25; do
it again and you go down to .125. However, by increasing your
equity you also make it *less likely* that you will be taken all-in
again, and hence this argument too may be flawed. Also, mindful of
the reluctance of people to call all-in, you can exploit this by going
all-in yourself with a raise. A while back I saw a guy go all-in with
45s, get called nearly all-in by a guy with AA, and get promptly
rewarded by hitting his flush. If it ain't right to call with AA, it
ain't right to call, period. If you check the stats, AA is only about
an 80:20 favorite over 45s, and of course normally you'd expect he
would have had something a lot bigger. Ironically, if he had had AKs,
the AA guy's chances would have been better.

I have been experimenting with overbets recently, and my reward was to
see a guy go all-in against my KK; I called, and saw his AA hold up
and take me out. Of course, I really should have known that the
"impossible" had happened and he had the AA. But I'm no Phil Hellmuth,
and I didn't fold. The problem with overbets is that though they get
rid of drawers, the *only* calls you are likely to get are from
monsters who will reraise and beat you, so you either win a small pot
or lose a big one. This makes the theory rather complex and I'll have
to do a bit more work on it. On that note, someone mentioned their
"favorite equity formula". I've never seen one -- can someone tell me
one?

PW/BluffKingHal online

Barbara Yoon

Dec 29, 2001, 9:49:36 PM
bobf:

> Does anyone know of a rigorous mathematical proof of the
> commonly-held belief that each tournament chip won is worth
> slightly less than each one lost? It seems sensible intuitively
> and qualitatively, but I've never seen it proven quantitatively.

JP, Patti, and Randy have all answered your question in their different ways...
I myself tend to understand complicated things like this by relying more on
concrete examples... So here, let's say that you're one of the final three players,
with equal chip stacks, competing for 1st, 2nd, 3rd prizes of $1,200, $900, $600
-- at this point, your stack is worth a prize expectation of $900 (1/3rd of the
total $2,700 prize pool -- right?!). And if you LOSE your stack, you get the
$600 3rd prize, for a loss of $300 from your initial $900 expectation. But if
you WIN, knocking out one of your opponents, and DOUBLING your stack,
your prize money expectation grows to $1,100 (as you would then have 2/3rds
of the chips, competing for the 1st, 2nd prizes of $1,200, $900 -- right?!), for
a gain of only $200 from your initial $900 expectation. So losing your stack
would cost you $300, but winning another stack of equal size would gain you
only $200. I hope we've been able to clarify this complicated issue here...OK?!
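
Barbara's numbers check out; here is the arithmetic in runnable form. The
only added assumption is the usual one that, heads-up with equal skill, each
player's chance of finishing first is proportional to stack size:

```python
prizes = [1200, 900, 600]

# Three equal stacks: by symmetry, each player expects a third of the pool.
before = sum(prizes) / 3                    # $900

# Lose the flip: you bust now and collect 3rd-place money.
after_loss = prizes[2]                      # $600

# Win the flip: heads-up holding 2/3 of the chips, so a 2/3 shot at 1st.
after_win = (2/3) * prizes[0] + (1/3) * prizes[1]   # $1,100

print(round(before - after_loss))   # 300 at risk...
print(round(after_win - before))    # ...to win only 200
```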

Dsklansky

Dec 30, 2001, 1:27:04 AM
>Barbara is correct, and not. It is true that the chip you win is worth
>slightly less than the chip you lose. However, early in a tourney, this
>factor is so negligible that it can be ignored. When you get into or near
>the money, this factor becomes quite significant and needs to be factored
>into your decisions.
>
>Later, Greg Raymer (FossilMan)

But the original poster said you had a skill edge. THAT is the main reason a
coin toss would be very wrong, even at the beginning of the tournament.

Dsklansky

Dec 30, 2001, 1:32:05 AM
The only good argument for the coin flip by the good player is the hourly rate
one. Period.

TadPerry01

Dec 30, 2001, 5:32:45 AM
dsklansky comments:

So are we saying that it's possible to be both ahead and behind?

I play in NL tournaments occasionally. I am good at knowing when I'm ahead, but
it's not always by a large margin. Am I actually WRONG to press things when the
edge is marginal?

Separately, what percentage edge do you need to go all-in on that first hand?
If 50-50 isn't enough, is 60-40? Where's the line? How does one figure it?

tvp

Barbara Yoon

Dec 30, 2001, 1:09:55 PM
Patti Beadles:
>>> It's the first hand of a no-limit tournament in which there is a reasonable
>>> amount of play, and you feel that you have a skill edge. You have the
>>> opportunity to take a coin toss to either double up or bust out.
>>> Do you believe it's mathematically correct, incorrect, or neutral to
>>> play the coin toss hand? Why?

>> It would be mathematically wrong for you to take such a coin-toss
>> (even if 'equal skill,' without your "skill edge"), because in tournaments
>> (other than 'winner-take-all,' that is), each additional chip in your stack is
>> worth less to you (in terms of expected prize money) than the previous
>> chip, and so the chips that you would be risking are more valuable to you
>> than the equal number of chips that you could gain...OK?!

Greg Raymer (FossilMan):
>> Barbara is correct, and not. It is true that the chip you win is worth
>> slightly less than the chip you lose. However, early in a tourney, this
>> factor is so negligible that it can be ignored. When you get into or near
>> the money, this factor becomes quite significant and needs to be factored
>> into your decisions.

Dsklansky:
>> But the original poster said you had a skill edge. THAT is the main reason
>> a coin toss would be very wrong, even at the beginning of the tournament.

Amusing how here in 'contentious' RGP, three responders to Patti's question
can superficially appear to be 'arguing' with each other, when in fact there is
absolutely no real conflict at all between them -- just slightly different focuses...

Barbara Yoon

Dec 30, 2001, 1:52:31 PM
Patti Beadles:
>>> It's the first hand of a no-limit tournament... ...you have the opportunity
>>> to take a coin toss to either double up or bust out.

>> ...mathematically wrong...because...each additional chip in your stack is
>> worth less to you than the previous chip...

Greg Raymer (FossilMan):
>> However, early in a tourney, this factor is so negligible that it can be ignored.

TadPerry01:
> [So then] what percentage edge do you need to go all-in on that first hand?
> If 50-50 isn't enough, is 60-40? Where's the line? How does one figure it?

"It depends"...on how many players, the prize structure, relative skill
assumptions, and such -- but typically, assuming equal skill, I suspect
that as "FossilMan" suggests, it's likely closer to your "50-50"...OK?!

Patti Beadles

Dec 31, 2001, 4:25:23 AM
OK, here's more of the question, and the part where it gets tricky.
Let's quantify it.

Let's assume that it's incorrect for you to take the coin toss if you
have both a skill advantage, and the luxury of time to use it. (I'd
probably take the coin toss in a single table satellite with short
rounds, for example, since I wouldn't have as much time for my skill
edge to be a major factor. I wouldn't in a large tournament with a
relatively weak field.)

The key concept, I think, is that if you take the coin toss, you're
completely nullifying your skill edge 50% of the time.

Would you take 51/49 on the first hand? 55/45? Where do you draw the
line? It seems to me like you should be able to calculate that point
based on your skill edge, but how?

-Patti
--
Patti Beadles |
pat...@gammon.com |
http://www.gammon.com/ | Dammit! I've got a date tonight
or just yell, "Hey, Patti!" | and I can't find my rope.

Barbara Yoon

Dec 31, 2001, 12:07:36 PM
Patti Beadles:
> [on advisability of going with 'double-or-nothing' coin-toss at beginning
> of tournament] OK, here's...where it gets tricky. Let's quantify it.
> ... Would you take 51/49 on the first hand? 55/45? Where do you
> draw the line? It seems to me like you should be able to calculate
> that point based on your skill edge, but how?

Your "skill edge" notion here makes your already difficult question even
more complicated... But no matter what, the answer depends on the
PRIZE STRUCTURE -- for examples, assuming EQUAL skill, in
a TEN-player tournament, if it were 'winner-take-all,' then you could
advantageously go for the 'coin-toss' with anything better than "50/50;"
but if the prizes were seats in 'The Big One' for the top NINE finishers
(and nothing for tenth), then you would need "90/10"...see?!

bobf

Dec 31, 2001, 3:10:24 PM
One way might be to calculate, in percentage terms, how many more final
tables (or top three finishes, or whatever) you've had than chance alone
would yield. That percent, or something close to it, would seem to be the
edge you'd want.

In reality, I wonder whether anyone has played enough tournaments and kept
accurate enough records to get a statistically significant result.
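
bobf's record-keeping worry can be made concrete with a standard sample-size
estimate. Everything numeric here is an illustrative assumption, not from the
thread: 100-player fields with nine-handed final tables (so chance alone
gives a 9% final-table rate) and a player whose true rate is half again
higher:

```python
import math

p0 = 0.09    # final-table rate from chance alone (9 seats in 100 players)
p1 = 0.135   # assumed true rate for a genuinely good player (1.5x chance)

# Tournaments needed for a one-sided test at the 5% level with 80% power,
# using the normal approximation to the binomial.
z_alpha, z_beta = 1.645, 0.842
n = ((z_alpha * math.sqrt(p0 * (1 - p0)) +
      z_beta * math.sqrt(p1 * (1 - p1))) / (p1 - p0)) ** 2
print(math.ceil(n))   # roughly 285 tournaments -- years of play for most
```

Even a sizable 1.5x edge takes hundreds of events to distinguish from luck,
which supports bobf's skepticism.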


"Patti Beadles" <pat...@rahul.net> wrote in message

news:a0pau3$gaf$1...@samba.rahul.net...

Barbara Yoon

Dec 31, 2001, 3:46:12 PM
Patti Beadles:
>>> [on advisability of going with 'double-or-nothing' coin-toss at beginning of
>>> tournament] Would you take 51/49 on the first hand? 55/45? Where do
>>> you draw the line. It seems to me like you should be able to calculate...

>> ...depends on the PRIZE STRUCTURE -- for examples, assuming
>> EQUAL skill, in a TEN-player tournament, if it were 'winner-take-all,'
>> then you could advantageously go for the 'coin-toss' with anything better
>> than "50/50;" but if the prizes were seats in 'The Big One' for the top
>> NINE finishers (and nothing for tenth), then you would need "90/10"...

The two examples given above are what you could call 'boundary limits'
-- that is, the real answer for more 'typical' prize structure lies somewhere
in between "50/50" and "90/10"...OK?!
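
Barbara's two boundary cases can be computed directly under her stated
assumptions (ten players, equal skill, first hand of the event):

```python
n = 10

# Winner-take-all: with equal skill, equity is simply proportional to
# chips, so declining keeps 1/n of the prize and winning the flip holds 2/n.
# Break-even: p * (2/n) = 1/n
p_winner_take_all = (1/n) / (2/n)
print(p_winner_take_all)   # 0.5 -- any edge at all justifies the flip

# Super-satellite paying the top nine: declining keeps a 9/10 chance of a
# seat; winning the flip busts an opponent, guaranteeing the nine survivors
# (you included) their seats; losing pays nothing.  Break-even: p * 1 = 9/10
p_top_nine_paid = (n - 1) / n
print(p_top_nine_paid)     # 0.9 -- you'd need Barbara's "90/10"
```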

Dsklansky

Dec 31, 2001, 11:43:24 PM
Neglecting hourly rate considerations, the answer should be pretty close to
your assessment of your chances of doubling up versus going broke.

JonCooke

Jan 2, 2002, 7:28:21 AM
dskl...@aol.com (Dsklansky) wrote in message news:<20011230013205...@mb-fi.aol.com>...

> The only good argument for the coin flip by the good player is the hourly rate
> one. Period.

It's not just the size of your edge; it's how the edge manifests
itself:

I like Lee's point: if the table is risk-averse (playing tighter than
optimum vs. big stacks), a big stack may have additional value early on.
Just before the final table there is too little calling/too much
stealing in my opinion. Being a big stack here can certainly have
mathematically unwarranted value...

Personally, I tend to avoid the close ones early. I find my edge comes
in the middle to latter stages of play. If I'm still around I do well.
I think the image created by playing tight early helps me to steal
more later on, when the rewards tend to be greater.

Quite a complex issue, though, and more complex than DS says: if you
play well early with a big stack and struggle with average chips,
maybe you're better off taking the 50/50.

Alan Bostick

Jan 2, 2002, 2:38:08 PM
Patti Beadles <pat...@rahul.net> wrote in message news:<a0k70s$fue$1...@samba.rahul.net>...

> It's the first hand of a no-limit tournament in which there is a
> reasonable amount of play, and you feel that you have a skill edge.
> You have the opportunity [1] to take a coin toss to either double
> up or bust out.
>
> Do you believe it's mathematically correct, incorrect, or neutral to
> play the coin toss hand? Why?
>
> -Patti
>
> [1] Yes, I know there are no perfect coin tosses, and you can't know
> that your exact percentages in any case, yada yada yada. Think of it
> as the poker equivalent of a theoretical frictionless surface.

What is the value of my time, for purposes of this discussion?

If I bust out on the first hand, I lose all my equity in the tournament
prize pool. If I double up, my equity in the prize pool almost-but-not-quite
doubles, and the length of time I expect to continue playing in the
tournament increases. If I bust out, my prize equity vanishes, but I
now have several more hours of time to use than I did before.

If the tournament buy-in is on the order of $100, and there's a juicy $15-$30
game going on, and my mid-limit win rate is between one and two bets per
hour, then gambling on the first hand might well be correct! If I win the
hand, my tournament equity doubles; if I lose the hand, the expected win
from the extra time in the $15-$30 game replaces the lost tournament equity.
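
Alan's trade-off becomes a one-line expected-value comparison once numbers
are attached. All figures below are illustrative assumptions: $100 of
starting tournament equity, a double-up worth roughly $190 after the ICM
haircut, a $30/hour expected win in the side game, and about two hours of
play freed by busting:

```python
tourney_equity = 100.0      # starting equity in a $100 buy-in event
equity_if_double = 190.0    # a double-up is worth slightly less than 2x
side_game_rate = 30.0       # ~1-2 big bets/hour at $15-$30
hours_freed = 2.0           # expected extra hours if you bust immediately

ev_decline = tourney_equity
ev_flip = 0.5 * equity_if_double + 0.5 * (side_game_rate * hours_freed)
print(ev_flip)   # 125.0 -- a good enough side game can make the flip right
```

Note that if the bust-out branch alone ever exceeds your tournament equity,
entering the tournament was itself the questionable decision.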

Alan Bostick

Jan 2, 2002, 2:56:55 PM
I should probably add that if my live game win rate is such that I *don't*
lose EV by gambling on the coin toss, my decision to play in the tournament
in the first place could well have been a poor one.

Henry Estes

Jan 2, 2002, 3:08:25 PM
"On 2 Jan 2002 11:56:55 -0800, in article
<6c8f54bf.02010...@posting.google.com>, Alan Bostick wrote..."

>
>I should probably add that if my live game win rate is such that I *don't*
>lose EV by gambling on the coin toss, my decision to play in the tournament
>in the first place could well have been a poor one.

I was wondering this but decided "juicy" was an important factor in your
decision.


Bill chen

Jan 3, 2002, 6:46:44 AM
dskl...@aol.com (Dsklansky) wrote in message news:<20011231234324...@mb-fi.aol.com>...

> Neglecting hourly rate considerations, the answer should be pretty close to
> your assessment of your chances of doubling up versus going broke.

Except that if you are a skilled player, you increase your winning
chances by doubling up immediately rather than later. This is because
you should have some positive expectation for your play. Suppose you
have 2000 and the next round is an hour of 50-100. Then if you are a
1 big bet/hr winner, having 2000 at the beginning of the round should
be (to borrow David's phrase) pretty close to having 2100 at the end
of the round. Also because of the same effect, P(doubling up) is
greater earlier and less later.

This further complicates the question. In David's model, if you assume
you have twice the average chance of winning in a 128-player field,
your chances of doubling up are roughly 55% on average. It's a
reasonable assumption to say P(doubling) is 55% if you are an average
stack, less if less than average. So say you are average with 1000 at
the beginning of the 50-100 round, and a genie (or some other plot
device) gives you the chance to double up immediately; what odds do
you need? Well, figure it will take you a round on average to
double up, which seems reasonable since the variance per hour-long round
is 10 big bets. If you win you will have 2000, which we have said to be
worth 2100 one round later, as opposed to taking a 55% chance to have
2000 one round later. Hence you should need a 52.5% chance of winning
the coin flip.
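
Bill's break-even point works out as follows under his own assumptions
(stacks valued in "chips one round from now", a 1 big bet/hour win rate, and
declining treated as a 55% shot at 2000 a round later and nothing otherwise,
a deliberate simplification):

```python
stack = 1000
win_rate = 100          # 1 big bet/hour at 50-100 = T100 per round

# Accept the genie's flip and win: T2000 now, worth ~T2100 a round later.
value_if_win = 2 * stack + win_rate          # 2100

# Decline: in the simplified model, a 55% shot at T2000 a round later.
value_if_decline = 0.55 * 2 * stack          # 1100.0

p_breakeven = value_if_decline / value_if_win
print(round(p_breakeven, 3))   # 0.524 -- Bill's "52.5%"
```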

Note that you only need a 50.6:49.4 edge to risk half your chips.
Now, many assumptions are made here and many more left out that should
be considered, but I've tried a few different models and I have gotten
similar results. What does this mean to me in practice? Well, in
general I think I am a pretty quantitative guy, but I can't really say
"well, based on my read of the range of hands my opponent(s) have,
including implied odds, I think I have a 1% edge here, but since this
is a tournament situation I can now fold!" "But wait a minute, that
15-30 game over there seems juicy. Let me see, based on the players there
my win rate is $30/hr. So, like, time value converted to tourney chips
is..."

Seriously though, don't sweat it too much until it gets close to the
money. At the beginning of a tourney just put in your chips if you
have an edge, fold if it just seems like a toss up. Not bad advice for
a ring game either.

Dsklansky

Jan 3, 2002, 1:37:15 PM
>> Neglecting hourly rate considerations, the answer should be pretty close to
>> your assessment of your chances of doubling up versus going broke.
>
>Except that if you are a skilled player, you increase your winning
>chances by doubling up immediately rather than later.

Yes. That is why I said pretty close.


jacksup

Jan 3, 2002, 2:28:18 PM
Nice post, Bill. Just one question:

> In David's model if you assume
> you have twice the average chance of winning in an 128 player field,
> your chances of doubling up are roughly 55% on average.

Could you explain how you arrived at this figure? It's not obvious to
me, and the 55% is crucial to your conclusions so I want to be sure to
understand how you got it.

Thanks a lot,
Matt

Bill chen

Jan 3, 2002, 4:33:03 PM
mattm...@hotmail.com (jacksup) wrote in message news:<473b8d61.02010...@posting.google.com>...


Well, you need to double up 7 times to win. So say your chance of
winning is 1/64 instead of 1/128. Let x be your chance of doubling
up. Then we have x^7 = 1/64. So we solve for x.

Bill
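
The solve, spelled out (this follows Bill's modeling assumption that winning
a 128-player tournament means doubling up seven straight times):

```python
# A player with twice the average chance wins with probability 1/64.
# Seven consecutive double-ups take one starting stack to all 128 stacks,
# so the per-double-up win probability x satisfies x**7 == 1/64.
x = (1 / 64) ** (1 / 7)
print(round(x, 3))   # 0.552 -- Bill's "roughly 55%"
```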

NOSPA...@sebastian9.com

Jan 3, 2002, 10:00:21 PM

According to JP Massar <mas...@alum.mit.edu>:

But that's not a proof of the question as stated, and certainly not
rigorous. All that you've shown is that the accumulation of *all*
the remaining chips is worth less, on a per-chip basis, than your
original stack (and that only for place-paying tournaments, where
all players are equally skilled). In order to show that *each* chip
won is worth progressively less, you would at least need to show
that the marginal value of additional chips is monotonic with
respect to the number of chips in your stack at all stages of the
tournament, and you haven't come close to showing that yet. For example,
it might well be the case that going from 1/N to 2/N of the chips early
on more than doubles your chance of winning, but the difference between
70% and 90% of the chips is negligible if everyone else is still at 1/N.

Such a situation might arise if players are typically giving up
significant chip EV to avoid the risk of busting out. Then the first
few chips you win early might add significant value in letting you
exploit more marginal situations, but beyond that the utility might drop off.
So it might even be the case that it's right to push marginal opportunities
early if and only if most of your opponents think it's wrong.

--
Dave Wallace (Remove NOSPAM from my address to email me)
It is quite humbling to realize that the storage occupied by the longest
line from a typical Usenet posting is sufficient to provide a state space
so vast that all the computation power in the world can not conquer it.

jacksup

Jan 7, 2002, 6:35:35 PM
Bill,

Thanks for explaining the 55%. But wait a minute, I have a new
question.

> Well, if you figure it will take you a round on average to
> double up, which seems reasonable since the variance per hour round is
> 10 big bets. If you win you will have 2000, which we have said to be
> worth 2100 one round later, as opposed to taking a 55% chance to have
> 2000 one round later. Hence you should have a 52.5% chance of winning
> the coin flip.

Except that if I lose the coin flip, I'm out of chips. This will
happen 47.5% of the time. In the case where I decline the coin flip,
55% of the time I will have 2000 one round later, but the other 45% of
the time, I will usually still have chips, right? I would think there
are many cases where a good player does not double up, but at least
maintains his chip count. And that has to increase his overall
tournament EV. Therefore I would think you would need significantly
better than a 52.5% chance to take the coin flip.

What do you think? Are there some flaws in this reasoning?

Thanks,
Matt
