You have $X but need $Y. (I don't think X and Y matter much; let X = 100,
Y = 200 if you like.) Your only option is a roulette wheel, with American
rules. American roulette allows various bets, all with identical "house
commissions" but different pay-offs, 1-1, 2-1, 35-1, etc. (Indeed any
payoff (36 minus K)-to-K where K is an integer 1 <= K <= 35. Your
probability of success is K/38 in each case.)
For simplicity, the casino allows any real-valued bet 0 < x <= B where B is
your present bankroll, and you'll have time for hundreds of betting rounds
if you need them.
What strategy maximizes the probability you will achieve $Y before your
bankroll disappears?
Note that the "utility of money" relevant for this problem is trivial. If
you achieve $Y (perhaps the price of an urgently needed drug) you live
happily ever after. If you end up with any amount less than $Y, you die a
gruesome death.
I'm hoping members of this group will treat this as a poll and answer
honestly, from choices like:
(A) Already knew this. Yawn.
(B) Oh, I see. Interesting!
(C) Nope, you "mathematicians" are wrong.
(* - the way I've phrased the poll assumes that some mathematicians here
will quickly give the "correct" answer. I *hope* this isn't wishful
thinking on my part....)
The problem *may* be too easy for this ng, but I am fairly confident that
(C) would be the prevalent poll answer in any non-mathematical ng. :-(
Recently in another, relatively intelligent, forum I was not surprised to
find that no one knew the answer, but quite exasperated to find that
no one seemed to even believe the answer, once presented!
Septimus G. Stevens
Always bet on a single number, making the smallest bet that will
achieve the goal if you win. In your example $2.86, then $2.94, and so on
until you either win or have to bet everything. For your example, the
probability of success with this method is about 0.4808, which is
slightly better than the 18/38 ~= 0.4737 of a single $100 bet at 1:1.
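[This procedure is easy to check numerically. A minimal sketch, assuming
the X=100, Y=200 example and 35-to-1 single-number payoffs; note that a
minimal bet lands exactly on the goal when it wins, and a forced all-in
win still falls short of the goal, so play continues. -- ed.]

```python
def success_prob(start=100.0, goal=200.0, pay=35, p=1/38, tol=1e-12):
    """P(reach `goal`) betting single numbers at pay-to-1, always staking
    the smallest amount that reaches the goal on a win; all-in if short."""
    prob, state = 0.0, [(start, 1.0)]          # (bankroll, path weight)
    while state:
        B, w = state.pop()
        if w < tol or B <= 0:                  # negligible path: truncate
            continue
        need = (goal - B) / pay                # stake hitting goal exactly
        if need <= B:
            prob += w * p                      # a win ends the game
            state.append((B - need, w * (1 - p)))
        else:                                  # forced all-in; a win pays
            state.append(((pay + 1) * B, w * p))  # 36x but still falls short
    return prob

print(success_prob())      # about 0.4808, vs 18/38 ~= 0.4737 for one flat bet
```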
If the bet quantities are quantized (as is the case in real life), the
optimum strategy may become much more complex.
If neither bets *nor* probabilities are quantized (e.g. you get to
define a sector of a continuous wheel with fixed 2/38 extra
probability of loss), then you can do better by defining
microscopically small sectors with correspondingly small bets.
E.g. 999999:1 yields 0.481422 probability of success. The probability
is bounded away from 0.5, but I have not yet been able to ascertain an
elementary expression for the supremum.
- Tim
I have now evaluated the limiting probability: it is 1 - (1 - X/Y)^p,
where p is the probability of not hitting the "house" segment of the
disk.
So with X=100, Y=200 the upper bound on success probability is
1-(1/2)^(36/38) ~= 0.4814224778.
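[The quoted value can be reproduced directly from the closed form; a
quick check. -- ed.]

```python
X, Y = 100.0, 200.0
p = 36 / 38                    # probability of missing the house segment
limit = 1 - (1 - X / Y) ** p   # Tim's limiting success probability
print(limit)                   # ~0.481422
```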
- Tim
First, to answer OP's poll, I'd place myself
somewhere between (A) and (B). I was already
familiar with the general idea but, inspired by
Tim Little, I have written a program.
(Had you seen it before, Tim? Is it "well known"?
I wouldn't know how to Google for an on-line explication.)
Tim Little wrote:
> If the bet quantities are quantized (as
> is the case in real life), the optimum
> strategy may become much more complex.
It seems fair to match ideal with ideal,
or practical with practical. In practice
the gambler probably doesn't need $200
*exactly*; he just wants to feel comfortable
taking the girlfriend to see a show.
Let me reformulate OP's problem slightly.
It may be more convenient to use "for 1"
odds instead of "to 1"; that is to call
the single-number roulette bet "36 for 1"
instead of "35 to 1." This saves some
writing right off the bat: OP's
> (36 minus K)-to-K payoff
becomes
> 36 for K
or rather, 36/K for 1.
OP's puzzle can be (more-or-less) restated as
Given a K-for-1 bet with house vigorish V, use
multiple betting rounds to construct a K'-for-1 bet.
What is then the effective vigorish V' ?
Here,
V = 1 - K*p
where p is the probability of winning the K-for-1 bet.
Actually OP asks, Given various K with equal V,
which K minimizes V'? But I'm sure Tim has already
answered that question correctly: the largest
such K is best.
Similarly, given various V with equal K, the
smallest V is best. What's interesting is when
both K and V vary. The best bet then may depend
on K'. In other words, a Keno ticket may be a
better bet than Roulette, despite its higher
vigorish, *if* your goal K' is very large.
Returning to the problem stated above, we want
to find
V' = f(K, V, K')
For any particular (K, V, K') this can be estimated
with software that implements the procedure described
by Tim Little. I've written such a program and
uploaded it to
http://james.fabpedigree.com/roulette.c
Developing the function V' analytically may be
more difficult. When K' is a power of K
log(K') * log(1 - V) = log(K) * log(1 - V')
and that formula may work approximately for
any very large K'.
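[When K' = K^n the relation is exact, since reaching K' is just n
successive K-for-1 wins, each letting the whole bankroll ride: the win
probability is p^n, so V' = 1 - K^n * p^n = 1 - (1-V)^n, which is what
the formula gives. A small check with the single-number values. -- ed.]

```python
import math

def vig_from_formula(K, V, Kprime):
    """log(K') * log(1-V) = log(K) * log(1-V'), solved for V'."""
    return 1 - (1 - V) ** (math.log(Kprime) / math.log(K))

K, V, n = 36, 2/38, 3
p = (1 - V) / K                    # win prob of the K-for-1 bet
exact = 1 - K**n * p**n            # chaining n wins: equals 1 - (1-V)**n
assert abs(exact - vig_from_formula(K, V, K**n)) < 1e-9
print(exact)
```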
If instead one considers very small K'
(K' = 1 + epsilon), I think there's a completely
different formula. But I'll leave that as an
exercise!! :-) :-)
For "intermediate" K', I think the V' function
is somewhat erratic. But I can find it with the
software I mentioned. Just for fun, I tried
to find where Keno becomes a better bet than
Roulette. For simplicity I consider just
three possible bets:
* Roulette red; K = 2; V = .05263
* Roulette number; K = 36; V = .05263
* Keno ticket; K = 3000; V = .26315
As seen below, Keno, with its stiff-seeming 26.3%
vigorish, is better for the player than the
even-money roulette bets when K' >= 38.
Bet Goal (K') Inefficiency (V')
--------------------------------------
Red 2 .05263
Number 2 .03822 (*)
Keno 2 .20013
Red 38 .26081
Number 38 .06161
Keno 38 .26080
Red 3000 .47628
Number 3000 .13877
Keno 3000 .26315
(* - Since V' = 1 - K'*p; p here is .4809,
in close agreement with Tim's result.)
James Dow Allen
No, I hadn't seen it before. I had considered a similar problem some
years ago. It differed in some details I thought important at first,
but which turned out not to affect the overall form of the result.
It may be "well known", but if so then I managed to avoid it in
textbooks on probability.
> It seems fair to match ideal with ideal, or practical with
> practical. In practice the gambler probably doesn't need $200
> *exactly*; he just wants to feel comfortable taking the girlfriend
> to see a show.
Quite reasonable. If the target is "blurred" somewhat (e.g. with
utility of money that is not quite a step function), the effect of
quantization in simulations appears to be greatly reduced.
> OP's puzzle can be (more-or-less) restated as Given a K-for-1 bet
> with house vigorish V, use multiple betting rounds to construct a
> K'-for-1 bet. What is then the effective vigorish V' ?
Nice, I like that way of thinking about it: a transformation on bets.
> Similarly, given various V with equal K, the smallest V is best.
> What's interesting is when both K and V vary. The best bet then may
> depend on K'. In other words, a Keno ticket may be a better bet
> than Roulette, despite its higher vigorish, *if* your goal K' is
> very large.
The previous problem I had considered was based on Keno but much
vaguer: each number of picks has a different payoff schedule with a
slightly different average cash return. With various types of utility
function, how do their returns on utility look?
> Just for fun, I tried to find where Keno becomes a better bet than
> Roulette.
Very nice! That does illustrate the effect quite practically.
> As seen below, Keno with its stiff-seeming 26.3%
> vigorish, is better for the player than the
> even-money roulette bets when K' >= 38.
Yes, that's predicted reasonably well by your earlier approximation.
- Tim
You seem very confident.
I am not certain you have defined the optimum strategy.
Let's forget roulette wheels; they offer too many different betting options.
Let's make it a toss of a coin, so all the player has to decide each turn is
how much to bet.
I don't think the continuous case (arbitrarily small wagers) has a solution.
The player would never make a bet large enough to cause them to lose, and
the game won't terminate. So a minimum bet size will be needed.
A reasonable first step would be determining if different strategies
(subject to constraints) affect the outcome. If it doesn't affect the
outcome, then we can use any strategy to determine the payoff, including
betting all $100 on a single throw which gives us 50:50. I suspect the
strategy does NOT affect the payoff, as long as you never make a wager which
would give you more than $200.
This is not obvious to me. If I had $150, I would probably bet $50. This
gives me at least a 0.75 chance, as I have a 50:50 chance of turning $150
into $200, or a 50:50 chance of ending up with $100 which itself gives me a
50:50 chance of gambling the lot and winning.
However, I am not certain that betting only (say) $25 when I have $150
wouldn't lead to scenarios with a better than 0.75 chance of reaching $200.
One thing we do have in our favour is that the expected return is a function
only of the amount of money we have in our kitty. There is no memory.
Can you prove that the payout is independent of the strategy, subject only
to the strategy never returning more than $200 ? If not, how to work out the
best strategy?
If you like, sure. To preserve the nature of the problem though, the
house must take a cut. So the payoff on a win is less than 2-for-1.
> I don't think the continuous case (arbitrarily small wagers) has a
> solution. The player would never make a bet large enough to cause
> them to lose, and the game won't terminate.
Failure to terminate is a failure to achieve the required payout, and
hence is considered a loss. If you prefer though, consider a minimum
bet epsilon, and determine the limiting probability of success as
epsilon -> 0 (which is actually how I calculated it).
> A reasonable first step would be determining if different strategies
> (subject to constraints) affect the outcome.
They certainly do. E.g. making many small constant bets at not quite
even money is a biased Drunkard's Walk, and will almost certainly
fail.
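[This can be quantified with the classical gambler's-ruin formula; a
sketch for constant $1 even-money bets at p = 18/38, a deliberately bad
strategy. -- ed.]

```python
def ruin_success(start, goal, p):
    """P(reach `goal` before $0) with constant unit bets, win prob p."""
    r = (1 - p) / p                # q/p > 1 whenever the house has an edge
    return (r**start - 1) / (r**goal - 1)

chance = ruin_success(100, 200, 18/38)
print(chance)    # on the order of 1e-5: flat betting almost certainly fails
```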
- Tim
I'm too lazy or stupid to come up with rigorous proof of Tim's claim,
but I think a simple example might tend to convince.
Restricting to even-money bets, for convenience, as you've done, and
assuming win chance p; you propose to replace a $100 bet with two $50
bets.
Win-lose and Lose-win have no effect, so the only relevant cases are
Win-Win (prob = p*p) and Lose-Lose (prob = (1-p)*(1-p)). The efficacy
is determined by the ratio of Win and Loss probabilities, but
p*p/((1-p)*(1-p)) < p/(1-p)
when p < .5. QED. :-)
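[The same point in absolute terms, a sketch at p = 18/38: with $50 bets
the win-lose and lose-win pairs return to $100 and renew, so the success
probability satisfies P = p*p + 2*p*q*P. -- ed.]

```python
p = 18 / 38
q = 1 - p

one_bet = p                        # a single $100 even-money bet
# Two-$50-bet scheme: WW wins, LL loses, WL/LW renew at $100, giving
# P = p*p + 2*p*q*P  =>  P = p*p / (1 - 2*p*q).
two_bets = p * p / (1 - 2 * p * q)
assert two_bets < one_bet          # halving the bets strictly hurts if p < .5
print(one_bet, two_bets)
```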
If p is *greater* than .5, the situation reverses!! Then your
"make smallish bets" plan *does* become the winning solution!
Of course, if your bets are *too* small, you'll be very very
old before you enjoy your winnings. For this reason you use
Kelly's theorem to determine bet size:
http://en.wikipedia.org/wiki/Kelly_criterion
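[A sketch of the Kelly sizing just linked, for b-to-1 odds with win
probability p; the stake is a fraction (bp - q)/b of the bankroll, and
zero whenever the edge bp - q is non-positive. -- ed.]

```python
def kelly_fraction(p, b):
    """Kelly criterion: fraction of bankroll to stake on a b-to-1 bet
    with win probability p; zero when the edge is non-positive."""
    q = 1 - p
    return max(0.0, (b * p - q) / b)

print(kelly_fraction(0.55, 1))     # favorable coin: stake 10% of bankroll
print(kelly_fraction(18/38, 1))    # house edge: Kelly says bet nothing
```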
But that's *only* when your opponent is offering you "Negative
vigorish". In this thread we consider positive vigorish,
a completely different case.
James Dow Allen
Multi-payoff tickets certainly complicate the analysis. Let's
stick to the simple cases first. :-)
{\begin not mathematics}
I don't want to derail this interesting thread, but perhaps
should point out an important practical point for Keno players.
There is some threshold ($1500 ?) beyond which many or most
U.S.A. casinos will report a winning ticket to the Taxation
Authorities. Thus a $1400 win might be much more profitable
than a $1600 win.
(Of course I know no one here would try to cheat on their taxes!
But your winnings, whether $1400 or $1600, are probably offset
by expenses: losing tickets, hotels, relaxation tours at
nearby Nevada dude ranches, etc. It's tedious to accumulate
such receipts (which some of the dude ranches don't provide
anyway), so the $1400 win is more convenient than a $1600 win.)
I never played much Keno, but found this out from a casino
cashier when I inquired about the peculiar warning sign:
"All winnings must be paid in full." You get a lot of customers
who ask for *less* than they won?, I asked saracstically.
"Yes!!" She answered, and explained why.
{\end not mathematics}
James
> I'm too lazy or stupid to come up with rigorous proof of Tim's claim,
> but I think a simple example might tend to convince.
I think I can solve/prove it.
I will assume a 50:50 game as before, limits $0 and $200, starting with $n,
in the first case $100.
The key is recasting this as an expectation problem.
If you start with $100, your expected return under any strategy is $100, or
else you could play back-to-back games and come out ahead of the casino.
So Expectation for $100 is $100.
The expected return cannot be changed by planning any series of bets in
advance (i.e. by a strategy), as each bet involves a 50:50 chance of winning
or losing the same amount of money, which does not change the expectation.
Therefore if you start with $n you have an n/200 probability of ending up
with $200 and a 1 - n/200 probability of ending up with $0, irrespective of
any strategy you may wish to employ, as long as it doesn't deliberately bet
to make more than $200.
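[The fair-coin claim can be checked by value iteration over $25 states,
a sketch: even picking the best legal bet at every state, the optimal
success probability converges to bankroll/200, i.e. n/8 in $25 units,
so no strategy beats any other. -- ed.]

```python
# States are multiples of $25 from $0 to $200; fair coin (p = 1/2).
# Bellman update: P(n) = max over legal bets b of (P(n+b) + P(n-b)) / 2,
# where a legal bet never takes the bankroll above $200 or below $0.
N = 8                                    # 8 units of $25 = $200
P = [0.0] * (N + 1)
P[N] = 1.0
for _ in range(10000):
    for n in range(1, N):
        P[n] = max((P[n + b] + P[n - b]) / 2
                   for b in range(1, min(n, N - n) + 1))
print(P)    # each entry converges to n/8: the strategy is irrelevant
```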
I'm afraid you "took your eyes off the ball" at some point and solved
for the case where "vigorish" is exactly zero! That's too easy.
Only positive vigorish (Tim's method) and negative vigorish (Kelly)
are of interest.
BTW, I'd bet even *I* could prove Tim's Theorem for even-money bets.
We need the general case.
James
>
> It may be "well known", but if so then I managed to avoid it in
> textbooks on probability.
Textbooks on probability tend not to assume a familiarity with gambling:
dice and packs of cards may be assumed understood, roulette wheels not.
--
Mathematics is a part of physics.
Physics is an experimental science, a part of natural science.
Mathematics is the part of physics where experiments are cheap.
(V.I. Arnold)
> I propose a simple gambling problem, which I'm afraid might be too easy for
> this group. My *real* question is:
> How well known is this problem and its solution?
>
> For simplicity, the casino allows any real-valued bet 0 < x <= B where B is
> your present bankroll, and you'll have time for hundreds of betting rounds
> if you need them.
Can I take a computer into the casino and run a hefty prog between spins?
(I'm assuming I'm not allowed to cheat by using the computer once the
ball is rolling to utilise physics, and equally the casino and croupier
are not allowed to cheat with non-random spins.)
(C) if I can use a computer, Tim's solution otherwise.
Evil Nigel