Two Plus Two Newer Archives

Two Plus Two Newer Archives (http://archives1.twoplustwo.com/index.php)
-   Books and Publications (http://archives1.twoplustwo.com/forumdisplay.php?f=35)
-   -   The Mathematics of Poker (http://archives1.twoplustwo.com/showthread.php?t=250412)

wpr101 02-24-2007 09:54 PM

Re: Pages 36 & 18
 
I haven't got all the way through the book yet, but it is probably the best book written for someone who is already a winning player and understands math well.

KurtSF 02-26-2007 03:20 PM

Reading advice
 
Howdy all,

I've read this thread in dribs and drabs through the months, but I can't recall if this is addressed anywhere. Apologies if it is.

Some background. In the general population I am a math wiz; I've always been good at it and it's always come naturally. Big whoop, right. :) In reality, I took a few advanced classes but largely got off the boat at calculus. I do have a deep understanding of math through this level, often being able to "estimate" an answer in my head without doing any of the calculations, just visualizing the problem. But higher maths are a complete mystery to me. I loved the fact that when I started playing LHE, instead of being the guy who was good at math, I was the guy playing catch-up. It seemed the "good" players were always 1 calculation ahead of me. I worked my butt off to understand the math behind a poker hand, and I feel it made me a decent but not great player. I have since switched to NLH and now I find that the "good" players do NOT have the kind of math background the good LHE players have. I feel that the solid mathematical basis I have for my game gives me a leg up on the competition.

I don't think I have ever been as excited about reading a poker book as I am about this. Now, the questions:

How much can I get out of the book by trying to understand the concepts without fully grasping the math? Put another way, can I take lessons away from this book that can be directly applied to my poker game without being able to solve the equations presented? And mostly, what tactics and tips would you give for how to approach/read this book? I want to get the most out of it, as someone who is overmatched by the math involved but not intimidated.

Many thanks for reading this, and many more for taking the time to respond!

sputum 02-26-2007 03:48 PM

Re: Reading advice
 
[ QUOTE ]
In the general population I am a math wiz

[/ QUOTE ]

[ QUOTE ]
someone who is overmatched by the math involved but not intimidated

[/ QUOTE ]
The math is mostly pretty simple (and I'm NOT a math wiz).
I'm just reading it again and again, but that's what I usually do with poker books. Then I'll mess about calculating different toy stuff for a bit (by simulation or formally, getting any integrals solved in the math forum).
The problem is that at the micros and small stakes this approach is dominated by playing 'suboptimally', because you're playing donks. Who wants to make a donk indifferent to calling or folding if he calls incorrectly anyway?
Not a criticism of the book in any way shape or form of course. I love it.

EDIT: btw, they solve the equations for you.

mosquito 02-28-2007 03:53 AM

Re: Book at Pokerstars???
 
[ QUOTE ]
[ QUOTE ]
I looked today and saw it for 1000 FPPs, surprisingly less than most of the books available.

[/ QUOTE ]

Thanks for the heads-up. Just ordered it.

[/ QUOTE ]

Don't know why, but I can't find it. Do you have to be Silver Star?

sobefuddled 02-28-2007 09:50 AM

Re: Book at Pokerstars???
 
No. Go to the main menu. Click cashier. Click FPP Store and you will find it on the drop-down menu.

mosquito 03-01-2007 09:14 PM

Re: Book at Pokerstars???
 
[ QUOTE ]
No. Go to the main menu. Click cashier. Click FPP Store and you will find it on the drop-down menu.

[/ QUOTE ]

Not there for me. Blechh. Can someone else confirm that it's still there?

sobefuddled 03-02-2007 12:15 AM

Re: Book at Pokerstars???
 
You're right. It's gone now. I bet it's out of stock. I'd keep checking back there tho. I'm sure it will be back in stock soon.

mosquito 03-03-2007 02:10 AM

Re: Book at Pokerstars???
 
Hope you are right. :) I would hate to have missed it.

Martin Pescador 03-04-2007 12:33 PM

Re: The Mathematics of Poker
 
I'm reading this book, and I'm having difficulty understanding the formula below the table on p. 154. The left-hand side seems to be the ex-showdown equity of Y betting, but I don't understand where the constant k on the right-hand side comes from. Isn't the ex-showdown equity of Y checking 0? Can somebody please explain this to me?

Best regards,
Martin

Goldmund 03-04-2007 04:00 PM

Sklansky-Chubukov and The MoP
 
What do the authors think of the Sklansky-Chubukov rankings in NLHTAP as compared to their own push/fold tables in the jam-or-fold game?

Sklansky-Chubukov's raising standards are a lot tighter than those advocated in the table on page 135 of MoP, but the MoP strategy is in jam-fold-only scenarios.

What's the value of the Sklansky-Chubukov numbers in late, huge-blind tournament play (effective stacks of, say, 6-12 big blinds)? It seems MoP advocates pushing a lot more often than S-C. I know that the S-C charts are based on an opponent 'knowing' our hole cards and still playing an unexploitable game, but doesn't the pushing strategy in MoP lead to opponents calling with a very wide range after we've pushed into them a few times consecutively, decreasing our fold equity to the point that a heads-up encounter becomes a coinflip+?

Goldmund

Shandrax 03-07-2007 07:56 AM

Re: Sklansky-Chubukov and The MoP
 
I would like to suggest a follow-up to Mathematics of Poker.

It could be called something like "Practical Application", and it should contain sample hands that are analysed in the same fashion as the Stud 8/b hand in the early chapters. The examples should cover a variety of different poker forms - basically HORSE - and different types of limits and cash/tournament structures.

cowhead 03-07-2007 01:12 PM

Re: Sklansky-Chubukov and The MoP
 
Is there a posted errata besides this thread? I searched but didn't find anything.

Sharikov 03-07-2007 03:10 PM

Re: Sklansky-Chubukov and The MoP
 
[ QUOTE ]
Is there a posted errata besides this thread? I searched but didn't find anything.

[/ QUOTE ]
http://www.conjelco.com/mathofpoker/...ker-errata.pdf

Gelford 03-15-2007 03:32 PM

Re: Sklansky-Chubukov and The MoP
 
Grunch


That is some annoyingly small typesetting you've got there; if the book were in a bigger format it would be a straight 10.

Patrick Sileo 03-18-2007 01:43 PM

MoP errata correction
 
[ QUOTE ]
[ QUOTE ]
Is there a posted errata besides this thread? I searched but didn't find anything.

[/ QUOTE ]
http://www.conjelco.com/mathofpoker/...ker-errata.pdf

[/ QUOTE ]

The posted correction to page 48 fixes the text, but not the equations. It should go on to say that the last two of the three expressions for <B,call> should be divided by (0.2 + x), with appropriate consequent adjustments to the following table and graph.
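
A quick numeric check of the shape of this correction, sketched in Python. It uses only figures quoted elsewhere in this thread (A holds the nuts 20% of the time, bluffs with total probability x, and B is laid 5:1 on the call); the payoff bookkeeping is an illustrative assumption, not the book's exact expressions.

[ CODE ]
# Compare <B,call> as printed (unconditional probabilities) with the
# corrected version (conditioned on A betting, i.e. divided by 0.2 + x).
# Payoffs assume B risks 1 bet to win 5, per the 5:1 odds cited in-thread.

def b_call_as_printed(x):
    # Linear in x: lose 1 vs the nuts (prob 0.2), win 5 vs a bluff (prob x).
    return 0.2 * (-1) + x * 5

def b_call_corrected(x):
    # Conditioned on A betting: the same terms divided by (0.2 + x).
    return (0.2 / (0.2 + x)) * (-1) + (x / (0.2 + x)) * 5

for x in (0.02, 0.04, 0.08, 0.20):
    print(x, round(b_call_as_printed(x), 4), round(b_call_corrected(x), 4))

# Both versions cross zero at x* = 0.04, but only the corrected one is
# non-linear in x, which is why the critical value is unaffected.
[/ CODE ]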

mosquito 03-21-2007 08:36 PM

Re: Book at Pokerstars???
 
[ QUOTE ]
You're right. It's gone now. I bet it's out of stock. I'd keep checking back there tho. I'm sure it will be back in stock soon.

[/ QUOTE ]

Back in stock. Just ordered today.

burningyen 03-22-2007 03:24 PM

Re: Book at Pokerstars???
 
Question re bluff-raising in the toy game on page 204: in what sense does bluff-raising with the top of your check-behind region "dominate" bluff-raising with the bottom of that region?

Jerrod Ankenman 03-22-2007 05:20 PM

Re: Book at Pokerstars???
 
[ QUOTE ]
Question re bluff-raising in the toy game on page 204: in what sense does bluff-raising with the top of your check-behind region "dominate" bluff-raising with the bottom of that region?

[/ QUOTE ]

In the situation described, the opponent has already bet, and we're trying to decide what to do with our hands that are too weak to raise for value or call. Some of them will be folded; others will be bluff-raises.

Most of the time, it won't matter which hands we bluff-raise with (as long as it is the proper frequency); our opponent won't call with any hands that can't beat a bluff-raise anyway, so whatever. In fact, choosing any subset of hands from the folding region to bluff-raise (such that you raise-bluff with the proper frequency) is a co-optimal strategy. The opponent can't exploit it -- try it and see!

However, suppose you're playing against someone silly, who will bluff with his worst hands and then call your bluff-raise. Then it's better for you to bluff with the very strongest hands of your (fold + bluff-raise) region, since you get some extra value from the silly guy. You don't lose anything against reasonable players by doing this - they only call with hands that can beat you anyway.

So raise-bluffing with the best of the hands you would fold dominates other strategies because it performs better against silly strategies and the same against reasonable ones.

This is different than the case where the guy checks to you. Now you should bluff your worst hands in your "check behind" region, because they have less value from checking than the others. But if we're selecting bluffs from a region which would otherwise be folded, we select the strongest hands to bluff.

jerrod
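
To see the shape of this dominance argument numerically, here is a small Python sketch in a made-up [0,1] hand model (lower = stronger). This is not the page-204 game; the pot size, raise size, and calling ranges are arbitrary assumptions chosen only to show why the top of the fold region is the better bluff-raising candidate.

[ CODE ]
def ev_bluff_raise(our_hand, caller_range, pot=4.0, raise_size=2.0):
    """EV of bluff-raising with `our_hand` against an opponent who has bet
    and now calls our raise only with hands uniform on `caller_range`.
    If he folds we take the pot; if he calls, we win the pot plus his call
    when our hand is stronger, and lose our raise otherwise."""
    a, b = caller_range
    p_call = b - a
    if our_hand <= a:
        p_win_called = 1.0
    elif our_hand >= b:
        p_win_called = 0.0
    else:
        p_win_called = (b - our_hand) / (b - a)
    ev_fold = (1 - p_call) * pot
    ev_call = p_call * (p_win_called * (pot + raise_size)
                        - (1 - p_win_called) * raise_size)
    return ev_fold + ev_call

# "Reasonable" caller: continues only with hands that beat any bluff we hold.
for h in (0.70, 0.99):
    print("vs reasonable, bluff-raise with", h, "->",
          round(ev_bluff_raise(h, (0.0, 0.10)), 2))

# "Silly" caller: bet his worst hands and calls the raise with them.
for h in (0.70, 0.99):
    print("vs silly,      bluff-raise with", h, "->",
          round(ev_bluff_raise(h, (0.90, 1.00)), 2))
[/ CODE ]

Against the reasonable caller both candidate hands print the same EV; against the silly caller the stronger candidate (0.70) does better, which is the dominance described above.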

Troll_Inc 03-23-2007 12:24 PM

Re: The Mathematics of Poker
 
[ QUOTE ]
Is this the appropriate place for extensive content discussion? So far, I like it and expect it to spawn some lively discussion.

I did pick up one error on page 48. The unconditional probabilities p(A_has_the_nuts) and p(A_has_a_bluff) in the equation for <B,call> should be replaced by probabilities conditional on the event "A_bets".

p(A_has_the_nuts|A_bets) = 0.2/(0.2+x)

p(A_has_a_bluff|A_bets) = x/(0.2+x)

Notice that these will sum to 1. This change leaves the critical value x*=0.04 unaffected, but game value will now be seen to be a non-linear function of x. The primary conclusions don't depend on linearity and are unaffected.

A similar error occurs on page 56 in the expression for <A,call>.


[/ QUOTE ]

I'm trying to use this formula to solve a problem on river bluffing, and I don't get your comment on non-linearity.

It seems that you can convert to a formula where 0.2 is really 1 - x. (In the book example, x = 80% is the given bluffing value, so 1 - x = 0.2.)

So if you plug 1 - x in everywhere you have 0.20, doesn't the denominator sum to 1 (as you state), and therefore revert to the original formula (which doesn't have a fraction)? And in that case, isn't it linear?

Or are you saying that the solution is non-linear if player A doesn't bet 100% of the time? Presumably this would be done with the expectation of sometimes being able to check-raise?

HelixTrix 03-24-2007 03:08 PM

Re: Sklansky-Chubukov and The MoP
 
Quote:

"What do the authors think of he Sklansky-Chubukov Rankings in NLHTAP as compared to their own push/fold tables in the jam or fold game?

Sklansky-Chubukov's raising standards are a lot tighter than those advocated in the table on page 135 of MoP, but the MoP strategy is in jam-fold-only scenarios.

What's the value of the Sklansky-Chubukov numbers in late, huge-blind tournament play (effective stacks of, say, 6-12 big blinds)? It seems MoP advocates pushing a lot more often than S-C. I know that the S-C charts are based on an opponent 'knowing' our hole cards and still playing an unexploitable game, but doesn't the pushing strategy in MoP lead to opponents calling with a very wide range after we've pushed into them a few times consecutively, decreasing our fold equity to the point that a heads-up encounter becomes a coinflip+? Goldmund"


Yeah, this has been bothering me too. I use the S-C numbers constantly towards the end of sit 'n' gos, but when I glance at these jam-or-fold tables they scare me. At the moment I am struggling to think of them as more than a mathematical curiosity, although I have yet to get round to sitting down and trying to see how jam-or-fold, the S-C numbers, and ICM calculations can be brought together into a coherent strategy. I do think there is a danger that people will start using these unnecessarily in blind-vs-blind and heads-up situations, thinking that 'optimal' means best, particularly given the authors' claim that these tables alone justify the price of the book.

Previous posters have mentioned the ambiguous use of the word 'optimal'. Speaking as someone with no background in any of the specialist disciplines that use the term, it seems to me that what could have been stressed a lot more is that when the authors talk about a strategy being 'optimal', what they mean is that it is part of a 'co-optimal pair', i.e. the strategy you would want if you were playing against a 'nemesis' opponent who would otherwise instantly switch to counter everything you do. While this approach is extremely helpful in other pre- and post-flop situations as a default option when you feel you might be outclassed or just generally can't think of a way to exploit anything, I'm not convinced these numbers are anywhere near the 'best' strategy in the actual low-M situations that will come up.

No opponent is likely to call my jam with 87o with an M of 7 (4.7 BB), as apparently he should as part of a co-optimal strategy pair, so why on earth would I jam with T5o(!) with an M of 6 (4.1 BB), reasoning 'it's optimal so it must be better than folding'? Actually my point about the opponent is a bit spurious, since if he is generally calling tighter then I could theoretically be pushing even looser. More important is Sklansky's idea of passing up a slightly good bet today if you're likely to be able to make a better one tomorrow. And yes, I know there isn't much time to wait for a hand with a short stack, but it must be reasonable to fold T5o, since you are a favourite to be dealt at least one card higher than a ten on the next hand. My point about the opponent not using the optimal strategy certainly does apply to calling, though: clearly you shouldn't call with 87o if he isn't going to push with T5o.

I also don't like the authors' recommendation to begin jam-or-fold at an M of 7, or as high as 9 (I'm switching from their use of BBs as a unit). Or rather, I agree all-in is generally the only option at this point, but again, that doesn't mean it must be best to blindly start following the tables.

As a final point, many opponents play extremely exploitably, by continuing to allow you to limp in from the small blind heads-up, getting 3 to 1, which you should do with any hand, and also by never limping themselves with a good hand, so you can then jam over their limps with any two. Anyway, I'd love to hear others' opinions on this; I was surprised, reading through this thread, not to have seen more mention of the scale of the difference between the MoP numbers and the S-C ones. The rest of the book I think is fantastic, btw.
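
As a side note, the claim above that you are a favourite to be dealt at least one card higher than a ten on the next hand checks out; a quick sketch (fresh 52-card deal, 16 cards beat a ten):

[ CODE ]
from math import comb

# P(neither hole card is higher than a ten): both drawn from the 36 cards
# ranked ten or below, out of a fresh 52-card deck.
p_no_high_card = comb(36, 2) / comb(52, 2)
print(round(1 - p_no_high_card, 3))  # about 0.525, a slight favourite
[/ CODE ]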

Patrick Sileo 03-26-2007 03:26 PM

Re: The Mathematics of Poker
 
[ QUOTE ]
[ QUOTE ]
Is this the appropriate place for extensive content discussion? So far, I like it and expect it to spawn some lively discussion.

I did pick up one error on page 48. The unconditional probabilities p(A_has_the_nuts) and p(A_has_a_bluff) in the equation for <B,call> should be replaced by probabilities conditional on the event "A_bets".

p(A_has_the_nuts|A_bets) = 0.2/(0.2+x)

p(A_has_a_bluff|A_bets) = x/(0.2+x)

Notice that these will sum to 1. This change leaves the critical value x*=0.04 unaffected, but game value will now be seen to be a non-linear function of x. The primary conclusions don't depend on linearity and are unaffected.

A similar error occurs on page 56 in the expression for <A,call>.


[/ QUOTE ]

I'm trying to use this formula to solve a problem on river bluffing, and I don't get your comment on non-linearity.

It seems that you can convert to a formula where 0.2 is really 1 - x. (In the book example, x = 80% is the given bluffing value, so 1 - x = 0.2.)

So if you plug 1 - x in everywhere you have 0.20, doesn't the denominator sum to 1 (as you state), and therefore revert to the original formula (which doesn't have a fraction)? And in that case, isn't it linear?

Or are you saying that the solution is non-linear if player A doesn't bet 100% of the time? Presumably this would be done with the expectation of sometimes being able to check-raise?

[/ QUOTE ]

I was simply pointing out a minor mathematical misstep. The key result of the problem being analyzed depends only on the critical value, which is unchanged by the correction. Hence, the relegation of the matter to the "minor correction" category.

dfan 04-15-2007 12:26 PM

Re: The Mathematics of Poker
 
I think it is more than a mathematical misstep. The formulas are described as P(A has the nuts) and P(A has a bluff). This strongly suggests that the situation is that A has bet. If so, then .2 is NOT the prob that A has the nuts (unless A bluffs every hand), nor is x the prob that A has bluffed. The errata is incomplete to my way of thinking. It was very confusing to me as written.

If the equations were conditioned on A having bet then the "answer" for the bluffing frequency that would make B indifferent to calling or folding would be P(A has nuts)=.2 (not .04), which corresponds to the odds B is receiving, 5 to 1. In other words this would be an example of Sklansky's principle that your bluffing frequency should equal the reciprocal of the odds you are giving.

mwette 06-10-2007 11:19 PM

Re: question about the book and the CLT
 
[ QUOTE ]
[ QUOTE ]
For example, many of the problems in finance, gambling, and operations research and control theory turn out to be very similar,...

[/ QUOTE ]

Would you expand on this a little? What does control theory have in common with game theory? TIA.

[/ QUOTE ]

It makes sense (to me) to treat poker as a problem in stochastic control theory. The control system is the hero, with a controller (the hero's betting strategy) and an estimator (to determine the villains' card distributions). The system under control is the rest of the game (board, pot, stacks, the villains' cards and the other players' strategies). The control signal is the bets being placed (by the hero). The measurements are the common cards, pot size, stack sizes, and bets being placed (by the villains). The controller is tuned to provide the maximum EV for the hero.

The field of H-infinity control, which has been around for about 25 years, has lots of connections to minimax optimization and game theory, I believe. If you want more on the game-theory / control-theory connection you might try wikipedia.org.

bbbushu 07-01-2007 05:28 PM

Bump
 
Chen, Ankenman, others,

***WARNING*** I don't have a very strong math background/mind, so this question's answer might be obv to everybody else, but I'm asking it anyway; thanks for not flaming. I also don't know how to write some of the notation, but anybody with the book can figure out exactly what I'm saying. Also sorry for the mega-bump. ***WARNING***

Chapter 16 – [0,1] Game #8: The Raising Game with Check-Raise

[…]

The optimal strategy for Player Y is something like:

[ QUOTE ]
Yn = r^n / (1 – r )

[/ QUOTE ] p. 191

My question is: what does r stand for?

There is an example given which says:

[ QUOTE ]
Y1 = 1 / (1 – r), and Y puts in the (n + 1)th bet with a fraction of the hands with which he put in the nth bet equal to r.

[/ QUOTE ] p. 191

*Perhaps because I just don't understand r, I don't know how to apply this example to Y2, Y3, ..., Yn; an example built around 1 isn't very illuminating, because 1 so often just equals itself.

*I know that the threshold for the (n + 1)th bet is always a number closer to 0 than the threshold for the nth bet - I understand that each bet requires a stronger hand (while I'm unsure if my explanation makes sense...).

*I think that r is whatever fraction Y puts in the nth bet with, but I don't understand how to determine how it changes for the (n + 1)th bet.

*It's explained that this equation differs from the previous game (where no check-raise was allowed, only raising) in that Y only needs to know his thresholds in relation to his own other thresholds (Y1, Y2, Y3, ..., Yn), as opposed to in relation to X2, X4, ..., Xn, and that's what the factors of r separate. You say:

[ QUOTE ]
Yn = rY(n – 1)

[/ QUOTE ]

… I am not sure if this second equation is Y’s total strategy, or another strategy (doubtful), etc.

An explanation/criticism of what I wrote would be excellent. A non-1 example (or two) would be awesome!


Thanks,
bbbushu

p.s. I will almost definitely also ask these rudimentary questions about X's strategy for this game after this part makes sense to me ;)

parachute 07-02-2007 09:36 AM

Re: Bump
 
r is sqrt(2) - 1, as defined previously in the book. It's referred to more explicitly in the appendix to that chapter where the results you cite are derived.
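
For bbbushu's request for a non-1 example, here is a minimal sketch that just evaluates the quoted Yn = r^n / (1 - r) expression with r = sqrt(2) - 1 and checks the ratio property Yn = r * Y(n-1); how the book itself indexes these thresholds is left to the text.

[ CODE ]
from math import sqrt

r = sqrt(2) - 1                      # ~0.4142, the "golden mean of poker"
Y = [r**n / (1 - r) for n in range(1, 6)]
for n, y in enumerate(Y, start=1):
    print(f"Y{n} = {y:.4f}")         # 0.7071, 0.2929, 0.1213, 0.0503, 0.0208

# Each threshold is r times the previous one, so Y only needs to know his
# thresholds relative to his own other thresholds:
print(round(Y[1] / Y[0], 4), round(Y[2] / Y[1], 4))  # both equal r
[/ CODE ]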

bbbushu 07-02-2007 02:42 PM

Re: Bump
 
so r always = sqrt(2) - 1 (i.e. the 'golden mean of poker')?

That was my guess (I read the appendix to help but still wasn't sure; I must have missed it earlier).

thanks
bbbushu

Shroomy 07-05-2007 11:16 PM

Re: Sklansky-Chubukov and The MoP
 
HelixTrix ...

Good post and many of my thoughts too...
but your point about the opponent being unlikely to call with anything in the lower range of the optimal strategy argues for jamming with more hands than optimal, not fewer.

And it's pretty much been proven (or at least not disproven) that jam-or-fold is at minimum very close to the best exploitive strategy.

The only real argument is whether it applies to SB vs BB due to card clumping (i.e. if 8 players have folded, the BB is more likely to have an above-average hand).

notreallymyname 08-21-2007 09:56 AM

Re: Sklansky-Chubukov and The MoP
 
I think I've found an incorrect conclusion on top of an incorrect derivation, could someone confirm?

On p.117 (Example 11.3, half-street [0,1] with folding) the following is given as X's equity for a strategy which calls [0,x_1*] when Y bets [0,y_1] in a pot of P bets:

<X> = (P + 1)(y_1 - x_1*) - 1(x_1*)

It should be immediately obvious that this is wrong because the expression (y_1 - x_1*) is meant to be the probability of calling and winning but isn't guaranteed to be in the range [0,1]. Even for x_1* < y_1 this is wrong because X not only wins when Y is betting [x_1*,y_1] but half of the rest of the time too. (The probability of calling and losing in this equation is similarly flawed.) The third inequality below this has been accidentally negated too.

The conclusion that X needs a 1/(P + 1) chance of having the best hand to profitably call is right, but the threshold for this is not at y_1(P + 1)/(P + 2) as stated.

Some of the following work on optimal strategy for this game also makes the assumption that X always loses when Y's hand is in his calling range. The whole section really needs to be checked; I'm just going to skip it after finding such an obvious systematic error, since it would just be confusing to do otherwise.

Can I really be the only one who got this far?

notreallymyname 08-22-2007 01:07 AM

Re: Sklansky-Chubukov and The MoP
 
Ah, never mind, this is correct (although the final inequality is still wrong), just unnecessarily confusing.

invrnrv 10-13-2007 05:13 PM

another error
 
Page 97: there should be 4 bets in the pot, not 3. With 3 bets the optimal bluffing frequency would be 1/16, not 1/20 (5%); with 4 bets it is 5%.

FoxInTheHenHouse 10-14-2007 07:38 PM

Re: Good News/Bad News/Good News
 
[ QUOTE ]
finished the book, though i haven't read this thread.

for NL cash (i play small stakes), experience is much more important than the math approach presented in this book.

after having played hundreds of thousands of hands of NL cash, i don't need math (beyond basic probability/odds) to make decisions. if i am the pfr and the caller leads small on the flop, they have a draw or marginal hand and will fold just about every time to a raise. if i raise pf, unless the opponent is shortstacked, a reraise is never AK/QQ or worse - they call with those hands. if someone minbets the turn, they have a draw/marginal hand. if someone raises a three-flush board on the river, they have the nuts or 2nd nuts. if i am the pfr and get check-raised on the flop or turn, my top pair (especially if it is a K-high board) is no good. and so on. i don't use math for any of these because the scenarios keep repeating themselves.

so, i would definitely try to get as much experience as possible for NL rather than try to use the book's math approach of "opponent will bluff X% of the time here yet fold to a rebluff Y% of the time, so I have to win Z% of the time for it to be profitable." that approach is fine once in a blue moon against a tough player, but for 99% of the hands, using the hand reading that comes with experience is the way to go in my opinion.

[/ QUOTE ]

Thank you!! This is the first post in this thread that makes any sense!!

WRX 10-26-2007 03:19 AM

Re: The Mathematics of Poker
 
Bill and Jerrod, I hope that you're still monitoring this thread, even though it's been quiet recently.

I've had lots of thoughts and questions popping up in my mind since I started reading your book, but have waited to finish it before writing anything. It took me a LONG time to work through it all--but I think the patient approach has been rewarded with a greater understanding of the game.

Taken as a whole, I believe the book is an impressive achievement, a real landmark in poker writing. You've looked at so many topics that any player who wants to be successful needs to think about--not just questions of how to mix value betting, bluffing, calling, and folding, although study of those questions forms the heart of the book. Even though it offers few recipes for play, it's a book that any serious student of poker MUST tackle. And especially, all future authors who hope to write about poker at anything but a superficial level will have to understand these concepts. That's just obvious.

I do feel the need to mention one serious reservation--which is that the book suffers from way too many typos. The errata sheet you've put up only scratches the surface. I hope that for future editions, you'll bring in an editor to proofread the copy carefully. That's not an easy task for a work this technical.

While probably every chapter led to questions in my mind about practical application to playing poker, one topic seems especially important. In the final chapter, you revisit the question of the benefits of aspiring to optimal play, versus the benefits of exploitive play. This may be the easiest-reading chapter of the whole book, but it makes sense only with the background of all that has gone before. You make the vital point that while optimal strategies are elusive and difficult to ascertain (and don't even exist in any rigorous sense in multi-player games), not all suboptimal strategies give up the same value to superior opposition. A balanced strategy remains unexploitable, so that opponents' potential edge against that strategy is limited. You advance this as an argument for striving to play strategies that are at least balanced.

Furthermore, you show that in many circumstances, an optimal or near-optimal strategy gives the player an edge against non-optimal opposition. Obvious examples would be playing against opponents who put money in the pot with trash starting hands, or who call with near-hopeless hands on the end. On the other hand, there are certainly money-making opportunities for the player willing to put aside the quest for optimal play, and to exploit an opponent's unbalanced play.

These points and others raise what may be the most significant question facing a player who wants to find the best money-making opportunities. For the kinds of games commonly found today (habits and caliber of player), in public card rooms and on the Internet, where does the most money lie? Is it in playing fundamentally sound, "near-optimal" poker? Or is it in recognizing flaws in opponents' play, and altering one's own play to exploit those flaws?

Certainly the answer might differ greatly between games--for example, a game in which all players are experienced and reasonably observant, and capable of counter-exploitation, versus a game full of unskilled, casual gamblers.

Any thoughts on this topic would be welcome.

MichaelBolton777 11-04-2007 04:22 AM

Re: The Mathematics of Poker
 
[ QUOTE ]
Bill and Jerrod, I hope that you're still monitoring this thread, even though it's been quiet recently.

I've had lots of thoughts and questions popping up in my mind since I started reading your book, but have waited to finish it before writing anything. It took me a LONG time to work through it all--but I think the patient approach has been rewarded with a greater understanding of the game.

Taken as a whole, I believe the book is an impressive achievement, a real landmark in poker writing. You've looked at so many topics that any player who wants to be successful needs to think about--not just questions of how to mix value betting, bluffing, calling, and folding, although study of those questions forms the heart of the book. Even though it offers few recipes for play, it's a book that any serious student of poker MUST tackle. And especially, all future authors who hope to write about poker at anything but a superficial level will have to understand these concepts. That's just obvious.

I do feel the need to mention one serious reservation--which is that the book suffers from way too many typos. The errata sheet you've put up only scratches the surface. I hope that for future editions, you'll bring in an editor to proofread the copy carefully. That's not an easy task for a work this technical.

While probably every chapter led to questions in my mind about practical application to playing poker, one topic seems especially important. In the final chapter, you revisit the question of the benefits of aspiring to optimal play, versus the benefits of exploitive play. This may be the easiest-reading chapter of the whole book, but it makes sense only with the background of all that has gone before. You make the vital point that while optimal strategies are elusive and difficult to ascertain (and don't even exist in any rigorous sense in multi-player games), not all suboptimal strategies give up the same value to superior opposition. A balanced strategy remains unexploitable, so that opponents' potential edge against that strategy is limited. You advance this as an argument for striving to play strategies that are at least balanced.

Furthermore, you show that in many circumstances, an optimal or near-optimal strategy gives the player an edge against non-optimal opposition. Obvious examples would be playing against opponents who put money in the pot with trash starting hands, or who call with near-hopeless hands on the end. On the other hand, there are certainly money-making opportunities for the player willing to put aside the quest for optimal play, and to exploit an opponent's unbalanced play.

These points and others raise what may be the most significant question facing a player who wants to find the best money-making opportunities. For the kinds of games commonly found today (habits and caliber of player), in public card rooms and on the Internet, where does the most money lie? Is it in playing fundamentally sound, "near-optimal" poker? Or is it in recognizing flaws in opponents' play, and altering one's own play to exploit those flaws?

Certainly the answer might differ greatly between games--for example, a game in which all players are experienced and reasonably observant, and capable of counter-exploitation, versus a game full of unskilled, casual gamblers.

Any thoughts on this topic would be welcome.

[/ QUOTE ]


Interesting post. I've followed this thread a bit, and have waded through the book for a while. Obviously, the authors are very intelligent and have a solid fundamental understanding of poker.

At the beginning of the book, the intro says that ultimately the point of what they are trying to do is to teach us how to make money playing poker - extraneous math will be left out! I am no genius, but can follow the examples and toy games at least to some degree. I still have no idea, though, how any of this will help me make more money playing poker! It is difficult enough for an average joe like me to follow the numerous toy games and hypos, let alone create my own that will teach me exactly what range of hands is optimal for me to raise from the SB in limit holdem, or how often to check-raise the turn with air on a paired board, etc, etc! Again, I respect the knowledge and effort of the authors (and am also a big fan of hosstbf!). I am just frustrated because I don't see how I can possibly use this info to build a coherent, solid, and closer-to-optimal strategy specifically for limit holdem (or NL, which I am trying to learn as well). I heard that hoss basically patterned his limit game off of this book. He must be a super genius!

Would appreciate it, Mr. Chen, if you could give your take on how a guy like me (with good basic math skills, but certainly nothing extraordinary) could best utilize the info in your book to specifically improve and adjust his limit (or NL) play and progress toward that elusive impossibility of 'optimal' play.


**Also, as the above poster said, I am curious about the difference between optimal and exploitive. From my reading, it seems that optimal is ultimately superior (but I'm not certain whether this is what the authors are saying; it seems obvious that in certain situations exploitive play would be far more +EV). I, too, would love to hear Bill and Jerrod discuss when and why it would be best to use an optimal approach, or to use an exploitive one. (Btw, am I correct to assume that optimal play does not require any reads or attention to the play of your opponents, i.e. no stats, reads, etc.? If so, this seems very strange to me, since basically every good player on 2+2, every poker writer, etc, always takes an exploitive approach, distinguishing LAGs and TAGs, etc, etc, ad infinitum.)

Would love to hear some thoughts on this, guys. Teach me to play like hoss!! Thanks.

Pirana 11-22-2007 04:25 AM

Equity Formulas
 
Can anybody provide a numerical example of how to calculate the Malmuth-Weitzman or Malmuth-Harville equity formulas in Chapter 27? Would the Malmuth-Harville Formula give the same answers as using an Independent Chip Model calculator?

Using an ICM calculator, here are the equities for the example in the chapter:
Player 1 - $1,511.17
Player 2 - $1,416.08
Player 3 - $1,139.07
Player 4 - $933.69
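
Not the chapter's example, but here's a minimal sketch of the Malmuth-Harville recursion: the chance of finishing first is proportional to stack size, the chance of finishing second is computed the same way among the players left after removing the presumed winner, and so on; equity is the probability-weighted sum of payouts. As far as I understand, this is the same model standard ICM calculators implement, so with the chapter's stacks and payouts it should reproduce your numbers. The stacks and payouts below are placeholders only.

[ CODE ]
from itertools import permutations

def malmuth_harville_equities(stacks, payouts):
    """Equity of each player under the Malmuth-Harville (ICM) model."""
    n = len(stacks)
    total = sum(stacks)
    equities = [0.0] * n
    # Enumerate every finishing order (fine for 4 players; grows as n!).
    for order in permutations(range(n)):
        prob = 1.0
        remaining = total
        for player in order:
            prob *= stacks[player] / remaining   # P(next finisher is player)
            remaining -= stacks[player]
        for place, player in enumerate(order):
            if place < len(payouts):
                equities[player] += prob * payouts[place]
    return equities

# Placeholder stacks and payouts (not the Chapter 27 numbers):
stacks = [3000, 2500, 1500, 1000]
payouts = [2500, 1500, 1000]
print([round(e, 2) for e in malmuth_harville_equities(stacks, payouts)])
[/ CODE ]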

eurythmech 11-22-2007 10:29 AM

Re: Equity Formulas
 
Where can I find these heads-up charts everyone's talking about? I've been looking for them in the book for quite a while now. :(

rakemeplz 11-23-2007 03:53 AM

Re: Equity Formulas
 
p136

blues_boy_b 11-25-2007 10:19 AM

question on the mixture of draws game
 
p.253:
[ QUOTE ]
If he plays y weak closed draws, he will have a made hand 1/9y+1/10 of the time on the river

[/ QUOTE ]

I guess the 1/10 is the flush draws and the (1/9)y is the weak closed draws that end up as made hands. But I don't understand why it is exactly (1/9)y; where does the 1/9 come from? :/
Can someone explain it to me? :)

