Two Plus Two Newer Archives  

#381, 03-26-2007, 03:26 PM
Patrick Sileo (Junior Member, joined Dec 2006, 19 posts)
Re: The Mathematics of Poker

[ QUOTE ]
[ QUOTE ]
Is this the appropriate place for extensive content discussion? So far, I like it and expect it to spawn some lively discussion.

I did pick up one error on page 48. The unconditional probabilities p(A_has_the_nuts) and p(A_has_a_bluff) in the equation for <B,call> should be replaced by probabilities conditional on the event "A_bets".

p(A_has_the_nuts|A_bets) = 0.2/(0.2+x)

p(A_has_a_bluff|A_bets) = x/(0.2+x)

Notice that these will sum to 1. This change leaves the critical value x*=0.04 unaffected, but game value will now be seen to be a non-linear function of x. The primary conclusions don't depend on linearity and are unaffected.

A similar error occurs on page 56 in the expression for <A,call>.


[/ QUOTE ]

I'm trying to use this formula to solve a problem on river bluffing, and I don't get your comment on non-linearity.

It seems that you can convert to a formula where 0.2 is really 1 - x. (In the book example, x = 80% is the given bluffing value, so 1 - x = 0.2.)

So if you plug 1 - x in everywhere you have 0.20, doesn't the denominator sum to 1 (as you state), and therefore revert to the original formula (which doesn't have a fraction)? And further, isn't it then linear?

Or are you saying that the solution is non-linear if player A doesn't bet 100% of the time? Presumably this would be done with the expectation of sometimes being able to check-raise?

[/ QUOTE ]

I was simply pointing out a minor mathematical misstep. The key result of the problem being analyzed depends only on the critical value, which is unchanged by the correction. Hence, the relegation of the matter to the "minor correction" category.
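For concreteness, the corrected conditional probabilities can be checked numerically. This is only a sketch: the 1-unit bet into a 4-unit pot is my assumption (it is what makes the critical value come out to x* = 0.04), not a payoff structure quoted from the book.

```python
# Sketch of B's calling EV, conditioned on A having bet.
# Assumptions (not verbatim from the book): A holds the nuts with
# unconditional probability 0.2, bluffs with probability x, and bets
# 1 unit into a 4-unit pot, so B calls 1 to win 5.
P_NUTS = 0.2
WIN = 5   # pot (4) plus A's bet (1)

def ev_call_given_bet(x):
    """B's EV of calling, conditional on the event 'A bets'."""
    p_nuts = P_NUTS / (P_NUTS + x)   # p(A_has_the_nuts | A_bets)
    p_bluff = x / (P_NUTS + x)       # p(A_has_a_bluff | A_bets)
    return p_bluff * WIN - p_nuts * 1

print(ev_call_given_bet(0.04))  # ~0: B is indifferent at x* = 0.04

# Equal steps in x give unequal steps in EV, i.e. game value is
# a non-linear function of x:
print(ev_call_given_bet(0.05), ev_call_given_bet(0.10), ev_call_given_bet(0.15))
```

Dividing by (0.2 + x) is exactly what makes the value non-linear in x, even though the indifference point x* is unchanged.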
#382, 04-15-2007, 12:26 PM
dfan (Senior Member, joined Feb 2005, 226 posts)
Re: The Mathematics of Poker

I think it is more than a mathematical misstep. The formulas are described as P(A has the nuts) and P(A has a bluff). This strongly suggests that A has already bet. If so, then .2 is NOT the prob that A has the nuts (unless A bluffs every hand), nor is x the prob that A has bluffed. The erratum is incomplete to my way of thinking. It was very confusing to me as written.

If the equations were conditioned on A having bet then the "answer" for the bluffing frequency that would make B indifferent to calling or folding would be P(A has nuts)=.2 (not .04), which corresponds to the odds B is receiving, 5 to 1. In other words this would be an example of Sklansky's principle that your bluffing frequency should equal the reciprocal of the odds you are giving.
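Sklansky's principle can be checked in the numbers of this example. A sketch, assuming the 5-to-1 pot odds and the 0.2 nuts frequency under discussion:

```python
# Sklansky's principle: the ratio of bluffs to value bets should equal
# the reciprocal of the pot odds the bettor is laying.
# Assumed numbers from this thread: B is getting 5-to-1 on a call, and
# A value-bets the nuts with unconditional frequency 0.2.
pot_odds = 5                                 # B risks 1 to win 5
bluff_to_value_ratio = 1 / pot_odds          # bluffs per value bet
value_freq = 0.2                             # A bets the nuts this often
bluff_freq = value_freq * bluff_to_value_ratio

print(bluff_to_value_ratio)  # 0.2, the reciprocal of the odds
print(bluff_freq)            # ~0.04, the book's critical value x*
```

So the .2 here is a bluff-to-value ratio; multiplied by the 0.2 value-betting frequency it recovers the unconditional x* = 0.04.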
#383, 06-10-2007, 11:19 PM
mwette (Member, joined Mar 2007, 71 posts)
Re: question about the book and the CLT

[ QUOTE ]
[ QUOTE ]
For example many of the problems in finance, gambling, operations research, and control theory turn out to be very similar,...

[/ QUOTE ]

Would you expand on this a little? What does control theory have in common with game theory? TIA.

[/ QUOTE ]

It makes sense (to me) to treat poker as a problem in stochastic control theory. The control system is the hero, with a controller (the hero's betting strategy) and an estimator (to determine the villains' card distributions). The system under control is the rest of the game (board, pot, stacks, the villains' cards, and the other players' strategies). The control signal is the bets being placed (by the hero). The measurements are the common cards, pot size, stack sizes, and bets being placed (by the villains). The controller is tuned to provide the maximum EV for the hero.

The field of H-infinity control, which has been around for about 25 years, has lots of connections to minimax optimization and game theory, I believe. If you want more on the game-theory / control-theory connection, you might try wikipedia.org.
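The observe/estimate/act loop described above can be sketched as toy code. Everything here is an illustrative placeholder: `estimate_villain_range`, the EV numbers, and the action model are made up to show the structure, not taken from the book or any real poker model.

```python
# Toy stochastic-control view of poker: observe, estimate, act.
# All quantities are placeholders illustrating the loop structure.

def estimate_villain_range(observed_bets):
    """Estimator: guess villain strength from observed bet sizes."""
    # Placeholder rule: bigger observed bets -> assume a stronger range.
    return min(1.0, 0.3 + 0.1 * sum(observed_bets))

def controller(hero_strength, villain_strength_est, pot):
    """Controller: pick the action maximizing a crude EV estimate."""
    actions = {
        "fold": 0.0,
        "call": (hero_strength - villain_strength_est) * pot,
        "bet":  (hero_strength - villain_strength_est) * (pot + 1) - 0.1,
    }
    return max(actions, key=actions.get)

pot = 4
observed_bets = [1, 1]           # measurements: villain bets seen so far
est = estimate_villain_range(observed_bets)
print(controller(0.9, est, pot)) # strong hand vs. this estimate -> "bet"
```

The point is only the architecture: the estimator turns measurements into a belief about the villains, and the controller turns that belief into the control signal (the hero's bet).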
#384, 07-01-2007, 05:28 PM
bbbushu (Senior Member, joined Jul 2006, Location: it's [censored] or walk, 1,673 posts)
Bump

Chen, Ankenman, others,

***WARNING*** I don't have a very strong math background/mind, so this question's answer might be obv to everybody else, but I'm asking it anyway; thanks for not flaming. I also don't know how to write some of the notation, but anybody with the book can figure out exactly what I'm saying. Also, sorry for the mega-bump. ***WARNING***

Chapter 16 – [0,1] Game #8: The Raising Game with Check-Raise

[…]

The optimal strategy for Player Y is something like:

[ QUOTE ]
Yn = r^n / (1 - r)

[/ QUOTE ] p. 191

My question is: what does r stand for?

There is an example given which says:

[ QUOTE ]
Y1 = 1 / (1 - r), and Y puts in the (n + 1)th bet with a fraction of the hands with which he put in the nth bet equal to r.

[/ QUOTE ] p. 191

*Perhaps because I just don't understand r, I don't know how to apply this example for Y2, Y3, ..., Yn; 1 is an unhelpful example value because it so often just equals itself.

*I know that the r factor for the (n + 1)th bet is always closer to 0 than for the nth bet; I understand that each bet requires a stronger hand (though I'm unsure if my explanation makes sense...).

*I think that r is whatever fraction Y puts in the nth bet with, but I don't understand how to determine its value for the (n + 1)th bet.

*It's explained that this equation differs from the previous game (where no check-raise was allowed, only raising) in that Y only needs to know his thresholds in relation to his own other thresholds (Y1, Y2, Y3, ..., Yn), as opposed to in relation to X2, X4, ..., Xn, and that's what the factor of r captures. You say:

[ QUOTE ]
Yn = r * Y(n-1)

[/ QUOTE ]

… I am not sure if this second equation is Y’s total strategy, or another strategy (doubtful), etc.

An explanation/criticism of what I wrote would be excellent. A non-1 example (or two) would be awesome!


Thanks,
bbbushu

p.s. I will almost definitely also ask these rudimentary questions about X's strategy for this game after this part makes sense to me ;)
#385, 07-02-2007, 09:36 AM
parachute (Member, joined Jul 2004, Location: Boston, MA, 63 posts)
Re: Bump

r is sqrt(2) - 1, as defined previously in the book. It's referred to more explicitly in the appendix to that chapter where the results you cite are derived.
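A quick illustration of the geometric structure parachute describes. This is a sketch: r = sqrt(2) - 1 is from the book, but the absolute threshold values below just iterate the quoted closed form Yn = r^n / (1 - r), so they are only as reliable as that transcription.

```python
import math

# r = sqrt(2) - 1, the "golden mean of poker" from the book.
r = math.sqrt(2) - 1

def Y(n):
    """Quoted closed form for Y's nth raising threshold (as transcribed)."""
    return r**n / (1 - r)

# Each successive raise is made with a fraction r of the hands that
# made the previous raise: Yn = r * Y(n-1).
for n in range(1, 5):
    print(n, Y(n))

print(Y(3) / Y(2))  # the ratio of consecutive thresholds is r, about 0.4142
```

The answer to "what does r stand for" is visible in the last line: r is the constant ratio between consecutive thresholds, fixed at sqrt(2) - 1 for every step.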
#386, 07-02-2007, 02:42 PM
bbbushu (Senior Member, joined Jul 2006, Location: it's [censored] or walk, 1,673 posts)
Re: Bump

so r always = sqrt(2) - 1 (i.e. the 'golden mean of poker')?

that was my guess (i read the appendix to help but still wasn't sure); i must have missed it earlier.

thanks
bbbushu
#387, 07-05-2007, 11:16 PM
Shroomy (Senior Member, joined Apr 2006, Location: Miami FLA, 465 posts)
Re: Sklansky-Chubukov and The MoP

HelixTrix ...

Good post, and many of my thoughts too...
but your point that the opponent is unlikely to call with anything in the lower range of the optimal strategy argues for jamming with more hands than optimal, not fewer.

And it's pretty much been proven (or at least not disproven) that jam-or-fold is at minimum very close to the best exploitive strategy.

The only real argument is whether it applies to SB vs BB due to card clumping (i.e., if 8 players folded, the BB is more likely to have an above-average hand).
#388, 08-21-2007, 09:56 AM
notreallymyname (Senior Member, joined Dec 2006, Location: NL100, 123 posts)
Re: Sklansky-Chubukov and The MoP

I think I've found an incorrect conclusion on top of an incorrect derivation, could someone confirm?

On p.117 (Example 11.3, half-street [0,1] with folding) the following is given as X's equity for a strategy which calls [0,x_1*] when Y bets [0,y_1] in a pot of P bets:

<X> = (P + 1)(y_1 - x_1*) - 1(x_1*)

It should be immediately obvious that this is wrong because the expression (y_1 - x_1*) is meant to be the probability of calling and winning but isn't guaranteed to be in the range [0,1]. Even for x_1* < y_1 this is wrong because X not only wins when Y is betting [x_1*,y_1] but half of the rest of the time too. (The probability of calling and losing in this equation is similarly flawed.) The third inequality below this has been accidentally negated too.

The conclusion that X needs a 1/(P + 1) chance of having the best hand to profitably call is right, but the threshold for this is not at y_1(P + 1)/(P + 2) as stated.

Some of the following work on optimal strategy for this game also makes the assumption that X always loses when Y's hand is in his calling range. The whole section really needs to be checked; I'm just going to skip it after finding such an obvious systematic error, since anything else would just be confusing.

Can I really be the only one who got this far?
#389, 08-22-2007, 01:07 AM
notreallymyname (Senior Member, joined Dec 2006, Location: NL100, 123 posts)
Re: Sklansky-Chubukov and The MoP

Ah, never mind, this is correct (although the final inequality is still wrong), just unnecessarily confusing.
#390, 10-13-2007, 05:13 PM
invrnrv (Junior Member, joined Sep 2007, 2 posts)
another error

page 97: there should be 4 bets in the pot, not 3. With 3 bets the optimal bluffing frequency would be 1/16, not 1/20 (5%); with 4 bets it is 5%.