Two Plus Two Newer Archives  

  #1  
Old 11-13-2007, 07:26 PM
Crane
Senior Member
 
Join Date: Jun 2007
Posts: 139
Standard Deviation Question

Let's say you expect to win a certain bet 60% of the time, so you would lose 40% of the time. To find the SD for 10 bets you take the square root of (10 x .6 x .4) = 1.549.

You would then multiply this by 3 for 3 SDs, giving 4.65.

Now to find your range of winners you would subtract 4.65 from 6 and add 4.65 to 6, getting 1.35 and 10.65.

Within 3 SDs you could expect to win this bet anywhere from 1.35 to 10.65 times out of ten.

My question is how can you get a number over 10? I'm taking this straight from a book. It works out OK for larger sample sizes, but when I tried it for smaller ones this is what happens. I guess I don't understand what is happening here. Did I do the calculations correctly?
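To make the numbers concrete, here is the same calculation as a short Python sketch. It just applies the SD = sqrt(n x p x (1 - p)) rule from the example above; the function name and the 100-bet comparison are my own additions, not from the book.

[code]
import math

def three_sd_range(n, p):
    """Mean minus/plus 3 SD range for the number of wins in n bets,
    using SD = sqrt(n * p * (1 - p)) as in the example above."""
    mean = n * p
    sd = math.sqrt(n * p * (1 - p))
    return mean - 3 * sd, mean + 3 * sd

# 10 bets won 60% of the time, as above
low, high = three_sd_range(10, 0.6)
print(round(low, 2), round(high, 2))   # 1.35 10.65 -- upper end is above 10

# A larger sample: 100 bets at the same 60%
low, high = three_sd_range(100, 0.6)
print(round(low, 2), round(high, 2))   # 45.3 74.7 -- comfortably inside 0..100
[/code]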

Thanks for any explanation.