Standard Deviation Question
Let's say you expect to win a certain bet 60% of the time, and lose 40% of the time. To find the SD for 10 bets, you take the square root of (10 x .6 x .4) = 1.549.
You would then multiply this by 3 for 3 SDs, getting 4.65. Now to find your range of winners, you subtract 4.65 from 6 and add 4.65 to 6, getting 1.35 and 10.65. So within 3 SDs you could expect to win this bet anywhere from 1.35 to 10.65 times out of ten. My question is: how can you get a number over 10? I'm taking this straight from a book. It works out OK for larger sample sizes, but when I tried it for smaller ones this is what happens. I guess I don't understand what is happening here. Did I do the calculations correctly? Thanks for any explanation.
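To check the arithmetic, here is a short script (Python; the variable names are my own). One thing it makes visible: the plus-or-minus 3 SD band treats the win count as roughly normal, and a normal distribution is unbounded, so for a small number of bets the band can spill past the hard limits of 0 and n wins even though the actual count never can.

```python
import math

p = 0.6   # probability of winning a single bet
n = 10    # number of bets

mean = n * p                       # expected wins: 6.0
sd = math.sqrt(n * p * (1 - p))    # binomial SD: sqrt(2.4), about 1.549

low = mean - 3 * sd    # about 1.35
high = mean + 3 * sd   # about 10.65 -- exceeds n, the maximum possible wins

print(f"SD = {sd:.3f}, 3-SD range = ({low:.2f}, {high:.2f})")
```

The calculation itself is done correctly; the upper end going past 10 is an artifact of applying a normal-style "mean plus 3 SD" range to a bounded count, and the mismatch shrinks as the sample size grows.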