doucy
07-04-2006, 11:17 AM
The following equation is used in conjunction with a two-player game. At the beginning of the game, each player has a rating that reflects his skill (usually somewhere between 100 and 2000; the higher the rating, the better the player). The equation determines, at the end of the game, the adjustment to each player's rating based on the outcome: if you win, your rating increases; if you lose, it decreases.
new rating = r1 + k * (w - 1 / (1 + 10^((r2 - r1) / 400)))
r1 = your pre-game rating
r2 = your opponent's pre-game rating
let w = 1 if you win, .5 if you draw, and 0 if you lose
k = 20 + |d| / 10, where d = score difference, with a max value of 200.
Looking at the equation, you can see that the post-game rating adjustment is determined by two things: the margin of victory and the difference in pre-game ratings. With d capped at 200, k tops out at 40, and the win term (w minus the expected score) is at most 1 in magnitude, so the greatest possible post-game rating adjustment is 40 points.
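For concreteness, here is the update rule above sketched in Python (the function and variable names are my own, not from any particular rating system's code):

```python
def expected_score(r1, r2):
    """Expected score for the player rated r1 against an opponent rated r2."""
    return 1 / (1 + 10 ** ((r2 - r1) / 400))

def new_rating(r1, r2, w, d):
    """Post-game rating.

    r1 -- your pre-game rating
    r2 -- your opponent's pre-game rating
    w  -- 1 for a win, 0.5 for a draw, 0 for a loss
    d  -- score difference (margin of victory), capped at 200
    """
    k = 20 + min(abs(d), 200) / 10
    return r1 + k * (w - expected_score(r1, r2))

# Equal ratings give an expected score of exactly 0.5, so a maximum-margin
# loss (k = 40) costs 40 * 0.5 = 20 points here.
print(new_rating(1500, 1500, 0, 200))  # 1480.0
```

The full 40-point swing only occurs when the expected score is near 0 or 1, i.e. when the pre-game ratings are far apart.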
Question: You play a game, you lose, and your rating decreases by 40 points (the greatest possible amount). How many games will you have to play before that game no longer has any effect on your rating? Assume that each subsequent game you play is against a player whose rating equals yours.
Another way to think of the question: the formula is such that as you play more and more games, the importance of earlier games decreases. If you played a million games, the very first game would likely have no effect on your rating at all (i.e., regardless of whether you won, lost, or drew that first game, your present rating would be the same). However, the last game you played could affect it by up to 40 points. I am trying to figure out how many previously played games have an effect on a player's rating, under the assumption that the player always competes against opponents whose ratings are identical to his own.
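One way to probe the thought experiment numerically is to run two rating histories that differ only in that first game, then feed both the same subsequent results against exactly matched opponents. This is only a sketch under the post's assumptions (ratings are exact real numbers, never rounded, and each opponent's rating tracks yours perfectly, so the expected score is always 0.5):

```python
import random

def update(r, w, d):
    # Against an opponent rated exactly r, the expected score is
    # 1 / (1 + 10**0) = 0.5, so the adjustment is k * (w - 0.5).
    k = 20 + min(abs(d), 200) / 10
    return r + k * (w - 0.5)

random.seed(1)
# Two histories identical except for game one: 'a' won it, 'b' lost it
# for the maximum 40-point difference.
a, b = 1500.0, 1460.0
for _ in range(1000):
    w = random.choice([0, 0.5, 1])   # same result in both histories
    d = random.randint(0, 250)       # same margin in both histories
    a, b = update(a, w, d), update(b, w, d)

print(a - b)  # gap attributable to that first game
```

Note what the sketch does and does not show: with unrounded ratings and perfectly matched opponents, each game adds the same k * (w - 0.5) to both histories, so the first game's gap carries forward unchanged here. Whether rounding, or opponents who don't exactly match your rating, changes that picture is exactly what the question is asking.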