
View Full Version : A Criticism of the Kelly Criterion


jason1990
04-17-2006, 01:01 PM
How much of my bankroll should I bet on a +EV game? If I want to maximize my EV, then I should bet it all. But if I do that repeatedly, then I will eventually go broke.

So what's wrong with going broke, you might ask. Well, let's assume for the moment that I don't want to go broke. Then what fraction, f, of my bankroll, B, should I bet on a +EV game? Since I don't want to go broke, f must be less than one.

I want to find an f which is "optimal" in some sense. Let's start with an assumption: f does not depend on B. Now, let's define "optimal". Suppose I choose f_1 and you choose f_2. We start with the same bankroll and play the same game. We then compute the probability that I have more money than you after N trials. Call this p(N). If p(N) tends to one as N goes to infinity, then we will say that f_1 is a better choice than f_2. A number f will be optimal if, for every g not equal to f, f is a better choice than g.
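This head-to-head comparison is easy to simulate. The sketch below is my own illustration, not from the post: it assumes a concrete game (an even-money bet won with probability 0.6, for which the Kelly fraction is 0.2) and estimates p(N) for f_1 = 0.2 against an over-bettor's f_2 = 0.4, with both bettors facing the same sequence of outcomes. The function name and parameters are mine.

```python
import random

def estimate_p_n(f1, f2, p=0.6, n_trials=1000, n_runs=500, seed=1):
    """Estimate p(N): the probability that a bettor using fraction f1
    ends with more money than one using f2 after n_trials bets, when
    both face the same sequence of even-money bets won with prob p."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(n_runs):
        b1 = b2 = 1.0  # both start with the same bankroll
        for _ in range(n_trials):
            won = rng.random() < p
            b1 *= (1 + f1) if won else (1 - f1)
            b2 *= (1 + f2) if won else (1 - f2)
        wins += b1 > b2
    return wins / n_runs
```

With these numbers the Kelly bettor is ahead in nearly every simulated run, and the estimate climbs toward 1 as n_trials grows, matching the definition of "better choice" above.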

This optimal f is given by the Kelly Criterion. It is the choice that maximizes the long-term growth rate of my bankroll, in the sense described above.

So what's wrong with this? Let's return to the question, what's wrong with going broke? When we apply the Kelly Criterion, we are assuming that going broke is an absolute disaster. If we have a 99.9% chance of ending with 2B and a 0.1% chance of ending with 0, then we will not take that chance. Going broke is the end of everything. In some sense, then, the "distance" between 0 and B is much greater than the "distance" between B and 2B. This is nothing new. This is just the concept of utility.
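The gamble above makes this concrete. Using the post's numbers (the choice of starting bankroll is mine and arbitrary), the bet nearly doubles the bankroll in expectation, yet its expected log utility is negative infinity, so a log-utility bettor refuses it no matter how small the ruin probability:

```python
import math

B = 100.0  # starting bankroll (arbitrary)
outcomes = [(0.999, 2 * B), (0.001, 0.0)]  # (probability, ending bankroll)

expected_money = sum(p * x for p, x in outcomes)  # nearly 2B

# log(0) is negative infinity; math.log(0) raises, so handle it explicitly.
def log_utility(x):
    return math.log(x) if x > 0 else float("-inf")

expected_log_utility = sum(p * log_utility(x) for p, x in outcomes)
```

The 0.1% chance of ruin drags the expected log utility to negative infinity, which is the "infinite distance between 0 and B" in numerical form.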

In fact, the Kelly Criterion assumes the existence of a particular utility function, namely U(x) = log(x). The Kelly Criterion is just telling you to bet in a way that maximizes your expected utility, that is, you are maximizing the expected value of U(B) = log(B). Is there anything wrong with this? No, of course not. Utility functions are very natural to use.
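This equivalence can be checked numerically. For an even-money bet won with probability p, the well-known Kelly fraction is f* = 2p - 1; the sketch below (my own example, with p = 0.6) grid-searches the fraction that maximizes the expected log of the one-step growth factor and recovers exactly that value:

```python
import math

def expected_log_growth(f, p):
    """E[log of one-step bankroll growth] when betting fraction f
    of the bankroll on an even-money bet won with probability p."""
    return p * math.log(1 + f) + (1 - p) * math.log(1 - f)

p = 0.6
# Grid search over fractions strictly between 0 and 1.
fs = [i / 10000 for i in range(1, 10000)]
f_star = max(fs, key=lambda f: expected_log_growth(f, p))
# Kelly predicts f* = 2p - 1 = 0.2 for this game.
```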

But the Kelly criterion is supposed to give an "optimal" betting scheme. Hence, U(x) = log(x) ought to be an "optimal" utility function. That is, if I have any other utility function and I bet in a way which maximizes my expected utility, then I will not be using the Kelly Criterion, so I will be performing "suboptimally".

On the surface, this seems convincing. But why is the Kelly Criterion supposed to be "optimal"? Because it maximizes the long-term growth rate of my bankroll. BUT HERE'S THE PROBLEM: if I'm using a utility function, then I don't care about the growth of my bankroll, B. I care about the growth of U(B).

It seems pretty obvious that in order to maximize the growth of U(B), I simply need to maximize, at each step, the expectation of U(B). This is true no matter what utility function I use. So there's no reason to choose U(x) = log(x) over anything else. I should just use whatever utility function best describes my actual preferences. If I then maximize my expected utility, I will automatically maximize the long-term growth rate of the utility of my bankroll.
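And different utility functions really do prescribe different bets. The sketch below (my own illustration, same even-money game with p = 0.6) finds the fraction maximizing one-step expected utility for two utilities: U(x) = log(x) recovers the Kelly fraction 0.2, while the more risk-tolerant U(x) = sqrt(x) prescribes betting noticeably more. Neither bettor is "wrong"; they simply have different preferences.

```python
import math

def optimal_fraction(utility, p, grid=10000):
    """Fraction f in (0, 1) maximizing E[U(bankroll)] after one
    even-money bet won with probability p, found by grid search."""
    fs = [i / grid for i in range(1, grid)]
    return max(fs, key=lambda f: p * utility(1 + f) + (1 - p) * utility(1 - f))

p = 0.6
f_log = optimal_fraction(math.log, p)    # the Kelly fraction
f_sqrt = optimal_fraction(math.sqrt, p)  # a more risk-tolerant bettor
```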

In summary, the Kelly Criterion assumes the existence of a utility function, but then proceeds to maximize growth of the raw bankroll, not the utility of the bankroll. If we modify this to correct the inconsistency, then we find that the Kelly system does not "outperform" other systems. It is just one of many possible systems. You should only use it if the utility function U(x) = log(x) is a good approximation of your true preferences.

(NOTE: I have phrased this as a criticism of the Kelly Criterion, but it is perhaps better described as a criticism of how some people apply it. Some people view the Kelly Criterion as providing a betting scheme which is objectively "best" in some sense. These people would claim that using any other scheme is objectively less desirable than using the Kelly system. The above argument shows why this is false.)