To win in sports betting you need a betting strategy with a positive expected value, i.e. an estimation of your average winnings per bet. But how much capital should you risk per bet to achieve maximum profits? For this, you need to understand the concept of utility. Read on to find out all about it.
Expected value, a concept first explored by the French mathematicians Pascal and Fermat in the 17th century when trying to solve the problem of points, shows us how much we can expect to win, on average, from a bet. It doesn’t, however, have very much to say about how much capital a bettor should risk on their bet. Here is where expected utility comes into play.
Expected value (EV) in betting can be calculated by multiplying your probability of winning (p) by the amount you could win per bet, and subtracting the probability of losing multiplied by the amount lost per bet. Since the probability of losing is equivalent to 1 (or 100%) minus the probability of winning, we arrive at the following simplification:

EV = p(o – 1) – (1 – p) = po – 1

‘o’ represents the European decimal odds made available by the bookmaker. Expected value is the most important number for any bettor, for it informs them about whether they can expect to make or lose money in the long run.
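The EV calculation can be sketched in a few lines of code. The odds and probability below are hypothetical figures chosen for illustration, not taken from the article:

```python
def expected_value(p, o):
    """Expected value per unit staked at decimal odds o.

    Win (o - 1) units with probability p; lose the 1-unit stake
    with probability (1 - p). Simplifies to p * o - 1.
    """
    return p * (o - 1) - (1 - p)

# Hypothetical example: a 50% chance of success priced at decimal odds of 2.10
ev = expected_value(0.5, 2.10)
print(round(ev, 3))  # 0.05, i.e. an expected profit of 5% of the stake
```

A fair price for a 50% chance would be decimal odds of 2.00 (EV = 0); anything above that gives the bettor a positive expectation.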
Once the bettor has found the expected value they must decide how much of their capital to bet. The 18th century mathematician Daniel Bernoulli understood that only the foolhardy make decisions about how much to risk based on the objective expected value without regard to the subjective consequences of the bet, that is to say the desirability of what is to be gained (or lost). This subjective desirability is known as utility.
Imagine you are presented with two chests. The first one contains $10,000 in cash. The second chest contains either $20,000 in cash or nothing; you are unsure which, but each option is equally likely. You are now asked to take one of the chests. Which one would you choose?
This is a classic utility puzzle. Mathematically, both of these chests have the same expected value, that is to say, $10,000. If you could repeat this game over and over again forever, it would make no difference which chest you picked. However, in this game you are only allowed to play once. The law of large numbers does not apply.
If you take the first chest, you are certain to gain $10,000. If you choose the second, what you receive is a matter of chance: be lucky and you’ll be $20,000 richer; unlucky, and you’ll receive nothing. Unsurprisingly, given these sums of money, most people choose the certainty of the first chest.
From a utility perspective, the certainty of $10,000 is surely a lot better than the risk of receiving nothing. People who find greater utility in certainties than in gambles with the same mathematical expectation are demonstrating aversion to risk.
Daniel Bernoulli reasoned that the standard rational behaviour of people when making decisions under uncertainty is risk aversion. He quantified his hypothesis thus: “the utility resulting from any small increase in wealth will be inversely proportionate to the quantity of goods previously possessed.” In other words, the more wealth you already possess, the less utility you will perceive from gaining more. Such a utility function is logarithmic, and more commonly known as the diminishing marginal utility of wealth.
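The chest puzzle can be revisited with Bernoulli’s logarithmic utility function. The starting wealth of $10,000 below is an assumed figure for illustration; the comparison holds for any positive starting wealth:

```python
import math

def log_utility(wealth):
    """Bernoulli's logarithmic utility of a given level of wealth."""
    return math.log(wealth)

w = 10_000  # assumed starting wealth (illustrative, not from the text)

# Chest 1: a certain gain of $10,000
u_certain = log_utility(w + 10_000)

# Chest 2: 50/50 chance of gaining $20,000 or nothing
u_gamble = 0.5 * log_utility(w + 20_000) + 0.5 * log_utility(w)

print(u_certain > u_gamble)  # True: log utility prefers the certainty
```

Both chests have the same expected value, but because the utility of each extra dollar diminishes, the expected log utility of the gamble is lower than that of the sure thing, which is exactly the risk-averse choice most people make.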
One of the more practical applications of Daniel Bernoulli’s theory is a money management plan known to many bettors as the Kelly Criterion. Developed in 1956 by John Kelly, who was working at AT&T’s Bell Labs on a problem concerning noise on long-distance telephone lines, it was quickly adopted by gamblers and investors as a means of optimising money management and profit growth.
Whilst Kelly’s motivation was entirely different to Bernoulli’s, his criterion was mathematically equivalent to the logarithmic utility function. Practically, it directs a bettor to risk a percentage of his overall wealth on a wager that is directly proportional to the expected value (EV) and inversely proportional to the potential profit per unit staked (the decimal odds minus one).
Recalling that EV = po – 1 (where p is the ‘true’ probability of success and o the decimal odds for the wager), we can calculate the Kelly stake percentage (K) as follows:

K = EV / (o – 1) = (po – 1) / (o – 1)
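As a sketch, the Kelly stake can be computed directly from p and o. The 55% probability and odds of 2.00 below are hypothetical numbers for illustration:

```python
def kelly_fraction(p, o):
    """Kelly stake as a fraction of bankroll: K = (p*o - 1) / (o - 1)."""
    ev = p * o - 1
    if ev <= 0:
        return 0.0  # no positive expected value: Kelly says don't bet
    return ev / (o - 1)

# Hypothetical example: a 55% chance of success at decimal odds of 2.00
k = kelly_fraction(0.55, 2.00)
print(k)  # 0.1 -> stake 10% of the bankroll
```

Note how the recommendation falls to zero whenever the expected value is not positive: the Kelly Criterion only ever directs a bettor to stake on value bets.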
Essentially, the Kelly Criterion maximises expected logarithmic utility. One consequence of betting with the Kelly Criterion is significant volatility in returns, a feature that may not best serve everyone’s utility. Furthermore, its use does require precise estimations of the ‘true’ probabilities of outcomes.
Nevertheless, Kelly’s approach does technically enable winning bettors to maximise the size of their bankroll over the long term. Of course, to do so a bettor needs a bookmaker that will not be suspicious of specific money management strategies like Kelly and, more importantly, will not restrict betting as a consequence of winning.
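The long-run behaviour, and the volatility, of fractional staking can be explored with a short Monte Carlo sketch. All figures below (a 55% edge at odds of 2.00, 1,000 bets, the random seed) are assumptions for illustration only:

```python
import random

def simulate(stake_fraction, p=0.55, o=2.00, n_bets=1000, seed=42):
    """Grow a bankroll of 1.0 over n_bets, staking a fixed fraction each bet."""
    random.seed(seed)
    bankroll = 1.0
    for _ in range(n_bets):
        stake = bankroll * stake_fraction
        if random.random() < p:
            bankroll += stake * (o - 1)  # win: profit of stake * (o - 1)
        else:
            bankroll -= stake            # lose: forfeit the stake
    return bankroll

# Kelly stake for these assumed numbers is (0.55 * 2 - 1) / (2 - 1) = 10%
print(simulate(0.10))  # full Kelly
print(simulate(0.05))  # half Kelly: smaller swings, slower expected growth
```

Re-running with different seeds shows the point made above: full Kelly maximises expected log growth, but individual sequences of results can swing violently, which is why many bettors stake a fraction of the full Kelly amount.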