# What is the expected value really?

Expected value is one of the most fundamental concepts in probability theory and machine learning.

Have you ever wondered what it really means and where it comes from? The formula alone doesn't tell the whole story right away.

Let's unravel what is behind the scenes!

$\displaystyle \mathbb{E}[X] = \sum_{i=1}^{n} x_i P(X = x_i)$

First, let's take a look at a simple example. Suppose that we are playing a game. You toss a coin, and

• if it comes up heads, you win \$1,
• but if it is tails, you lose \$2.

Should you even play this game with me? 🤔

After $\textstyle n$ rounds, your total earnings are the number of heads times \$1 minus the number of tails times \$2. If we divide the total earnings by $\textstyle n$, we obtain the average earnings per round. What happens as $\textstyle n$ approaches infinity?

$\displaystyle \text{average earnings} = 1 \times \frac{\# \text{heads}}{n} - 2 \times \frac{\# \text{tails}}{n}$

As you have probably guessed, the number of heads divided by the number of tosses will converge to the probability of a single toss being heads. In our case, this is $\textstyle 1/2$. (Similarly, tails/tosses also converges to $\textstyle 1/2$.)

$\displaystyle \frac{\# \text{heads}}{n} \to P(\text{heads}) = \frac{1}{2} \quad (n \to \infty)$

So, your average earnings per round are $\textstyle -1/2$. This is the expected value. (By the way, you definitely shouldn't play this game.) How can we calculate the expected value for a general case?
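We can check this with a quick simulation. Here is a minimal Monte Carlo sketch of the coin game (the variable names and the choice of one million rounds are my own); the average earnings per round should land close to $\textstyle -1/2$:

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

n = 1_000_000
earnings = 0
for _ in range(n):
    if random.random() < 0.5:  # heads: win $1
        earnings += 1
    else:                      # tails: lose $2
        earnings -= 2

average = earnings / n
print(average)  # close to -0.5
```

With a million rounds, the sample average sits within a cent or so of the true expected value.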

Suppose that, similarly to the previous example, the outcome of your experiment can be quantified. (Like rolling a die or making a bet at the poker table.) The expected value is simply the average outcome per experiment as the number of experiments goes to infinity.

That sentence is simply the expected value formula in plain English. If we formally denote the variable describing the outcome of the experiment by $\textstyle X$ and its possible values by $\textstyle x_i$, we get back the formula from the beginning.

It looks much easier now, doesn't it?