In general, **expectation** is what is considered the most likely to happen. A less advantageous result gives rise to the emotion of **disappointment**; if something happens that is not at all expected, it is a **surprise**.

In probability (and especially gambling), the **expected value** (or **expectation**) of a random variable is the sum, over all possible outcomes of the experiment, of the probability of each outcome multiplied by its payoff ("value"). Thus, it represents the average amount one "expects" to win per bet if bets with identical odds are repeated many times. Note that the value itself may not be expected in the ordinary sense; it may be unlikely or even impossible.

For example, an American roulette wheel has 38 equally likely outcomes. A bet placed on a single number pays 35-to-1, meaning that the player's profit is 35 times the bet and the original bet is also returned, so the player receives 36 times the bet in total. Considering all 38 possible outcomes, the expected value of the profit resulting from a $1 bet on a single number is ( −1 × 37/38 ) + ( 35 × 1/38 ), which is about −0.0526. Therefore one expects, on average, to lose just over 5 cents for every dollar bet.
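
As a quick numerical check of the roulette arithmetic, the following minimal Python sketch computes the same sum of payoff × probability (the variable names are only illustrative):

```python
# Expected profit of a $1 straight-up bet on American roulette:
# 37 of the 38 outcomes lose the $1 stake; 1 outcome wins a $35 profit.
outcomes = [(-1, 37 / 38), (35, 1 / 38)]

expected_value = sum(payoff * probability for payoff, probability in outcomes)
print(expected_value)  # about -0.0526, i.e. a loss of just over 5 cents per dollar bet
```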

In general, if *X* is a random variable defined on a probability space (Ω, *P*), then the **expected value** E*X* of *X* is defined as

E*X* = ∫_{Ω} *X* d*P*,

where the integral is taken with respect to the probability measure *P*.

If *X* is a discrete random variable with values *x*_{1}, *x*_{2}, ... and corresponding probabilities *p*_{1}, *p*_{2}, ... which add up to 1, then E*X* can be computed as the sum or series

E*X* = *x*_{1}*p*_{1} + *x*_{2}*p*_{2} + ...
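
The discrete formula translates directly into code. The sketch below assumes the values and probabilities are given as parallel lists (the function name and the die example are illustrative choices):

```python
def expectation(values, probabilities):
    """Expected value of a discrete random variable: the sum of x_i * p_i."""
    if abs(sum(probabilities) - 1.0) > 1e-9:
        raise ValueError("probabilities must add up to 1")
    return sum(x * p for x, p in zip(values, probabilities))

# A fair six-sided die: faces 1..6, each with probability 1/6.
print(expectation([1, 2, 3, 4, 5, 6], [1 / 6] * 6))  # 3.5 -- a value that can never occur on a single roll
```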

If the probability distribution of *X* admits a probability density function *f*(*x*), then the expected value can be computed as

E*X* = ∫ *x* *f*(*x*) d*x*,

where the integral is taken over the whole real line.
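
For a density, the integral can be approximated numerically. The sketch below uses a simple midpoint Riemann sum applied to the exponential density with rate 1, whose expected value is known to be 1; the function name, step count, and truncation point are arbitrary illustrative choices:

```python
import math

def expectation_from_density(f, lo, hi, steps=100_000):
    """Approximate E X = integral of x * f(x) dx with a midpoint Riemann sum."""
    dx = (hi - lo) / steps
    return sum((lo + (i + 0.5) * dx) * f(lo + (i + 0.5) * dx) * dx for i in range(steps))

def exp_density(x):
    """Density of the exponential distribution with rate 1 (for x >= 0)."""
    return math.exp(-x)

# Truncating the integral at 50 loses a negligible amount of probability mass.
print(expectation_from_density(exp_density, 0.0, 50.0))  # approximately 1.0
```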

The expected value operator (or **expectation operator**) E is linear in the sense that

E(*aX* + *bY*) = *a*E*X* + *b*E*Y*

for any two random variables *X* and *Y* (which need to be defined on the same probability space) and any two real numbers *a* and *b*.
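
Linearity can be illustrated with a small Monte Carlo sketch. The distributions, coefficients, and sample size below are arbitrary choices for the illustration; both variables are sampled trial by trial, i.e. on the same probability space:

```python
import random

random.seed(0)
n = 200_000
a, b = 2.0, -3.0

# X is uniform on [0, 1], so E X = 0.5; Y is standard normal, so E Y = 0.
samples = [(random.random(), random.gauss(0.0, 1.0)) for _ in range(n)]

mean_combined = sum(a * x + b * y for x, y in samples) / n  # estimate of E(aX + bY)
theoretical = a * 0.5 + b * 0.0                             # a*EX + b*EY
print(mean_combined, theoretical)  # the estimate is close to 1.0, as linearity predicts
```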

The expected values of the powers of *X* are called the *moments* of *X*; the *moments about the mean* of *X* are the expected values of the powers of *X* − E*X*.
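
As an illustration, the sketch below estimates a moment and a central moment from a sample; the chosen distribution (normal with mean 0 and standard deviation 2) and the function names are only illustrative:

```python
import random

random.seed(0)
xs = [random.gauss(0.0, 2.0) for _ in range(100_000)]  # mean 0, standard deviation 2

def moment(data, n):
    """n-th moment: the expected value of X**n, estimated by a sample mean."""
    return sum(x ** n for x in data) / len(data)

def central_moment(data, n):
    """n-th moment about the mean: the expected value of (X - EX)**n."""
    mean = moment(data, 1)
    return sum((x - mean) ** n for x in data) / len(data)

print(moment(xs, 1))          # close to 0 (the mean)
print(central_moment(xs, 2))  # close to 4 (the variance, i.e. the second moment about the mean)
```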

In general, the expected value operator is not multiplicative, i.e. E(*XY*) is not necessarily equal to E*X* E*Y*, except when *X* and *Y* are independent. In the general case, the difference E(*XY*) − E*X* E*Y* gives rise to the covariance and correlation.
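
The gap between E(*XY*) and E*X* E*Y* can be seen numerically for a pair of dependent variables. In the sketch below, *Y* is constructed as *X* plus independent noise, an arbitrary choice made purely for the illustration:

```python
import random

random.seed(0)
n = 100_000

xs = [random.gauss(0.0, 1.0) for _ in range(n)]
ys = [x + random.gauss(0.0, 1.0) for x in xs]  # Y depends on X, so they are not independent

e_xy = sum(x * y for x, y in zip(xs, ys)) / n
e_x = sum(xs) / n
e_y = sum(ys) / n

print(e_xy)              # close to 1, since E(XY) = E(X**2) = 1 here
print(e_x * e_y)         # close to 0, since E X = E Y = 0
print(e_xy - e_x * e_y)  # the covariance of X and Y, close to 1
```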

To estimate the expected value of a random variable empirically, one repeatedly measures values of the variable and computes the arithmetic mean of the results. This estimates the true expected value and has the property of minimizing the sum of the squared differences between the observations and the estimate.
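
Both points can be illustrated with simulated data; the distribution below (normal with mean 10) and the 0.1 offsets are arbitrary choices for the sketch:

```python
import random

random.seed(0)
observations = [random.gauss(10.0, 3.0) for _ in range(10_000)]  # true expected value is 10

sample_mean = sum(observations) / len(observations)
print(sample_mean)  # close to 10

def sum_of_squared_errors(estimate):
    """Sum of squared differences between the observations and a candidate estimate."""
    return sum((x - estimate) ** 2 for x in observations)

# The sample mean yields a smaller sum of squares than nearby shifted estimates.
print(sum_of_squared_errors(sample_mean) < sum_of_squared_errors(sample_mean + 0.1))  # True
print(sum_of_squared_errors(sample_mean) < sum_of_squared_errors(sample_mean - 0.1))  # True
```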