Probability Theory Series (Part 3): Expectation and Variance

Renda Zhang
7 min read · Dec 22, 2023


Before we delve deeper into the mysteries of probability theory, let’s briefly revisit our previous journey. In “Probability Theory Series (Part 2): Random Variables and Probability Distributions,” we unveiled the mystery of random variables and explored the diverse world of probability distributions. We learned how to quantify and describe the possible outcomes of random events and the probabilities of these outcomes. Now, we advance our journey to new heights — into the realm of expectation and variance.

Expectation and variance are two immensely important concepts in probability theory, serving as key tools for understanding and analyzing random phenomena. Expectation, simply put, is the “average” or “expected” outcome of a random variable. It provides us with a central standard about the behavior of a random variable, helping us understand what average result we can expect over many repeated experiments. Variance, on the other hand, further deepens this understanding. It measures the degree of deviation of a random variable from its expected value and quantifies the uncertainty of the random variable. In short, variance tells us the extent to which the values of a random variable fluctuate around its average value.

These concepts are not only theoretically significant in mathematical studies but also vitally important in practical applications. Whether predicting market fluctuations in financial risk assessment or interpreting data variability in social science research, expectation and variance are indispensable tools. Through this article, you will gain a deeper understanding of these concepts and learn how to apply them in practical scenarios.

At the end of the article, we will briefly mention some related concepts that are relevant to the topic but not explored in depth. This will set the stage for our next article, “Probability Theory Series (Part 4): Conditional Probability and Bayes’ Theorem,” making our journey through probability theory more complete and enriching. Now, let us embark on this fascinating chapter of expectation and variance in probability theory!

The Concept of Expectation

Expectation is a fundamental concept in probability theory, representing the average outcome we can anticipate over a series of random events. This concept helps us understand and predict the average behavior of random variables.

Defining Expectation

  • For discrete random variables, the expectation is the probability-weighted sum of all possible outcomes. In other words, it is the sum of each outcome multiplied by its probability of occurrence. If a random variable X takes the values x1, x2, …, xn with probabilities P(X = x1), P(X = x2), …, P(X = xn), then E(X) = Σ xi * P(X = xi): each possible value is multiplied by its probability, and the products are summed.
  • For continuous random variables, the expectation is the integral of the random variable's values weighted by its probability density function. If X is a continuous random variable with density f(x), then E(X) = ∫ x * f(x) dx, taken over all possible values of x. A short numeric sketch of both cases follows this list.
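To make the two definitions concrete, here is a minimal Python sketch. The discrete case uses a small made-up distribution, and the continuous case approximates the integral with a midpoint Riemann sum for the uniform density on [0, 1]; both choices are purely illustrative.

```python
# Expectation of a discrete random variable: E(X) = Σ xi * P(X = xi).
values = [1, 2, 3]          # hypothetical outcomes
probs = [0.2, 0.5, 0.3]     # hypothetical probabilities (must sum to 1)
e_discrete = sum(x * p for x, p in zip(values, probs))
print(e_discrete)           # 1*0.2 + 2*0.5 + 3*0.3 = 2.1

# Expectation of a continuous random variable: E(X) = ∫ x * f(x) dx,
# approximated by a midpoint Riemann sum for the uniform density on [0, 1],
# where f(x) = 1 and the exact answer is 0.5.
n = 100_000
dx = 1.0 / n
e_continuous = sum((i + 0.5) * dx * 1.0 * dx for i in range(n))
print(e_continuous)         # approximately 0.5
```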

Examples of Calculating Expectation

  • Take rolling a fair die as an example. Each face appears with probability 1/6, so the expectation is 1 * 1/6 + 2 * 1/6 + 3 * 1/6 + 4 * 1/6 + 5 * 1/6 + 6 * 1/6 = 3.5. This means that over many repetitions of rolling the die, the average result tends toward 3.5, as the simulation below confirms.
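A quick Monte Carlo check of this claim, simulating a large number of fair-die rolls (the sample size is an arbitrary choice):

```python
import random

# Roll a fair die many times; by the law of large numbers the sample
# mean should settle near the expectation of 3.5.
rolls = [random.randint(1, 6) for _ in range(1_000_000)]
print(sum(rolls) / len(rolls))   # close to 3.5
```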

Applications of Expectation

  • Expectation is widely used in real-life applications. For instance, in the insurance industry, expectation is used to assess risk and set premiums. In the finance sector, investors use expectation to evaluate the potential returns on investments.

Understanding expectation allows us to better grasp the average behavior and trends of random events. In the next section, we will explore variance, another key tool for understanding the variability of random variables.

The Concept of Variance

Variance is another crucial concept in probability theory and statistics. It measures the dispersion of a random variable or a set of data, specifically how they spread around their mean (expected value).

Defining Variance

  • For discrete random variables, variance is the probability-weighted average of the squared differences between each possible outcome and the mean. If a random variable X has expected value E(X), then Var(X) = Σ (xi − E(X))² * P(X = xi). This formula squares the deviation of each possible result from the mean, multiplies it by the probability of that result, and then sums these values.
  • For continuous random variables, variance is the integral of the squared difference between the values of the random variable and its expected value, weighted by the probability density function: Var(X) = ∫ (x − E(X))² * f(x) dx, where f(x) is the probability density function of X. A numeric check of both cases follows this list.
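Mirroring the expectation sketch above, here is a minimal Python check of both definitions, reusing the same made-up discrete distribution and the uniform density on [0, 1], whose variance is exactly 1/12:

```python
# Variance of a discrete random variable:
# Var(X) = Σ (xi − E(X))² * P(X = xi).
values = [1, 2, 3]          # hypothetical outcomes
probs = [0.2, 0.5, 0.3]     # hypothetical probabilities
mean = sum(x * p for x, p in zip(values, probs))               # E(X) = 2.1
var_discrete = sum((x - mean) ** 2 * p for x, p in zip(values, probs))
print(var_discrete)         # 0.49

# Variance of a continuous random variable:
# Var(X) = ∫ (x − E(X))² * f(x) dx, approximated by a Riemann sum
# for the uniform density on [0, 1], where E(X) = 0.5 and Var(X) = 1/12.
n = 100_000
dx = 1.0 / n
var_continuous = sum(((i + 0.5) * dx - 0.5) ** 2 * 1.0 * dx for i in range(n))
print(var_continuous)       # approximately 0.0833
```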

Practical Implications of Variance

  • Variance provides a measure of the volatility or instability of a random variable. A high variance indicates that the data points are spread far from the mean, while a low variance indicates that they are more concentrated.

Applications of Variance

  • In finance, variance is commonly used to measure the risk of asset returns. A high variance indicates that the returns of an investment might fluctuate greatly, whereas a low variance indicates more stability in returns.
  • In scientific research, variance helps researchers understand the reliability and consistency of a data set.

Understanding variance helps us better comprehend and explain the variability in data and how this variability affects our decisions and predictions. It is a powerful tool that enables us to extract insights from randomness.

In the following section, we will explore some of the mathematical properties of expectation and variance, and how they work together to reveal the characteristics of random variables.

Properties of Expectation and Variance

Understanding the properties of expectation and variance is crucial for a deeper comprehension of the behavior of random variables. These properties are not only theoretically interesting but also immensely useful in practical applications.

Properties of Expectation

  1. Linearity: For any two random variables X and Y, and constants a and b, the expectation of their linear combination follows E(aX + bY) = aE(X) + bE(Y). In other words, expectation distributes over sums and scalar multiples.
  2. Expectation of a Constant: The expectation of any constant c is the constant itself, i.e., E(c) = c.
  3. Product of Independent Random Variables: If X and Y are independent, then E(XY) = E(X)E(Y). A quick simulation check of the first and third properties follows this list.
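The linearity and independence properties can be checked numerically. The sketch below uses two independent uniform random variables on [0, 1], each with expectation 0.5; the variables and constants are arbitrary illustrative choices.

```python
import random

n = 1_000_000
xs = [random.random() for _ in range(n)]   # X uniform on [0, 1], E(X) = 0.5
ys = [random.random() for _ in range(n)]   # Y uniform on [0, 1], E(Y) = 0.5
a, b = 2.0, 3.0

# Linearity: E(aX + bY) = aE(X) + bE(Y) = 2*0.5 + 3*0.5 = 2.5
print(sum(a * x + b * y for x, y in zip(xs, ys)) / n)

# Independence: E(XY) = E(X)E(Y) = 0.5 * 0.5 = 0.25
print(sum(x * y for x, y in zip(xs, ys)) / n)
```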

Properties of Variance

  1. Non-Negativity: Variance is always non-negative, i.e., Var(X) ≥ 0. This is because variance is the average of squared deviations, and squares are always non-negative.
  2. Variance of a Constant: The variance of any constant c is zero, i.e., Var(c) = 0.
  3. Linear Transformation: For any constants a and b, Var(aX + b) = a² Var(X). This shows that adding or subtracting a constant does not change the variance, while multiplying by a constant scales the variance by the square of that constant. A simulation check of this property follows this list.
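The linear-transformation property can likewise be verified by simulation; here X is again uniform on [0, 1] with Var(X) = 1/12, an illustrative choice:

```python
import random

def sample_var(data):
    """Mean of squared deviations from the sample mean."""
    m = sum(data) / len(data)
    return sum((v - m) ** 2 for v in data) / len(data)

n = 1_000_000
xs = [random.random() for _ in range(n)]
a, b = 3.0, 7.0

print(sample_var(xs))                        # about 1/12, i.e. 0.0833
print(sample_var([a * x + b for x in xs]))   # about a² * 1/12 = 0.75
```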

Relationship Between Expectation and Variance

  • Although expectation and variance measure different aspects, they are closely related. Variance can be seen as a measure of how much a random variable deviates from its expected value. More specifically, variance is expressed through expectation: Var(X) = E[(X − E(X))²]. Expanding the square gives the often-used equivalent form Var(X) = E(X²) − (E(X))², verified numerically below.
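A quick numeric check that the definitional form and the expanded form agree, reusing the same made-up discrete distribution from earlier:

```python
values = [1, 2, 3]          # hypothetical outcomes
probs = [0.2, 0.5, 0.3]     # hypothetical probabilities

mean = sum(x * p for x, p in zip(values, probs))
# Definitional form: E[(X − E(X))²]
var_def = sum((x - mean) ** 2 * p for x, p in zip(values, probs))
# Expanded form: E(X²) − (E(X))²
var_alt = sum(x * x * p for x, p in zip(values, probs)) - mean ** 2
print(var_def, var_alt)     # both 0.49
```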

These properties allow us to better understand and analyze the behavior of random variables. They provide us with different perspectives to interpret random variables and lay the groundwork for analyzing more complex probability problems.

Next, we will briefly introduce some advanced applications of expectation and variance in complex statistical problems, providing a broader perspective for our series on probability theory.

Advanced Applications

Expectation and variance are not only essential in foundational probability theory; they play a central role in solving more complex statistical and data analysis problems. Let’s explore some of these advanced applications.

Risk Assessment and Financial Modeling

  • In the financial sector, expectation and variance are key tools for assessing the risk and return of investment portfolios. Expectation estimates the expected return of an asset, while variance, or its square root (the standard deviation), measures the volatility, i.e. the risk, of those returns. In modern portfolio theory, these concepts are used to optimize asset allocation and balance risk against return, as in the sketch below.
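As a minimal sketch of this idea, the snippet below computes the expected return and variance of a two-asset portfolio; the return series and weights are made-up illustration data, not real market figures:

```python
import statistics

r_a = [0.02, -0.01, 0.03, 0.01, -0.02]   # hypothetical returns, asset A
r_b = [0.01, 0.00, 0.02, 0.01, 0.00]     # hypothetical returns, asset B
w_a, w_b = 0.6, 0.4                      # hypothetical portfolio weights

portfolio = [w_a * a + w_b * b for a, b in zip(r_a, r_b)]
print(statistics.mean(portfolio))        # expected portfolio return
print(statistics.pvariance(portfolio))   # variance as a risk measure
print(statistics.pstdev(portfolio))      # standard deviation (volatility)
```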

Experimental Design and Data Analysis

  • In scientific research and experimental design, expectation and variance help researchers understand the central tendency and spread of data. For example, in clinical trials, comparing the expectation and variance of the effects of different treatment methods can lead to significant conclusions about treatment efficacy; a toy version of such a comparison follows.
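A toy comparison in that spirit: compute the mean (expectation) and sample variance of outcome scores for two groups. All numbers here are invented for illustration, not real trial data:

```python
import statistics

treatment = [5.1, 4.8, 5.5, 5.0, 5.2]   # hypothetical outcome scores
control = [4.0, 3.5, 4.6, 3.9, 4.2]     # hypothetical outcome scores

# Higher mean suggests a larger average effect; lower variance suggests
# a more consistent effect across subjects.
for name, group in [("treatment", treatment), ("control", control)]:
    print(name, statistics.mean(group), statistics.variance(group))
```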

Quality Control and Process Monitoring

  • In industrial engineering and quality control, expectation and variance are used to monitor production processes and product quality. Tracking the expectation and variance of key process parameters makes it possible to detect anomalies promptly and take corrective action, as the small example below shows.
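One simple way to operationalize this is a control-chart-style rule: estimate the mean and standard deviation from an in-control baseline run, then flag new measurements that fall more than three standard deviations from that mean. All numbers below are hypothetical:

```python
import statistics

baseline = [10.1, 9.9, 10.0, 10.2, 9.8, 10.1, 10.0, 9.9]  # in-control run
mean = statistics.mean(baseline)
sd = statistics.pstdev(baseline)

# Flag new measurements outside the three-sigma control limits.
new_samples = [10.0, 10.3, 12.5, 9.9]
for i, m in enumerate(new_samples):
    if abs(m - mean) > 3 * sd:
        print(f"sample {i}: {m} is outside the control limits")
```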

Predictive Models and Machine Learning

  • In predictive modeling and machine learning, expectation and variance are important indicators for evaluating model performance. For instance, in regression analysis, the expectation of a model's predictions speaks to their accuracy, while their variance reflects the stability and generalizability of the model; the sketch below makes this concrete.
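The sketch below illustrates this in miniature: a straight line is fitted to many resampled noisy data sets, and the mean and variance of its prediction at one point summarize accuracy and stability. The data-generating process (y = 2x + 1 plus Gaussian noise) is invented for illustration:

```python
import random

def fit_line(xs, ys):
    """Least-squares slope and intercept for a simple linear fit."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

preds = []
for _ in range(200):                                     # simulated training sets
    xs = [i / 10 for i in range(20)]
    ys = [2 * x + 1 + random.gauss(0, 0.5) for x in xs]  # true line + noise
    slope, intercept = fit_line(xs, ys)
    preds.append(slope * 1.0 + intercept)                # prediction at x = 1.0

mean_pred = sum(preds) / len(preds)
var_pred = sum((p - mean_pred) ** 2 for p in preds) / len(preds)
print(mean_pred)   # close to the true value 3.0 (accurate on average)
print(var_pred)    # small spread across data sets (stable model)
```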

These applications illustrate that expectation and variance are not just theoretical concepts but powerful tools for solving real-world problems. They enable us to quantify uncertainty and make more informed decisions when facing complex data.

Conclusion

Expectation and variance provide us with a powerful framework to understand and interpret random phenomena. With these concepts, we can extract meaningful information from randomness, whether in everyday life or in complex scientific and engineering problems.

In the next article, “Probability Theory Series (Part 4): Conditional Probability and Bayes’ Theorem,” we will explore how to assess the probability of events given certain information and how Bayes’ theorem plays a role in modern statistics and data science. Stay tuned!

In this article, we focused on the basic concepts and applications of expectation and variance. However, there are other related but not deeply explored important concepts, such as covariance, which measures the degree of correlation between two random variables, and the Central Limit Theorem, a powerful theorem stating that under certain conditions, the sum of independent random variables tends towards a normal distribution. These concepts will be further explored in our future articles.
