6.1 Expected Value and Risk

In this assignment, you’ll learn about the probability distribution of a random variable and how to use it to calculate the expected value of that variable. Next, I’ll introduce expected utility, which replaces “value” with “utility”—a measure of the decision-maker’s satisfaction. Finally, you’ll explore how economists think about people’s decision-making under uncertainty: people can be risk-averse (demanding compensation to take risks), risk-neutral (indifferent to risk), or risk-seeking (drawn to the potential for high rewards despite lower expected returns).

For example: let’s consider a lottery where there is a 1% chance to win $100 and a 99% chance to win $0.

The expected value (EV) is the long-run average payoff if the scenario repeats many times. Calculate it by multiplying each payoff by its probability and summing the results:

\[EV(\text{lottery}) = 0.01(100) + 0.99(0) = 1\]

This means if you were to participate in the lottery many times, on average you should expect to win $1 per round (or if you play 100 times, you should expect to win the $100 once).
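
The calculation above can be sketched in a few lines of Python (the helper name `expected_value` is mine, for illustration; it is not part of the assignment):

```python
# A minimal expected-value helper: multiply each payoff by its probability
# and sum the results.
def expected_value(outcomes):
    """outcomes: list of (probability, payoff) pairs."""
    return sum(p * x for p, x in outcomes)

# The lottery from the text: 1% chance of $100, 99% chance of $0.
lottery = [(0.01, 100), (0.99, 0)]
print(expected_value(lottery))  # 1.0
```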

How much would you pay for the lottery ticket?

Preferences vary:

  • Risk-averse? You’d pay less than $1—you avoid risk unless the EV exceeds the cost.
  • Risk-neutral? You’d pay up to exactly $1—you care only about the net expected value.
  • Risk-seeking? You’d pay more than $1—you accept lower expected returns for a chance at a big win.

Consider lotteries like Powerball: a ticket costs $2, and there is about a 1 in 11.7 million chance of winning a $1 million prize. That makes the expected value 1,000,000 * (1 / 11,700,000) ≈ $0.085, or about 8.5 cents: nowhere close to the $2 ticket cost. Since the EV is far below the ticket price, buying one suggests risk-seeking behavior.
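
As a quick sanity check on that arithmetic (the odds and prize are the text's simplified figures, not the real game's full prize table):

```python
# Powerball as simplified in the text: a $1,000,000 prize with roughly
# a 1 in 11.7 million chance of winning.
prize = 1_000_000
p_win = 1 / 11_700_000

ev = p_win * prize
print(f"EV of a ticket: ${ev:.3f}")  # about 8.5 cents, versus the $2 price
```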

Expected Value in a Coin Flip Game

When flipping a fair coin:

  • Probability of Heads: 0.5
  • Probability of Tails: 0.5

Outcome   Probability
Heads     0.5
Tails     0.5

Expected value (EV) combines probabilities and payoffs to measure the long-run average outcome if the scenario repeats.

Suppose you play a game where:

  • You win $1 on Heads
  • You lose $1 on Tails

Outcome   Probability   Payoff
Heads     0.5           +1
Tails     0.5           -1

Calculating the EV:

\[EV = 0.5(1) + 0.5(-1) = 0\]

The EV of $0 means you don’t gain an advantage in the long run: winnings and losses balance out.
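
The same pattern as the lottery applies here; a short sketch of the fair-coin game from the text:

```python
# The fair-coin game: win $1 on heads, lose $1 on tails,
# each with probability 0.5.
outcomes = [(0.5, 1), (0.5, -1)]  # (probability, payoff) pairs

ev = sum(p * x for p, x in outcomes)
print(ev)  # 0.0
```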

Question 1

If you’re playing another coin flip game where you get $1 for heads and lose $1 for tails, but you know that the coin is biased and lands on heads 60% of the time, how much should you expect to earn in the long run per round?

Product Launch

Suppose market research indicates that if a company launches a new product, there are 3 possible outcomes: low demand, medium demand, and high demand, which occur with these probabilities:

Outcome         Probability
Low Demand      0.4
Medium Demand   0.4
High Demand     0.2

Question 2

  1. Calculate the expected value for the company if low demand generates $0 of revenue, medium demand generates $10K of revenue, and high demand generates $40K of revenue.

  2. If the product launch costs $15K, should the company expect to make money in the end?

Expected Value of Perfect Information

In many situations, managers have the option to gather additional information before making a decision, but this information often comes at a cost. The concept of the expected value of perfect information helps determine how much a decision maker should be willing to pay for information that eliminates uncertainty.
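
The logic can be sketched with made-up numbers (these are assumptions for illustration only, NOT the figures used in Question 3 below):

```python
# Hypothetical decision: a 70% chance of an $8K gain and a 30% chance
# of a $4K loss if the project goes ahead. All figures in $K.
p_high, gain = 0.7, 8
p_low, loss = 0.3, -4

# Without information: you must commit up front, so you bear both outcomes.
ev_without = p_high * gain + p_low * loss   # 0.7*8 + 0.3*(-4) = 4.4

# With perfect information: act only when the news is good, else do nothing ($0).
ev_with = p_high * gain + p_low * 0         # 0.7*8 = 5.6

# EVPI: the most you'd pay to eliminate the uncertainty.
evpi = ev_with - ev_without                 # 5.6 - 4.4 = 1.2, i.e. $1.2K
print(ev_without, ev_with, evpi)
```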

Question 3

Suppose a company is deciding whether to expand into a new market. They believe there is a 60% chance of high demand (leading to a $10K profit) and a 40% chance of low demand (leading to a $15K loss).

  1. Calculate the expected value of expanding into the new market.

  2. Now suppose the company can pay for perfect market research that would tell them with certainty whether demand will be high or low before deciding to expand. With this perfect information, they would expand if demand is high and not expand if demand is low. Show that the expected value with perfect information is $6K.

  3. If the cost of gaining the market research is $10K, would it be worth it for the firm?

  4. If the cost of gaining the market research is $5K, would it be worth it for the firm?

  5. What is the maximum the company would pay for the perfect market research?

Expected Utility and Risk Attitudes

Question 4

Consider another company’s new product launch:

Outcome       Probability   Payoff   Utility
Low Demand    0.5           $0       ___
High Demand   0.5           $100K    ___

  1. Find the expected value of launching the new product.

  2. Consider 2 different managers with different utility functions: one is “risk averse” with a utility function \(U(\text{payoff}) = \log_{10}(\text{payoff})\). The other is “risk neutral” with a utility function \(U(\text{payoff}) = \text{payoff}\). Fill out the “utility” column in the table above for each of these managers.

  3. Calculate the expected utility of the new product launch for each of the managers in part 2. Expected utility is the same idea as expected value, but instead of multiplying probabilities with payoffs, you’ll multiply probabilities with utilities.
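
To show the mechanics, here is a sketch with hypothetical numbers (a 50/50 gamble between $10 and $1,000, not the table above; the nonzero payoffs are chosen so the logarithm is defined):

```python
import math

# Hypothetical 50/50 gamble between $10 and $1,000 (log10 needs positive
# payoffs, so $0 is avoided here).
gamble = [(0.5, 10), (0.5, 1000)]

# Risk-neutral manager, U(x) = x: expected utility equals expected value.
ev = sum(p * x for p, x in gamble)                   # 505.0

# Risk-averse manager, U(x) = log10(x): multiply probabilities by utilities.
eu_log = sum(p * math.log10(x) for p, x in gamble)   # 0.5*1 + 0.5*3 = 2.0

# The sure payoff with utility 2.0 is 10**2 = $100: the risk-averse manager
# values the gamble at far less than its $505 expected value.
print(ev, eu_log, 10 ** eu_log)
```

The gap between $505 and $100 is exactly what "risk-averse" means here: the concave utility function discounts the gamble relative to its expected value.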

Question 5

A company is considering 3 new products:

  • Product A has a 50% chance of earning $0 and a 50% chance of earning $7
  • Product B has a 1/3 chance of earning $0 and a 2/3 chance of earning $6
  • Product C has a 100% chance of earning $1

  1. Which product will the risk-averse manager choose to launch? That is, given the utility function \(U(x) = \log_{10}(x)\), calculate the expected utilities for each product and indicate which one generates the highest.

  2. Which product will the risk-neutral manager choose to launch (use \(U(x) = x\))? Calculate the expected utilities for each product and indicate which one generates the highest.

  3. Which product will the risk-seeking manager choose to launch (use \(U(x) = x^2\))? Calculate the expected utilities for each product and indicate which one generates the highest.

Question 6

  1. What is the expected value of perfect information, and why is it useful in business decisions? Under what circumstances would it be worth paying for more information before making a choice?

  2. Why do economists use utility instead of just dollar amounts when analyzing choices under uncertainty? What does this tell us about how people actually make decisions?