➕ Combining Random Variables (Sums and Differences)
Often in statistics, we work with linear combinations of random variables. For example, total profit might be the sum of profits from two stores, or net gain might be a difference. Understanding how means and variances combine under these operations is critical for inference and prediction.
Linear Transformations: aX+b
For a random variable X with mean μX and standard deviation σX, any linear transformation Y = aX + b has:
E(Y) = E(aX + b) = aE(X) + b = aμX + b
Var(Y) = Var(aX + b) = a²Var(X) = a²σX²
σY = |a|σX
Key insight: Adding/subtracting a constant shifts the mean but does NOT change the variance or standard deviation. Multiplying by a constant a scales both mean and variability; the standard deviation scales by ∣a∣.
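These rules are easy to sanity-check with a short simulation (a sketch; NumPy is assumed, and the normal distribution and the constants a = 2, b = 7 are arbitrary choices — only the mean and SD matter):

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# X with mean 50 and SD 6 (distribution choice is arbitrary)
x = rng.normal(loc=50, scale=6, size=1_000_000)

a, b = 2, 7
y = a * x + b  # linear transformation Y = aX + b

# Theory: E(Y) = a*muX + b = 107, sigmaY = |a|*sigmaX = 12
print(round(y.mean(), 2), round(y.std(), 2))
```

The sample mean lands near 107 and the sample SD near 12: the shift b moves the center but not the spread.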
Sums and Differences of Two Independent Variables
For independent random variables X and Y:
E(X + Y) = E(X) + E(Y) = μX + μY
E(X − Y) = E(X) − E(Y) = μX − μY
Var(X + Y) = Var(X) + Var(Y) = σX² + σY²
Var(X − Y) = Var(X) + Var(Y) = σX² + σY²
Note: Variance adds for both sums and differences (the variance of X−Y is the same as for X+Y).
These rules require independence. If variables are dependent (e.g., high values of X tend to occur with high values of Y), the variance formula must account for covariance, which is beyond the AP Statistics scope.
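The equal-variance claim for sums and differences can also be checked empirically (a sketch; the normal distributions and the parameters below are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(seed=0)
n = 1_000_000

# Independent draws; only the means and SDs matter for the rules being checked
x = rng.normal(100, 15, n)
y = rng.normal(120, 20, n)

# Theory: SD of both X+Y and X-Y is sqrt(15**2 + 20**2) = 25
print(round(float(np.std(x + y)), 2), round(float(np.std(x - y)), 2))
```

Both printed values come out near 25, confirming that differencing does not cancel any variability.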
Worked Example 1: Linear Transformation
A store's daily revenue is R=50P+200, where P is the number of items sold. Suppose E(P)=100 and σ(P)=20.
Find E(R) and σ(R):
E(R)=50⋅100+200=5000+200=5200
σ(R)=∣50∣⋅20=1000
The mean revenue is $5200 per day, with a standard deviation of $1000. Multiplying by 50 scaled the variability proportionally.
Worked Example 2: Sum of Independent Variables
Two independent vending machines have daily revenues: Machine 1 with μ1=100, σ1=15; Machine 2 with μ2=120, σ2=20. Let T=R1+R2 be the total.
Mean of sum: E(T) = 100 + 120 = 220
Variance of sum: Var(T) = 15² + 20² = 225 + 400 = 625
Standard deviation of sum: σ(T) = √625 = 25
Even though Machine 2 alone is more variable (σ2=20 vs. σ1=15), the combined standard deviation is smaller than the sum of individual SDs (15+20=35) because variance adds, not standard deviation.
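The closing observation, that the combined SD is 25 rather than 35, follows directly from taking the square root of the summed variances:

```python
import math

sd1, sd2 = 15.0, 20.0

combined_sd = math.sqrt(sd1**2 + sd2**2)  # variances add: sqrt(225 + 400)
naive_sum = sd1 + sd2                     # the incorrect "SDs add" shortcut

print(combined_sd, naive_sum)  # 25.0 35.0
```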
Common Pitfalls
⚠️ Variances Add, Not Standard Deviations: A common mistake is to write σ(X+Y) = σX + σY. Instead, σ(X+Y) = √(σX² + σY²), which is always less than σX + σY (when both SDs are positive).
⚠️ Variance of Difference = Variance of Sum: Do not forget that Var(X−Y)=Var(X)+Var(Y), not subtraction. Negative values don't reduce variance.
⚠️ Independence Assumption: These rules assume independence. If X and Y are correlated, the formulas are invalid. Always verify or state the independence assumption.
Calculator Tip
💡 TI-84 / TI-Nspire: To work with combined variables, define mean and variance for each component, then use the rules above manually (e.g., E(2X+3Y)=2×E(X)+3×E(Y)). There is no built-in function; apply the formulas by hand and verify your arithmetic.
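The by-hand bookkeeping the tip describes can equally be scripted; a sketch for a hypothetical combination W = 2X + 3Y (the means and SDs below are made-up examples, and independence is assumed):

```python
import math

# Hypothetical independent components (made-up values for illustration)
mu_x, sd_x = 30.0, 5.0
mu_y, sd_y = 20.0, 4.0
a, b = 2.0, 3.0  # weights in W = aX + bY

mean_w = a * mu_x + b * mu_y                 # E(aX + bY) = aE(X) + bE(Y)
var_w = a**2 * sd_x**2 + b**2 * sd_y**2      # Var(aX + bY) = a²Var(X) + b²Var(Y)
sd_w = math.sqrt(var_w)

print(mean_w, var_w, round(sd_w, 2))  # 120.0 244.0 15.62
```

Coefficients enter the mean linearly but enter the variance squared, which is why σ(W) is computed last, from the variance.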
📚 Practice Problems

Problem 1 (easy)
❓ Question:
A linear transformation is Y = 3X − 5. If μX = 10 and σX = 4, find μY and σY.
💡 Show Solution
Apply the transformation rules:
μY = 3(10) − 5 = 30 − 5 = 25
σY = |3| × 4 = 12
Answer: μY = 25 and σY = 12.
Problem 2 (medium)
❓ Question:
Two independent random variables: X with μX=50, σX=8; and Y with μY=40, σY=6. Find the mean and standard deviation of Z=X+Y and W=X−Y.
💡 Show Solution
For Z = X + Y:
Mean: μZ = 50 + 40 = 90
Variance (variances add): σZ² = 8² + 6² = 64 + 36 = 100
Standard deviation: σZ = √100 = 10
For W = X − Y:
Mean: μW = 50 − 40 = 10
Variance (variances add for differences too): σW² = 8² + 6² = 64 + 36 = 100
Standard deviation: σW = √100 = 10
Answer: μZ = 90 and μW = 10; the standard deviation is 10 for both.

Problem 3 (hard)
❓ Question:
Three independent machines produce items with defect rates: X1 with μ = 2, σ = 0.5; X2 with μ = 3, σ = 0.7; X3 with μ = 1.5, σ = 0.4. Total defects are T = X1 + 2X2 + 0.5X3. Find E(T) and σ(T).
💡 Show Solution
Mean of weighted sum:
E(T) = E(X1) + 2E(X2) + 0.5E(X3) = 2 + 2(3) + 0.5(1.5) = 2 + 6 + 0.75 = 8.75
Variance of weighted sum:
Var(T) = (1)²(0.5)² + (2)²(0.7)² + (0.5)²(0.4)² = 1(0.25) + 4(0.49) + 0.25(0.16) = 0.25 + 1.96 + 0.04 = 2.25
Standard deviation:
σ(T) = √2.25 = 1.5
Answer: E(T) = 8.75 and σ(T) = 1.5.
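As a numeric check on this weighted-sum arithmetic, the same rules can be applied in code (a sketch; only the given means, SDs, and weights are used, and independence is assumed):

```python
import math

# (mean, sd, weight) for X1, X2, X3 in T = 1*X1 + 2*X2 + 0.5*X3
components = [(2.0, 0.5, 1.0), (3.0, 0.7, 2.0), (1.5, 0.4, 0.5)]

mean_t = sum(w * mu for mu, sd, w in components)       # E(T): weights enter linearly
var_t = sum(w**2 * sd**2 for mu, sd, w in components)  # Var(T): weights enter squared
sd_t = math.sqrt(var_t)

print(mean_t, round(var_t, 2), round(sd_t, 2))  # 8.75 2.25 1.5
```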