Discrete Distributions


🎲 Random Variables: Discrete Distributions

The Story of Counting Special Moments

Imagine you’re playing a game where you flip coins, roll dice, or count how many stars fall from the sky. These aren’t just random events—they follow secret patterns! Today, we’ll discover these magical patterns called Discrete Distributions.

Think of discrete distributions like different types of cookie jars. Each jar has its own rules for how cookies are arranged inside. Some jars are simple (yes or no cookies), while others are more complex (counting cookies over time). Let’s open each jar!


🪙 Bernoulli Trials: The Yes-or-No Game

What Is It?

A Bernoulli Trial is the simplest game ever—it only has TWO outcomes:

  • Success (what you’re hoping for) ✅
  • Failure (everything else) ❌

That’s it! Just like flipping a coin: Heads or Tails. Nothing in between.

The Cookie Jar Analogy 🍪

Imagine a jar with only TWO types of cookies:

  • Chocolate chip (Success!)
  • Plain (Failure)

Every time you reach in, you get ONE or the OTHER. Never both, never neither.

Real Examples

| Situation             | Success | Failure |
|-----------------------|---------|---------|
| Flip a coin           | Heads   | Tails   |
| Shoot a basketball    | Score!  | Miss    |
| Ask a yes/no question | Yes     | No      |
| Light switch          | On      | Off     |

The Magic Formula

If p = probability of success, then:

  • P(Success) = p
  • P(Failure) = 1 - p

Example: If you flip a fair coin:

  • P(Heads) = 0.5
  • P(Tails) = 0.5

Key Facts

  • Only ONE trial at a time
  • Only TWO possible outcomes
  • Trials are independent (one flip doesn’t affect the next)
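A single Bernoulli trial is easy to simulate. Here is a minimal Python sketch; the function name `bernoulli_trial` is just an illustrative choice, not a library function:

```python
import random

def bernoulli_trial(p):
    """Run one Bernoulli trial: return 1 (success) with probability p, else 0."""
    return 1 if random.random() < p else 0

# Flip a fair coin 10 times (1 = heads, 0 = tails)
flips = [bernoulli_trial(0.5) for _ in range(10)]
print(flips)
```

Each call is one independent trial with exactly two possible outcomes, matching the key facts above.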

📊 Binomial Distribution: Counting Successes

What Is It?

What happens when you play the Bernoulli game many times?

The Binomial Distribution counts HOW MANY successes you get in a fixed number of tries.

The Cookie Jar Analogy 🍪

Now imagine reaching into the yes/no cookie jar 10 times (putting each cookie back after you look). How many chocolate chips will you get? Maybe 3? Maybe 7? The binomial distribution tells us the chances!

Three Magic Ingredients

  1. n = number of trials (how many times you try)
  2. p = probability of success each time
  3. k = number of successes you want to count

Real Example

Situation: You flip a coin 5 times. What are the chances of getting exactly 3 heads?

  • n = 5 (five flips)
  • p = 0.5 (50% chance of heads)
  • k = 3 (we want 3 heads)

The binomial distribution answers this!

Key Rules

✅ Fixed number of trials (n)
✅ Each trial is independent
✅ Same probability (p) every time
✅ Only two outcomes per trial


🧮 Binomial Probability Formula

The Magic Spell

P(X = k) = C(n,k) × p^k × (1-p)^(n-k)

Let’s break this down like building blocks:

Part 1: C(n,k) - “Choose”

This counts HOW MANY WAYS you can arrange k successes in n trials.

C(n,k) = n! / (k! × (n-k)!)

Example: C(5,3) = “How many ways to get 3 heads in 5 flips?”

C(5,3) = 5! / (3! × 2!)
       = 120 / (6 × 2)
       = 10 ways

Part 2: p^k

Probability of getting k successes.

Example: (0.5)³ = 0.125

Part 3: (1-p)^(n-k)

Probability of getting the remaining failures.

Example: (0.5)² = 0.25

Putting It Together

P(3 heads in 5 flips) = 10 × 0.125 × 0.25 = 0.3125 or 31.25%
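The whole calculation can be checked in a few lines of Python, using the standard library’s `math.comb` for C(n,k); `binomial_pmf` is a hypothetical helper name:

```python
from math import comb

def binomial_pmf(k, n, p):
    """P(X = k) = C(n, k) * p^k * (1-p)^(n-k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# 3 heads in 5 fair-coin flips
print(binomial_pmf(3, 5, 0.5))  # 0.3125
```

The three factors in the function body are exactly the three parts broken down above: the arrangement count, the success probabilities, and the failure probabilities.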

Visual Flow

```mermaid
graph TD
    A["Start: n trials, want k successes"] --> B["Count arrangements: C(n,k)"]
    B --> C["Multiply by success prob: p^k"]
    C --> D["Multiply by failure prob: (1-p)^(n-k)"]
    D --> E["Final Probability!"]
```

📈 Binomial Mean and Variance

The Average (Mean): What to Expect

If you repeat the experiment forever, on average you’d get:

Mean (μ) = n × p

Example: Flip a coin 100 times

  • μ = 100 × 0.5 = 50 heads (on average)

The Spread (Variance): How Much Wiggle Room?

How much do results spread out from the average?

Variance (σ²) = n × p × (1 - p)

Example: Same 100 coin flips

  • σ² = 100 × 0.5 × 0.5 = 25

Standard Deviation

The square root of variance tells us typical spread:

σ = √(n × p × (1 - p))

  • σ = √25 = 5 heads

This means: In 100 flips, you’ll land between 45 and 55 heads about 68% of the time (within one standard deviation of the mean)!

Quick Reference Table

| Measure  | Formula       | Coin Example (n=100, p=0.5) |
|----------|---------------|-----------------------------|
| Mean     | n × p         | 50                          |
| Variance | n × p × (1-p) | 25                          |
| Std Dev  | √variance     | 5                           |
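The table’s numbers can be reproduced with a quick Python sketch (the helper name `binomial_stats` is illustrative):

```python
from math import sqrt

def binomial_stats(n, p):
    """Return (mean, variance, standard deviation) of a Binomial(n, p)."""
    mean = n * p
    variance = n * p * (1 - p)
    return mean, variance, sqrt(variance)

mean, var, std = binomial_stats(100, 0.5)
print(mean, var, std)  # 50.0 25.0 5.0
```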

⏰ Geometric Distribution: Waiting for Success

What Is It?

The Geometric Distribution answers: “How many tries until my FIRST success?”

It’s like asking: “How many times do I need to flip until I get my first heads?”

The Cookie Jar Analogy 🍪

Imagine you REALLY want a chocolate chip cookie. You keep reaching into the jar until you finally grab one. How many tries did it take? That’s geometric!

The Formula

P(X = k) = (1-p)^(k-1) × p

Where:

  • k = the trial number when you first succeed
  • p = probability of success

Example: First Heads

Question: What’s the probability of getting your first heads on the 3rd flip?

  • p = 0.5
  • k = 3 (success on third try)

P(X = 3) = (0.5)² × (0.5)
         = 0.25 × 0.5
         = 0.125 or 12.5%

This means: Fail, Fail, Success! (T, T, H)
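The same calculation in Python, with `geometric_pmf` as an illustrative helper name:

```python
def geometric_pmf(k, p):
    """P(X = k): k-1 failures, then one success on trial k."""
    return (1 - p)**(k - 1) * p

# First heads on the 3rd flip of a fair coin
print(geometric_pmf(3, 0.5))  # 0.125
```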

Mean: Average Waiting Time

On average, how long until success?

Mean = 1/p

Example: For a fair coin (p = 0.5):

  • Average flips until first heads = 1/0.5 = 2 flips

Variance

Variance = (1-p) / p²

Key Insight

The geometric distribution has no memory! If you’ve failed 100 times, your chance of success on try 101 is still just p. Past failures don’t help you!
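The no-memory property can be checked numerically: the chance of success on trial 101, given 100 failures so far, works out to just p (up to floating-point rounding). A small Python sketch, with illustrative helper names:

```python
def geometric_pmf(k, p):
    """P(X = k): k-1 failures, then one success."""
    return (1 - p)**(k - 1) * p

def geometric_tail(k, p):
    """P(X > k): the first k trials were all failures."""
    return (1 - p)**k

p = 0.3
# P(first success on trial 101 | first 100 trials failed)
#   = P(X = 101) / P(X > 100)
conditional = geometric_pmf(101, p) / geometric_tail(100, p)
print(conditional)  # ≈ 0.3, i.e. just p again
```

The (1-p)^100 factors cancel, which is exactly why past failures don’t change your odds.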


☄️ Poisson Distribution: Counting Rare Events

What Is It?

The Poisson Distribution counts events that happen randomly over TIME or SPACE.

It’s perfect for:

  • How many texts you receive per hour
  • How many cars pass by per minute
  • How many typos per page

The Cookie Jar Analogy 🍪

Imagine cookies randomly fall from the sky into a field. The Poisson distribution tells you: “How many cookies will land in my backyard today?”

The Magic Number: λ (Lambda)

λ = the average number of events in your time/space window

Example: If you get an average of 3 texts per hour, then λ = 3

The Formula

P(X = k) = (e^(-λ) × λ^k) / k!

Where:

  • k = exact number of events you want
  • λ = average rate
  • e ≈ 2.718 (a special math constant)

Example: Text Messages

Situation: You get an average of 4 texts per hour (λ = 4). What’s the probability of getting exactly 2 texts in an hour?

P(X = 2) = (e^(-4) × 4²) / 2!
         = (0.0183 × 16) / 2
         = 0.293 / 2
         = 0.1465 or about 14.7%
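The same Poisson calculation in Python, using the standard library’s `math.exp` and `math.factorial`; `poisson_pmf` is an illustrative helper name:

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    """P(X = k) = e^(-λ) * λ^k / k!"""
    return exp(-lam) * lam**k / factorial(k)

# Exactly 2 texts in an hour when the average is 4 per hour
print(poisson_pmf(2, 4))  # ≈ 0.1465, matching the worked example
```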

Mean and Variance

Here’s something magical about Poisson:

Mean = λ
Variance = λ

They’re the SAME! When you hear “Poisson,” think “mean equals variance.”

When to Use Poisson?

```mermaid
graph TD
    A["Random events?"] -->|Yes| B["Over time or space?"]
    B -->|Yes| C["Events independent?"]
    C -->|Yes| D["Know the average rate?"]
    D -->|Yes| E["Use Poisson! λ = average"]
    A -->|No| F["Try another distribution"]
    B -->|No| F
    C -->|No| F
    D -->|No| F
```

🎯 Choosing the Right Distribution

| Question You’re Asking            | Distribution | Key Clue                    |
|-----------------------------------|--------------|-----------------------------|
| Yes or no, one time?              | Bernoulli    | Single trial, 2 outcomes    |
| How many successes in n tries?    | Binomial     | Fixed trials, counting wins |
| Which try brings first success?   | Geometric    | Waiting for first win       |
| How many events in a time period? | Poisson      | Rate over time/space        |

🌟 Summary: Your Distribution Toolkit

Bernoulli

  • One trial, two outcomes
  • P(success) = p

Binomial

  • n trials, count successes
  • Need: n, p, k
  • Mean = np, Variance = np(1-p)

Geometric

  • Trials until first success
  • Mean = 1/p
  • “No memory” property

Poisson

  • Count events in time/space
  • Only need λ (average rate)
  • Mean = Variance = λ

🎪 Final Thought

These distributions are like different lenses for seeing randomness. Once you know which lens to use, the random world becomes predictable!

  • Flipping coins? Binomial
  • Waiting for something? Geometric
  • Counting rare events? Poisson

You now have the power to predict the unpredictable! 🚀
