Pablo Rodriguez

Decision Boundary

Making Predictions with Logistic Regression


While logistic regression outputs probabilities (like 0.7 or 0.3), we often need discrete predictions (0 or 1). A common approach uses a threshold:

  • If f(x) ≥ 0.5, predict ŷ = 1
  • If f(x) < 0.5, predict ŷ = 0

When is f(x) ≥ 0.5? Tracing back through the sigmoid:

  1. f(x) ≥ 0.5 exactly when g(z) ≥ 0.5, since f(x) = g(w·x + b)
  2. g(z) ≥ 0.5 exactly when z ≥ 0 (the sigmoid equals 0.5 at z = 0)
  3. z ≥ 0 exactly when w·x + b ≥ 0

Therefore: The model predicts 1 whenever w·x + b ≥ 0
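The equivalence above can be sketched in Python (a minimal illustration using numpy; the function names `predict` and `predict_via_z` are just for this example):

```python
import numpy as np

def sigmoid(z):
    """Logistic function g(z) = 1 / (1 + e^(-z))."""
    return 1 / (1 + np.exp(-z))

def predict(x, w, b, threshold=0.5):
    """Predict 1 when f(x) = g(w·x + b) >= threshold, else 0."""
    z = np.dot(w, x) + b
    return int(sigmoid(z) >= threshold)

def predict_via_z(x, w, b):
    """With the default 0.5 threshold, checking sigmoid(z) >= 0.5
    is equivalent to checking z = w·x + b >= 0 directly."""
    return int(np.dot(w, x) + b >= 0)
```

Both functions always agree at the 0.5 threshold, which is why the decision boundary can be analyzed purely in terms of w·x + b = 0 without ever computing the sigmoid.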

Consider a classification problem with two features (x₁, x₂) where:

  • Red crosses = positive examples (y = 1)
  • Blue circles = negative examples (y = 0)
  • Parameters: w₁ = 1, w₂ = 1, b = -3

The decision boundary occurs when w·x + b = 0:

  • x₁ + x₂ - 3 = 0
  • x₁ + x₂ = 3

This creates a straight line where:

  • Points where x₁ + x₂ ≥ 3 (above and to the right of the line) predict y = 1
  • Points where x₁ + x₂ < 3 (below and to the left) predict y = 0
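Plugging the parameters w₁ = 1, w₂ = 1, b = -3 into the decision rule gives a quick check of which side of the line x₁ + x₂ = 3 each point falls on (a small numpy sketch; the `predict` helper is illustrative):

```python
import numpy as np

w = np.array([1.0, 1.0])  # w1 = 1, w2 = 1
b = -3.0

def predict(x):
    """Predict 1 when z = w·x + b >= 0, i.e. when x1 + x2 >= 3."""
    return int(np.dot(w, x) + b >= 0)

print(predict(np.array([2.0, 2.0])))  # x1 + x2 = 4 > 3 -> predicts 1
print(predict(np.array([1.0, 1.0])))  # x1 + x2 = 2 < 3 -> predicts 0
print(predict(np.array([1.5, 1.5])))  # exactly on the boundary -> predicts 1
```

Note that a point exactly on the boundary (z = 0) is assigned to class 1, because the rule uses ≥ rather than >.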

Polynomial Features Enable Complex Boundaries


Using polynomial features like x₁², x₂², we can create non-linear decision boundaries.

Example: If z = w₁x₁² + w₂x₂² + b with w₁ = 1, w₂ = 1, b = -1

The decision boundary becomes:

  • x₁² + x₂² - 1 = 0
  • x₁² + x₂² = 1

This creates a circular decision boundary where:

  • Inside the circle: predict y = 0
  • Outside the circle: predict y = 1
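The circular boundary from this example can be verified directly (a minimal sketch; `predict` is again just an illustrative helper):

```python
# z = w1*x1^2 + w2*x2^2 + b with w1 = 1, w2 = 1, b = -1
w1, w2, b = 1.0, 1.0, -1.0

def predict(x1, x2):
    """Predict 1 when z >= 0, i.e. outside (or on) the unit circle."""
    z = w1 * x1**2 + w2 * x2**2 + b
    return int(z >= 0)

print(predict(0.0, 0.0))  # center of the circle: z = -1 -> predicts 0
print(predict(2.0, 0.0))  # outside the circle: z = 3  -> predicts 1
print(predict(1.0, 0.0))  # on the boundary:    z = 0  -> predicts 1
```

The model is still linear in its parameters; the curvature comes entirely from squaring the inputs before they enter z.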

With polynomial features, logistic regression can:

  • Fit very complex data patterns
  • Create decision boundaries of almost any shape
  • Handle non-linearly separable data
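One way to obtain such boundaries in practice is to expand the raw inputs into polynomial terms before fitting (a hand-rolled numpy sketch; `poly_features` is an assumed helper name, and libraries like scikit-learn provide equivalent utilities):

```python
import numpy as np

def poly_features(X):
    """Map each row (x1, x2) to (x1, x2, x1^2, x1*x2, x2^2) —
    a degree-2 expansion. A linear boundary in this 5-dimensional
    feature space is a conic section (circle, ellipse, parabola, ...)
    back in the original (x1, x2) plane."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([x1, x2, x1**2, x1 * x2, x2**2])

X = np.array([[1.0, 2.0],
              [0.5, -1.0]])
print(poly_features(X).shape)  # (2, 5)
```

Higher-degree expansions (cubes, cross terms of three features, and so on) allow correspondingly more intricate boundary shapes, at the cost of more parameters to fit.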

Linear Features Only

Using only the raw features x₁, x₂, x₃, etc. produces linear decision boundaries: a straight line with two features, a plane or hyperplane with more.

Polynomial Features

Adding x₁², x₁x₂, x₂², etc. enables curved decision boundaries, with more complex shapes becoming possible as the polynomial degree grows.

The decision boundary is the line (or curve) where w·x + b = 0. While linear features create straight decision boundaries, polynomial features enable logistic regression to learn complex, curved boundaries that can separate non-linearly separable data. This flexibility makes logistic regression suitable for a wide range of classification problems.