Pablo Rodriguez

Multiple Features

Introduction to Multiple Linear Regression


Linear regression can be enhanced by using multiple features instead of just one. Rather than predicting house prices using only size, we can include additional features like:

  • Number of bedrooms
  • Number of floors
  • Age of the home in years

This provides much more information for making accurate predictions.

With n features, the following notation is used:

  • x_j: the j-th feature (j ranges from 1 to n)
  • n: the total number of features
  • x⃗^(i): the i-th training example (a vector containing all n features)
  • x^(i)_j: the value of feature j in the i-th training example

For example, with features [size, bedrooms, floors, age]:

  • x⃗^(2) = [1416, 3, 2, 40] (all features of the second training example)
  • x^(2)_3 = 2 (the third feature, number of floors, in the second example)
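In code, the training set is typically stored as a 2-D array, with one row per training example and one column per feature. A minimal sketch with NumPy, using the 1416-square-foot example above as the second row (the other rows are illustrative):

```python
import numpy as np

# Each row is one training example: [size (sq ft), bedrooms, floors, age (years)]
# Values other than the second row are illustrative, not from the text.
X_train = np.array([
    [2104, 5, 1, 45],
    [1416, 3, 2, 40],   # second training example, x⃗^(2)
    [852,  2, 1, 35],
])

x_2 = X_train[1]        # x⃗^(2): all features of the 2nd example (row index 1, 0-indexed)
x_2_3 = X_train[1, 2]   # x^(2)_3: the 3rd feature (floors) of the 2nd example

print(x_2)    # [1416    3    2   40]
print(x_2_3)  # 2
```

Note the off-by-one between the math notation (examples and features indexed from 1) and NumPy's 0-based indexing.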

With four features, the model becomes:

f_{w,b}(x) = w_1x_1 + w_2x_2 + w_3x_3 + w_4x_4 + b

For house price prediction: f_{w,b}(x) = 0.1x_1 + 4x_2 + 10x_3 - 2x_4 + 80

Where (with price in $1000s):

  • 0.1: each additional square foot adds $100 to the price
  • 4: each bedroom adds $4,000
  • 10: each floor adds $10,000
  • -2: each year of age decreases the price by $2,000
  • 80: a base price of $80,000

In general, the parameters and features can be collected into vectors:

  • w⃗: vector of all weights [w_1, w_2, w_3, …, w_n]
  • x⃗: vector of all features [x_1, x_2, x_3, …, x_n]
  • b: a single number (the bias term)

The model can then be written compactly as:

f_{w,b}(x⃗) = w⃗ · x⃗ + b

The dot product w⃗ · x⃗ equals: w_1x_1 + w_2x_2 + w_3x_3 + … + w_nx_n
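A sketch of both forms, using the example weights above on a hypothetical 1416-square-foot house (prices in $1000s; the house values are assumptions for illustration):

```python
import numpy as np

# Example weights and bias from the house-price model above (price in $1000s)
w = np.array([0.1, 4, 10, -2])
b = 80

# Hypothetical house: 1416 sq ft, 3 bedrooms, 2 floors, 40 years old
x = np.array([1416, 3, 2, 40])

# Expanded form: w_1*x_1 + w_2*x_2 + w_3*x_3 + w_4*x_4 + b
f_expanded = sum(w[j] * x[j] for j in range(len(w))) + b

# Compact vector form: the dot product gives the same result
f_vector = np.dot(w, x) + b

print(f_expanded, f_vector)  # both ~173.6, i.e. a predicted price of about $173,600
```

The vectorized form is not just notationally cleaner; `np.dot` runs as a single optimized operation rather than an explicit Python loop, which matters as n grows.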

Multiple Linear Regression

Multiple Linear Regression: Linear regression with multiple input features, as opposed to univariate regression which uses only one feature. Note that “multivariate regression” refers to a different concept entirely.

Multiple linear regression provides a more powerful and flexible approach to prediction by incorporating additional relevant information through multiple features. The vector notation and dot-product formulation give the model a compact, mathematically elegant representation.
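As an end-to-end sketch, the parameters w⃗ and b of such a model can be fit by least squares. The snippet below uses synthetic data generated from known parameters (an assumption for illustration, not course data) and recovers them with NumPy's least-squares solver:

```python
import numpy as np

# Synthetic data: 50 examples, 4 features, generated from known parameters
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(50, 4))
w_true = np.array([0.1, 4, 10, -2])
b_true = 80.0
y = X @ w_true + b_true

# Append a column of ones so the bias b is learned as one extra weight
X_aug = np.hstack([X, np.ones((X.shape[0], 1))])
params, *_ = np.linalg.lstsq(X_aug, y, rcond=None)
w_fit, b_fit = params[:-1], params[-1]

print(np.round(w_fit, 3), round(b_fit, 3))  # recovers w_true and b_true
```

With noiseless data the solver recovers the generating parameters to floating-point precision; in practice the course fits these parameters with gradient descent instead, but the model being fit is the same f(x⃗) = w⃗ · x⃗ + b.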