Greetings! In the previous blogs, we discussed linear regression in detail. In this blog we will discuss:
- Multiple linear regression
- Polynomial regression
Multiple linear regression:
Until now, we have worked with a single X, i.e. one feature, and a single y, i.e. one label to predict. A linear regression model with only one feature is rarely sufficient in real life. Let's consider a house price prediction scenario.
| Number of rooms (X1) | Security out of 10 (X2) | Floor number (X3) | Rating out of 10 (X4) | Price (y) |
|---|---|---|---|---|
| 2 | 7 | 15 | 9 | 100000 |
| 3 | 9 | 6 | 7 | 200000 |
| 2 | 6 | 2 | 8 | 450000 |
| 1 | 10 | 8 | 9 | 550000 |
As you can see, there are several features, and each one provides important information. A model trained on a single feature will not produce good results, so we have to include all the features to predict the house price y. Multiple linear regression is used to solve this type of problem, and its equation is similar to that of simple linear regression.
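To make this concrete, here is the table above as numpy arrays (a minimal sketch; the names X and y are just illustrative):

```python
import numpy as np

# The house-price table above: each row is one house, each column is
# one feature (rooms, security, floor number, rating).
X = np.array([[2,  7, 15, 9],
              [3,  9,  6, 7],
              [2,  6,  2, 8],
              [1, 10,  8, 9]], dtype=float)

y = np.array([100000, 200000, 450000, 550000], dtype=float)

print(X.shape)  # (4, 4): 4 houses, 4 features
print(y.shape)  # (4,): one price per house
```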
Simple linear regression:
The linear equation is:
`y = W0 + W1 \times X`
where W0 (the intercept) and W1 (the slope) are the weights.
The gradient descent update rule is:
While(not converged)
{
Wnew `= Wold - \eta\frac{\partial Loss}{\partial Wold}` (applied to each weight, here W0 and W1)
}
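As a minimal sketch of this loop in Python, assuming a mean-squared-error loss (the update rule above leaves the loss unspecified) and a fixed number of iterations as the stopping rule:

```python
import numpy as np

def simple_linear_gd(X, y, lr=0.05, n_iters=2000):
    """Gradient descent for y = W0 + W1 * X with a mean-squared-error loss."""
    W0, W1 = 0.0, 0.0
    n = len(X)
    for _ in range(n_iters):
        error = (W0 + W1 * X) - y          # prediction minus target
        dW0 = (2 / n) * error.sum()        # dLoss/dW0
        dW1 = (2 / n) * (error * X).sum()  # dLoss/dW1
        W0 -= lr * dW0                     # Wnew = Wold - lr * gradient
        W1 -= lr * dW1
    return W0, W1

# Toy data that roughly follows y = 1 + 2x
X = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([3.1, 4.9, 7.2, 9.1])
print(simple_linear_gd(X, y))  # approximately (1.0, 2.0)
```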
Multiple linear regression:
The linear equation is:
`y=W0+(W1 \times X1)+(W2 \times X2) + ... + (Wn \times Xn)`
where,
W0, W1, ..., Wn are the weights
n is the total number of features
Let's rewrite this equation in vector form:
`y = W0 + (W1 \times X1) + (W2 \times X2) + ... + (Wn \times Xn)`
`y = (W0 \times 1) + (W1 \times X1) + (W2 \times X2) + ... + (Wn \times Xn)`
Taking X0 = 1 (a constant feature that carries the intercept W0):
`y=(W0 \times X0)+(W1 \times X1)+... +(Wn \times Xn)`
$$ y = \begin{bmatrix}X0 & X1 & X2 & \cdots & Xn \end{bmatrix} \times \begin{bmatrix}W0 \\ W1 \\ W2 \\ \vdots \\ Wn \end{bmatrix}$$
$$ y = \begin{bmatrix}X0 & X1 & \cdots & Xn \end{bmatrix} \times \begin{bmatrix}W0 & W1 & \cdots & Wn \end{bmatrix}^\mathsf{T}$$
`y = W^\mathsf{T} \times X`
where W and X are now vectors. There are two types of vector products:
- Dot product.
- Cross product.
The dot product is the right one here: it multiplies corresponding entries of the two vectors and sums the results, producing the scalar y. Hence, the main equation for multiple linear regression is:
`y = W^\mathsf{T} \cdot X`
where,
W is the weight vector
X is the feature vector.
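For instance, with numpy the prediction is a single dot product (the weights and features below are made up purely for illustration):

```python
import numpy as np

W = np.array([5.0, 2.0, -1.0, 0.5])  # [W0, W1, W2, W3]
X = np.array([1.0, 3.0, 7.0, 2.0])   # [X0 = 1, X1, X2, X3]

# y = W^T . X: multiply matching entries and sum them
y = np.dot(W, X)
print(y)  # 5*1 + 2*3 + (-1)*7 + 0.5*2 = 5.0
```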
The gradient descent update rule now applies to every weight:
While(not converged)
{
W0 new `= W0 old - \eta\frac{\partial Loss}{\partial W0 old}`
W1 new `= W1 old - \eta\frac{\partial Loss}{\partial W1 old}`
...
Wn new `= Wn old - \eta\frac{\partial Loss}{\partial Wn old}`
}
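Rather than updating each Wi separately, the vector form lets us compute every partial derivative at once. A minimal numpy sketch, again assuming an MSE loss:

```python
import numpy as np

def multiple_linear_gd(X, y, lr=0.01, n_iters=5000):
    """Vectorized gradient descent for y = W^T . X with an MSE loss.

    X has shape (n_samples, n_features) and already contains the
    X0 = 1 column, so W[0] plays the role of W0."""
    n_samples, n_features = X.shape
    W = np.zeros(n_features)
    for _ in range(n_iters):
        error = X @ W - y                       # predictions minus targets
        grad = (2 / n_samples) * (X.T @ error)  # all dLoss/dWi in one step
        W -= lr * grad                          # update every weight together
    return W

# Toy data generated from y = 1 + 2*X1 + 3*X2 (first column is X0 = 1)
X = np.array([[1.0, 1.0, 2.0],
              [1.0, 2.0, 1.0],
              [1.0, 3.0, 3.0],
              [1.0, 4.0, 5.0]])
y = np.array([9.0, 8.0, 16.0, 24.0])
print(multiple_linear_gd(X, y))  # approximately [1. 2. 3.]
```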
Polynomial regression:
The relationship between X and y will not always be linear; sometimes the data follows a curved pattern. With reference to Fig 1, a curved function fits such data more closely than a straight line does. In such cases we model y as a polynomial in X. The main polynomial regression equation is:
`y = W0 + W1 \times X + W2 \times X^2 + ... + Wn \times X^n`
where,
y is the predicted value
W0, W1, ..., Wn are the weights
X is the single feature.
If you look closely, the same feature X appears in several terms, each raised to a different power and multiplied by its own weight. This polynomial equation is for a single feature X. The gradient descent update rule is the same as before:
While(not converged)
{
W0 new `= W0 old - \eta\frac{\partial Loss}{\partial W0 old}`
W1 new `= W1 old - \eta\frac{\partial Loss}{\partial W1 old}`
...
Wn new `= Wn old - \eta\frac{\partial Loss}{\partial Wn old}`
}
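A useful way to see this: since X, X^2, ..., X^n can be treated as separate columns, polynomial regression is just multiple linear regression on powers of X. The sketch below builds those columns and, for brevity, fits the weights with numpy's closed-form least squares instead of the gradient descent loop above (the toy data is made up):

```python
import numpy as np

def polynomial_features(x, degree):
    """Stack [1, x, x^2, ..., x^degree] as columns, turning polynomial
    regression into multiple linear regression on these columns."""
    return np.column_stack([x ** d for d in range(degree + 1)])

# Toy data that roughly follows y = 1 + 2x + 0.5x^2
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 3.4, 7.0, 11.6, 17.0])

X_poly = polynomial_features(x, degree=2)       # columns: 1, x, x^2
W, *_ = np.linalg.lstsq(X_poly, y, rcond=None)  # closed-form least squares
print(W)  # approximately [1.0, 2.0, 0.5]
```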
The equation above handles only one feature. What if there are multiple features, as in multiple linear regression? Polynomial regression with multiple features is known as multiple polynomial regression. Its equation combines the multiple linear regression and polynomial regression equations:
`y = W0 + W11 \times X1 + W12 \times X1^2 + ... + W1n \times X1^n`
`+ W21 \times X2 + W22 \times X2^2 + ... + W2n \times X2^n`
.....
`+ Wm1 \times Xm + Wm2 \times Xm^2 + ... + Wmn \times Xm^n`
where Wij is the weight on the j-th power of feature Xi, and m is the total number of features.
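As a sketch, the expanded feature matrix for this equation can be built column by column. Note it deliberately includes only per-feature powers, not cross-terms such as X1 × X2 (which a tool like scikit-learn's PolynomialFeatures would also add):

```python
import numpy as np

def per_feature_powers(X, degree):
    """For each feature column Xi, append Xi, Xi^2, ..., Xi^degree
    (plus a leading column of 1s for the bias W0)."""
    cols = [np.ones(X.shape[0])]
    for i in range(X.shape[1]):
        for d in range(1, degree + 1):
            cols.append(X[:, i] ** d)
    return np.column_stack(cols)

# Toy usage: 3 samples, 2 features, expanded up to degree 2
X = np.array([[1.0, 2.0],
              [2.0, 3.0],
              [3.0, 4.0]])
print(per_feature_powers(X, degree=2))
# columns: [1, X1, X1^2, X2, X2^2]
```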
Summary:
- Multiple linear regression equation: `y = W^\mathsf{T} \cdot X`
- Polynomial regression equation: `y = W0 + W1 \times X + W2 \times X^2 + ... + Wn \times X^n`
- Multiple polynomial regression equation:
`y = W0 + W11 \times X1 + W12 \times X1^2 + ... + W1n \times X1^n`
`+ W21 \times X2 + W22 \times X2^2 + ... + W2n \times X2^n`
.....
`+ Wm1 \times Xm + Wm2 \times Xm^2 + ... + Wmn \times Xm^n`
-Santosh Saxena

