**Multiple Regression Analysis**

**Concept of Multiple Regression Analysis:**

In simple regression analysis, a regression model is fitted between one dependent and one independent variable, either to estimate the dependent variable or to measure the effect of the independent variable on the dependent variable. When a regression model is fitted between one dependent and more than one independent variable, it is called a multiple regression model. In multiple regression analysis, the effects of several independent variables on the dependent variable can be measured simultaneously.

Let ‘y’ be the dependent variable and x₁, x₂, x₃, …, xₖ be the ‘k’ independent variables. Then the multiple regression model is defined as

y = β₀ + β₁x₁ + β₂x₂ + β₃x₃ + … + βₖxₖ + e ………… (1)

Where

y = dependent variable.

x₁, x₂, x₃, …, xₖ = independent variables.

β₀ = y-intercept.

β₁ = slope of y with respect to x₁, holding the remaining variables x₂, x₃, …, xₖ constant, i.e. the regression coefficient of y on x₁ holding the remaining variables constant.

β₂ = slope of y with respect to x₂, holding the remaining variables x₁, x₃, …, xₖ constant, i.e. the regression coefficient of y on x₂ holding the remaining variables constant.

And so on. Similarly,

βₖ = slope of y with respect to xₖ, holding the remaining variables x₁, x₂, x₃, …, xₖ₋₁ constant, i.e. the regression coefficient of y on xₖ holding the remaining variables constant.

e = error term or residual.
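As a quick numerical sketch of model (1), the coefficients β₀, β₁, …, βₖ can be estimated by least squares on a design matrix whose first column is all ones (for the intercept). The data and coefficient values below are hypothetical, chosen only for illustration:

```python
import numpy as np

# Hypothetical data: n = 6 observations of k = 3 independent variables.
rng = np.random.default_rng(0)
X = rng.normal(size=(6, 3))                  # columns are x1, x2, x3
beta_true = np.array([1.0, 2.0, -1.5, 0.5])  # assumed [β0, β1, β2, β3]
y = beta_true[0] + X @ beta_true[1:]         # y = β0 + β1x1 + β2x2 + β3x3 (no noise)

# Fit y = β0 + β1x1 + ... + βkxk by least squares on [1, x1, ..., xk].
design = np.column_stack([np.ones(len(y)), X])
beta_hat, *_ = np.linalg.lstsq(design, y, rcond=None)
print(beta_hat)  # with noiseless data, this recovers beta_true
```

Because the sketch generates y without an error term, the estimated coefficients reproduce the assumed ones exactly; with real data the estimates would differ from the true parameters.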

**Multiple Regression Model with Two Independent Variables:**

Let ‘y’ be the dependent variable and x₁ and x₂ be two independent variables. The multiple regression model with two independent variables is defined as

y = β₀ + β₁x₁ + β₂x₂ + e ………… (2)

Where

y = dependent variable.

x₁ and x₂ are independent variables.

β₀ = y-intercept.

β₁ = slope of y with respect to x₁, holding x₂ constant, i.e. the regression coefficient of y on x₁ holding x₂ constant.

β₂ = slope of y with respect to x₂, holding x₁ constant, i.e. the regression coefficient of y on x₂ holding x₁ constant.

e = error term or residual.

To fit the regression model (2), we have to estimate the values of β₀, β₁, and β₂. These parameters are estimated using the principle of least squares. The method of least squares yields the following three normal equations for equation (2):

Σy = nb₀ + b₁Σx₁ + b₂Σx₂

Σx₁y = b₀Σx₁ + b₁Σx₁² + b₂Σx₁x₂

Σx₂y = b₀Σx₂ + b₁Σx₁x₂ + b₂Σx₂²

Solving these three normal equations gives the estimates b₀, b₁, and b₂ of β₀, β₁, and β₂. Hence the fitted multiple regression model is

ŷ = b₀ + b₁x₁ + b₂x₂

Where,

ŷ = estimated value of the dependent variable for given values of the independent variables.

b₀ = y-intercept (the estimate of β₀).

b₁ = regression coefficient of y on x₁ holding the effect of x₂ constant (the estimate of β₁).

b₂ = regression coefficient of y on x₂ holding the effect of x₁ constant (the estimate of β₂).
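As a sketch, the three normal equations above can be solved numerically for b₀, b₁, and b₂. The sample data here are hypothetical; as a cross-check, the same estimates come from least squares on the design matrix [1, x₁, x₂]:

```python
import numpy as np

# Hypothetical sample data: y depends on two predictors x1 and x2.
x1 = np.array([2.0, 4.0, 5.0, 7.0, 8.0])
x2 = np.array([1.0, 3.0, 2.0, 5.0, 4.0])
y  = np.array([5.0, 10.0, 11.0, 18.0, 19.0])
n = len(y)

# Build the 3x3 system A @ [b0, b1, b2] = c from the normal equations:
#   Σy   = n·b0    + b1·Σx1   + b2·Σx2
#   Σx1y = b0·Σx1  + b1·Σx1²  + b2·Σx1x2
#   Σx2y = b0·Σx2  + b1·Σx1x2 + b2·Σx2²
A = np.array([
    [n,         x1.sum(),       x2.sum()],
    [x1.sum(),  (x1**2).sum(),  (x1*x2).sum()],
    [x2.sum(),  (x1*x2).sum(),  (x2**2).sum()],
])
c = np.array([y.sum(), (x1*y).sum(), (x2*y).sum()])

b0, b1, b2 = np.linalg.solve(A, c)
print(b0, b1, b2)

# Cross-check: least squares on the design matrix [1, x1, x2] agrees.
X = np.column_stack([np.ones(n), x1, x2])
print(np.linalg.lstsq(X, y, rcond=None)[0])
```

Solving the normal equations directly and running least squares on the design matrix are algebraically the same computation, which is why the two printed coefficient vectors match.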

## Interpreting the multiple regression coefficients:

Suppose we have the following fitted multiple regression model:

ŷ = 20 + 2.5x₁ − 3.8x₂

1. The y-intercept b₀ represents the average value of the dependent variable when the values of the independent variables are zero, i.e. x₁ = x₂ = 0. Here b₀ = 20, which means the average value of the dependent variable is 20 when x₁ = x₂ = 0.

2. The regression coefficient b₁ measures the average rate of increase or decrease in the dependent variable (y) when the independent variable x₁ is increased by one unit, keeping the effect of the other independent variable x₂ constant. Here b₁ = 2.5, which means the value of the dependent variable (y) increases by 2.5, on average, when x₁ is increased by 1, keeping the effect of x₂ constant.

3. The regression coefficient b₂ measures the average rate of increase or decrease in the dependent variable (y) when the independent variable x₂ is increased by one unit, keeping the effect of the other independent variable x₁ constant. Here b₂ = −3.8, which means the value of the dependent variable (y) decreases by 3.8, on average, when x₂ is increased by 1, keeping the effect of x₁ constant.
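These interpretations can be checked numerically with the fitted example model above. The baseline point x₁ = 4, x₂ = 3 is arbitrary; any baseline gives the same one-unit changes because the model is linear:

```python
# Fitted example model from the text: ŷ = 20 + 2.5·x1 − 3.8·x2
def y_hat(x1, x2):
    return 20 + 2.5 * x1 - 3.8 * x2

base = y_hat(4, 3)  # arbitrary (hypothetical) baseline point

# Increase x1 by one unit, holding x2 constant: ŷ rises by b1 = 2.5.
print(y_hat(5, 3) - base)

# Increase x2 by one unit, holding x1 constant: ŷ falls by 3.8 (b2 = −3.8).
print(y_hat(4, 4) - base)
```

The printed differences equal the coefficients (up to floating-point rounding), which is exactly the "one-unit change, other variable held constant" reading of b₁ and b₂.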

## Error term or Residual:

The difference between the observed and the estimated value of the dependent variable (y) is called the error, or residual, and it is denoted by ‘e’:

e = y − ŷ

Where

e = error term or residual.

y = observed value of the dependent variable.

ŷ = estimated value of the dependent variable for a given set of values of the independent variables.
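A minimal sketch of computing residuals, assuming the fitted coefficients b₀ = 20, b₁ = 2.5, b₂ = −3.8 from the example above and a small set of made-up observations:

```python
import numpy as np

# Hypothetical fitted model ŷ = b0 + b1·x1 + b2·x2
b0, b1, b2 = 20.0, 2.5, -3.8

# Hypothetical observed data.
x1 = np.array([1.0, 2.0, 3.0])
x2 = np.array([2.0, 1.0, 4.0])
y  = np.array([15.0, 22.0, 13.0])   # observed values of the dependent variable

y_hat = b0 + b1 * x1 + b2 * x2      # estimated values ŷ
e = y - y_hat                       # residuals: observed minus estimated
print(e)
```

Each entry of `e` is one observation's residual; adding the residual back to the estimate recovers the observed value exactly, since e = y − ŷ is just a rearrangement of y = ŷ + e.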
