t Tests on Regression Equations

One of the advantages of regression equations is that we can run t tests on the individual regression coefficients. A t test helps us determine whether the coefficient of the associated independent variable is significant or not. Through t tests we can understand whether a particular independent variable has an effect on the output, holding the other independent variables constant.

We start a t test by assuming that the coefficient is equal to zero, so that we can measure how far the estimated coefficient is from zero, regardless of sign (the difference can be negative or positive). If the test statistic, in absolute value, is lower than the critical value given in the t-distribution table, we conclude that this particular coefficient is insignificant for the output; otherwise it is accepted as significant. Therefore our hypotheses can be written as:

H0 : Bi = 0,     H1 : Bi ≠ 0       

where H0 (the null hypothesis) assumes Bi is zero, whereas H1 (the alternative hypothesis) assumes Bi is different from zero. Since we don't care about the sign of the difference, this is a two-tailed t test, and t is defined as:

t = (Bi − 0) / SE(Bi) = Bi / SE(Bi)

where Bi is the estimated coefficient and SE(Bi) is its standard error.

Once we calculate a t value for each B in the regression equation, we look up the critical t value in the table and decide whether each coefficient's |t| exceeds it or not. Coefficients whose |t| falls below the critical value are accepted as insignificant; the others are significant.
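The procedure above can be sketched in Python with NumPy and SciPy. This is a minimal illustration, not from the original post: the data, variable names, and the 5% significance level are all assumptions. We generate a response that truly depends on x1 but not on x2, fit ordinary least squares, compute SE(Bi) from the diagonal of s²(X'X)⁻¹, and compare each |t| against the two-tailed critical value.

```python
import numpy as np
from scipy import stats

# Hypothetical data (assumption for illustration): y depends on x1 but not x2.
rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 2.0 + 3.0 * x1 + rng.normal(scale=0.5, size=n)

# Design matrix with an intercept column.
X = np.column_stack([np.ones(n), x1, x2])
k = X.shape[1]

# OLS coefficient estimates: B = (X'X)^-1 X'y
XtX_inv = np.linalg.inv(X.T @ X)
B = XtX_inv @ X.T @ y

# Residual variance with n - k degrees of freedom.
resid = y - X @ B
s2 = resid @ resid / (n - k)

# Standard error of each coefficient: sqrt of the diagonal of s^2 (X'X)^-1.
se = np.sqrt(s2 * np.diag(XtX_inv))

# t statistic for H0: Bi = 0.
t = B / se

# Two-tailed critical value at the (assumed) 5% significance level.
t_crit = stats.t.ppf(0.975, df=n - k)

for name, ti in zip(["intercept", "x1", "x2"], t):
    verdict = "significant" if abs(ti) > t_crit else "insignificant"
    print(f"{name}: t = {ti:.2f} -> {verdict}")
```

With data generated this way, x1's |t| should comfortably exceed the critical value while x2's typically does not, matching the decision rule described above.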
