Q: Constructing the Regression Equation to calculate the Change in R Square ( No Answer,   6 Comments )
Question  
Subject: Constructing the Regression Equation to calculate the Change in R Square
Category: Reference, Education and News > Teaching and Research
Asked by: marsbrook-ga
List Price: $100.00
Posted: 16 Nov 2003 12:08 PST
Expires: 16 Dec 2003 12:08 PST
Question ID: 276469
I am trying to create a series of multiple regression equations in
order to use a number of independent variables to predict the possible
change in a particular dependent variable, using SPSS 11.0 as the
statistical software package.

The process I have employed so far is to introduce a series of models
one after the other, as follows:

Model 1. This first model would include only five independent control
variables, regressed against the dependent variable. This would give a
series of results, including the following key tables:

Model Summary (columns):
Model, R, R Square, Adjusted R Square, Std. Error of the Estimate,
and the Change Statistics: R Square Change, F Change, df1, df2, Sig. F Change.

ANOVA (rows: Regression, Residual, Total; columns):
Sum of Squares, df, Mean Square, F, Sig.

  
Model 2. The second model would introduce the first key independent
variable (key variable 1) immediately after the control variables, in
a second regression equation with the same dependent variable used in
Model 1. There would be another series of output tables, including the
two shown above; this time, of course, they would be expected to have
different figures in each column.

Model 3. The third model would then introduce the second key
independent variable (key variable 2) in place of key variable 1 from
Model 2, and once again we would be monitoring the change.

Model 4. The fourth model would introduce both key independent
variables 1 and 2 and assess the change in the dependent variable when
both of these key independent variables are introduced together.

What I am trying to determine is how to structure each model so that I
am able to calculate the change in R Square which would result from
the introduction of each new variable in succeeding models.
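
For anyone who wants to check the same logic outside SPSS, a minimal
Python/statsmodels sketch of the four-model hierarchy might look like the
following; the variable names y, c1 to c5, x1, x2, the file data.csv and
the DataFrame df are hypothetical placeholders, not part of the SPSS setup.

# Illustrative sketch of the four models described above
# (hypothetical column names y, c1..c5, x1, x2 in a pandas DataFrame df).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("data.csv")            # assumed data file with the columns below
controls = "c1 + c2 + c3 + c4 + c5"     # the five control variables

m1 = smf.ols("y ~ " + controls, data=df).fit()                  # Model 1: controls only
m2 = smf.ols("y ~ " + controls + " + x1", data=df).fit()        # Model 2: add key variable 1
m3 = smf.ols("y ~ " + controls + " + x2", data=df).fit()        # Model 3: add key variable 2
m4 = smf.ols("y ~ " + controls + " + x1 + x2", data=df).fit()   # Model 4: add both

print("R Square change, Model 2 vs Model 1:", m2.rsquared - m1.rsquared)
print("R Square change, Model 4 vs Model 1:", m4.rsquared - m1.rsquared)

# The significance of each change (SPSS's "Sig. F Change") is a partial F-test:
f_value, p_value, df_diff = m2.compare_f_test(m1)
print("F Change:", f_value, "Sig. F Change:", p_value)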

Clarification of Question by marsbrook-ga on 25 Nov 2003 13:43 PST
Before we close this, let me pose an observation. When we calculate the
value for F and compare it with the table, the result merely tells us
whether the change is significant or not. But we already know this from
the SPSS output. If you recall, in my first example I showed that F was
not significant in either Model 1 or Model 2. On the other hand, I also
showed that in my second example F was not significant in Model 1,
since the ANOVA table shows the significance as (.350). However, when
the additional variable was added in this example, F was shown as
significant, with an ANOVA table reading of (.038). Why would it then
be necessary to do a manual calculation?

I think I understand the implication of identifying the value of F,
but my original question still remains unanswered. It is as follows:
when we do the regression in SPSS, is there a way of showing a
numerical change in R Squared other than subtracting the R Squared
value shown in the regression output for Model 1 from the
corresponding R Squared value shown in the regression output for
Model 2? Or have I now totally confused the issue?

Clarification of Question by marsbrook-ga on 26 Nov 2003 19:04 PST
I wish to thank you so much for your assistance in this. I believe
that with your last explanation, I have finally got this problem
solved. Therefore, I would invite you to post it as your official
answer so that you may receive the funds which I posted and which you
richly deserve.

Clarification of Question by marsbrook-ga on 05 Dec 2003 08:22 PST
Just because you have been so helpful, I thought I'd share some
additional information which I just came accross. If you look at SPSS
for Windows Base System User's Guide, Release 6.0, page 343/4, you
will see that it shows that the Change in R squared for a particular
variable, is the value of R squared after you entered that variable,
less the value of R squared in the model before you enter the
variable. It may also be derived by squaring the Part Correlation in
the Coefficients table in the SPSS Output. Additionally, SPSS
calculated the Change in R squared by simply allowing you to enter
additional variables in a stepwise fashion. You merely click on
Analyse, Regression, Linear, and input the first set of independent
variables along with your dependent variable in the appropriate boxes.
You then click on Next, put in your new independent variable, click on
statistics and select R square change. Click continue and O.K. The
SPSS output table then gives you the original value of R squared and
the change in R Squared. This is what I was searching for.
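
As a small numerical check of the two routes just described (subtracting
the two R squared values, or squaring the Part Correlation of the added
variable), the following Python sketch runs both on simulated data; the
simulation and the numpy/statsmodels usage are assumptions for
illustration only, not SPSS output.

# Check that R squared change equals the squared part (semipartial) correlation.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
controls = rng.normal(size=(n, 5))      # five control variables
x_new = rng.normal(size=n)              # the variable being added
y = controls @ np.array([0.3, -0.2, 0.1, 0.0, 0.2]) + 0.4 * x_new + rng.normal(size=n)

X_before = sm.add_constant(controls)
X_after = sm.add_constant(np.column_stack([controls, x_new]))

r2_before = sm.OLS(y, X_before).fit().rsquared
r2_after = sm.OLS(y, X_after).fit().rsquared
print("R squared change by subtraction:", r2_after - r2_before)

# Part correlation: correlate y with the part of the new variable that the
# controls cannot explain, then square it.
resid_x = sm.OLS(x_new, X_before).fit().resid
part_corr = np.corrcoef(y, resid_x)[0, 1]
print("Squared part correlation:       ", part_corr ** 2)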
Answer  
There is no answer at this time.

Comments  
Subject: Re: Constructing the Regression Equation to calculate the Change in R Square
From: czh-ga on 16 Nov 2003 17:29 PST
 
See related question:

http://answers.google.com/answers/threadview?id=275110
Q: Calculating the change in R squared in Multiple Regression
Subject: I think this is the solution, I hope this helps
From: rexdog979-ga on 20 Nov 2003 14:42 PST
 
Hey friend,
Your question seems very clear.  I assume you have a regression model
with p variables, and then you add a few variables to it to get k
variables.  Obviously k > p.  We will call the first model the reduced
model, since it has fewer terms; the second model will be the complete
model, since it has all the variables.
For each model you should have an R^2.  For the reduced model we'll
call it Rr^2, and for the complete model we'll call it Rc^2.  n = the
sample size.  With this information we can plug it into a simple
equation, solving for F.

F= [(Rc^2-Rr^2)/(k-p)] / [(1-Rc^2)/(n-(k+1))]

We then compare that F value to the critical value F(k-p, n-(k+1))
from an F-table.

I'm not sure if you're familiar with F-tests, but F-tables are in the
back of most statistics textbooks.  If you're looking at the table,
there is a v1 that you read across and a v2 that you read down:
v1 = k-p and v2 = n-(k+1).
So now you have the F that you computed and the F you found in the
textbook.  If the F that you computed is greater than the F in the
textbook, then you can be confident that at least one of the newly
introduced variables made an impact.
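
One possible way to automate this comparison, assuming Python with scipy
is available (a printed F-table works just as well), is:

# Compute the F for the change and compare it with the critical value.
from scipy import stats

def f_change(Rr2, Rc2, p, k, n, alpha=0.05):
    """F statistic for adding (k - p) variables, its critical value, and its p-value."""
    f = ((Rc2 - Rr2) / (k - p)) / ((1 - Rc2) / (n - (k + 1)))
    f_crit = stats.f.ppf(1 - alpha, dfn=k - p, dfd=n - (k + 1))
    p_value = stats.f.sf(f, dfn=k - p, dfd=n - (k + 1))
    return f, f_crit, p_value

# Example call with made-up numbers: reduced R^2 = .20, complete R^2 = .30,
# p = 5, k = 7, n = 100.
print(f_change(0.20, 0.30, 5, 7, 100))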

If you need any more help, particularly with how to read an F-table in
a book, just give me k, p, and n and I can do it in a few seconds.
Also, if you need any more clarification on your answer, just ask.

Finally, there might be other ways to test new variables being
introduced, through various t-tests.  Anyway, hope this helps.
Subject: Re: Constructing the Regression Equation to calculate the Change in R Square
From: marsbrook-ga on 22 Nov 2003 15:40 PST
 
This looks like the solution to the problem. However, to check it out,
here are some concrete figures. In the first model, which you term the
reduced model, there are 5 variables, i.e. p=5, and the SPSS Model
Summary gives an R squared of .021.  In the second model, which you
term the complete model, there are 6 variables, i.e. k=6, and the SPSS
Model Summary gives an R Squared of .044. The sample size n=99. Can
you derive a solution based on this information?

By the way, the SPSS output also gives a value for F in each
equation. In Model 1, F=.398 and in Model 2, F=.715. However, the
result was not significant in either case.

Now here is another example.

In the first model, which you term the reduced model, there are again
5 variables, i.e. p=5, and the SPSS Model Summary gives an R squared
of .056.  In the second model, which you term the complete model,
there are again 6 variables, i.e. k=6, and the SPSS Model Summary
gives an R Squared of .130. The sample size n=100.

The SPSS output value for F in Model 1 is F=1.130, Sig. (.350), and in
Model 2, F=2.331, Sig. (.038).

Can you derive a solution based on this information?
Subject: Re: Constructing the Regression Equation to calculate the Change in R Square
From: rexdog979-ga on 23 Nov 2003 21:36 PST
 
I did the calculations for you (mind you, after a long weekend), and I
got some results:
For the first example, I got an F value of 2.213, which is less than
2.76, the critical value at the 10% level.  Therefore the extra
variable is not significant.

For the second example, I got an F value of 7.9144, which is
significant at 1%, so it is very significant.  Most of the time you
compare against significance at 10%, 5%, and 1%; I would assume only
things such as drug testing are done at more stringent levels.

Simply put, adding the variable in the first example is not significant.
Adding the variable in the second is very significant.
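
For anyone who wants to reproduce these figures, the numbers from the
previous comment can be plugged straight into the earlier formula; a
short Python check (scipy assumed) gives essentially the same values:

# Re-run the two examples: p = 5, k = 6, so df1 = 1 and df2 = n - 7.
from scipy import stats

for label, Rr2, Rc2, n in [("Example 1", 0.021, 0.044, 99),
                           ("Example 2", 0.056, 0.130, 100)]:
    p, k = 5, 6
    f = ((Rc2 - Rr2) / (k - p)) / ((1 - Rc2) / (n - (k + 1)))
    p_value = stats.f.sf(f, dfn=k - p, dfd=n - (k + 1))
    print(label, "F =", round(f, 3), "p =", round(p_value, 3))
# Prints roughly F = 2.21 (not significant) and F = 7.91 (significant at 1%).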
Subject: Re: Constructing the Regression Equation to calculate the Change in R Square
From: rexdog979-ga on 25 Nov 2003 16:19 PST
 
It can be a bit confusing.

First, looking at Model 1 and Model 2, they are both very weak models.
R squared values of .044 and .130 are low; these values represent how
much of the variation in your y variable the model explains.  Usually
you would want values of .500 and higher, meaning the model explains
at least 50% of the variability.  So you should not even proceed any
further with these models, because they are so poor.

Second, even if you do continue with these models, you should be able
to tell which variables are significant from the t-tests.  You will
see the t-value on the SPSS printout and, next to it, a p-value.  This
is the simplest way to see whether the variables you are adding are
significant.
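
A sketch of that route outside SPSS might look like the following,
assuming statsmodels and hypothetical variable names; the coefficient
table carries the same t-value and p-value as the SPSS printout:

# t-test on the newly added variable's coefficient (names y, c1..c5, x1 and
# the DataFrame df are hypothetical placeholders).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("data.csv")                                    # assumed data file
m2 = smf.ols("y ~ c1 + c2 + c3 + c4 + c5 + x1", data=df).fit()  # model with the added variable
print(m2.tvalues["x1"], m2.pvalues["x1"])                       # t and Sig. for the new variable
print(m2.summary())                                             # full coefficient table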

Finally, I will answer your question about observing the change in R
squared with the addition of new variables.  When looking at these
models it is best not to rely on R squared alone, because R squared
automatically increases with the addition of new variables.
Instead, you should look at the adjusted R-squared.  This "adjusts"
the formula to account for the number of variables you are using.  If
you notice, the adjusted R-squared is located next to the R-squared on
the SPSS printout.  It is the adjusted R-squared values of the two
models that you subtract to see the impact of the new variables.
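
For reference, the adjustment is a one-line formula; the small sketch
below (plain Python, reusing the R squared of .130 with k = 6 predictors
and n = 100 from the second example above) shows the penalty it applies:

# Adjusted R squared penalises R squared for the number of predictors k,
# given the sample size n.
def adjusted_r_squared(r2, n, k):
    return 1 - (1 - r2) * (n - 1) / (n - k - 1)

# Second example's complete model: R squared .130, k = 6 predictors, n = 100.
print(adjusted_r_squared(0.130, 100, 6))   # about 0.074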

(As a final aside, the F-test I showed you is a useful tool when
adding more than one variable to a regression model.  It is a much
different F-test than the one you are reading from the SPSS printout.
However, the F-test I did is not necessary when you are adding only
one variable to a model; in that case it suffices to simply look at
the p-value of the t-score for that new variable.  Again, I hope this
helps.  I find regression to be quite interesting, so if you need more
help on your project or any other projects, feel free to ask for help.)
Subject: Re: Constructing the Regression Equation to calculate the Change in R Square
From: rexdog979-ga on 26 Nov 2003 22:27 PST
 
I'm not actually able to post an official answer, since I'm not working
for Google.  I'm just a college student who found an answer to a
question for my senior thesis on this service.  It was a pleasure helping you
and putting my statistics major to use.  I find regression to be
interesting and am glad to help.  I'll check back every now and then,
just in case you begin any other studies that I can be of assistance
with.  Have a good weekend.
