Chapter 16: Model Building
MULTIPLE CHOICE
1. A model in the form of y = β0 + β1z1 + β2z2 + . . . + βpzp + ε, where each independent variable zj (for j =
1, 2, . . ., p) is a function of xj, is known as the
a. general linear model
b. general curvilinear model
c. multiplicative model
d. multiplicative curvilinear model
ANS: A
PTS: 1
A model in the form of y = β0 + β1x1 + ε is referred to as a
a. curvilinear model
b. curvilinear model with one predictor variable
c. simple second-order model with one predictor variable
d. simple first-order model with one predictor variable
ANS: D
PTS: 1
5. In multiple regression analysis, the word linear in the term "general linear model" refers to the fact
that
a. β0, β1, . . ., βp all have exponents of 0
b. β0, β1, . . ., βp all have exponents of 1
c. β0, β1, . . ., βp all have exponents of at least 1
d. β0, β1, . . ., βp all have exponents of less than 1
ANS: B
PTS: 1
6. Serial correlation is
a.
b.
c.
d.
ANS: B
PTS: 1
9. Which of the following tests is used to determine whether an additional variable makes a significant
contribution to a multiple regression model?
a. a t test
b. a Z test
c. an F test
d. a chi-square test
ANS: C
PTS: 1
10. A variable such as Z, whose value is Z = X1X2, is added to a general linear model in order to account
for the potential effects of two variables X1 and X2 acting together. This type of effect is
a. impossible to occur
b. called interaction
c. called multicollinearity effect
d. called transformation effect
ANS: B
PTS: 1
A model in the form of y = β0 + β1x1 + β2x1² + ε is known as a
a. first-order model with one predictor variable
b. second-order model with two predictor variables
c. second-order model with one predictor variable
d. None of these alternatives is correct.
ANS: C
PTS: 1
b. larger than 1
c. larger than 2
d. larger than 3
ANS: B
PTS: 1
15. The correlation in error terms that arises when the error terms at successive points in time are related is
termed
a. leverage
b. multicorrelation
c. autocorrelation
d. parallel correlation
ANS: C
PTS: 1
17. When dealing with the problem of non-constant variance, the reciprocal transformation means using
a. 1/X as the independent variable instead of X
b. X2 as the independent variable instead of X
c. Y2 as the dependent variable instead of Y
d. 1/Y as the dependent variable instead of Y
ANS: D
PTS: 1
NARRBEGIN: Exhibit 16-1
Exhibit 16-1
In a regression model involving 3 independent variables and 25 observations, the following standard errors and sums of squares were obtained.
Sb1 = 3    Sb2 = 6    Sb3 = 7
SST = 4,800    SSE = 1,296
NARREND
18. Refer to Exhibit 16-1. If you want to determine whether or not the coefficients of the independent
variables are significant, the critical value of the t statistic at α = 0.05 is
a. 2.080
b. 2.060
c. 2.064
d. 1.96
ANS: A
PTS: 1
23. Refer to Exhibit 16-1. If we are interested in testing for the significance of the relationship among the
variables (i.e., significance of the model), the critical value of F at α = 0.05 is
a. 2.76
b. 2.78
c. 3.10
d. 3.07
ANS: D
PTS: 1
24. Refer to Exhibit 16-1. The test statistic for testing the significance of the model is
a. 0.730
b. 18.926
c. 3.703
d. 1.369
ANS: B
PTS: 1
25. Refer to Exhibit 16-1. The p-value for testing the significance of the regression model is
a. less than 0.01
b. between 0.01 and 0.025
c. between 0.025 and 0.05
d. between 0.05 and 0.1
ANS: A
PTS: 1
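A minimal Python sketch of the Exhibit 16-1 arithmetic behind questions 18-25, assuming n = 25 observations and p = 3 independent variables (the values implied by the 21 error degrees of freedom used in the answers):

# Exhibit 16-1: SST = 4,800, SSE = 1,296; n = 25 and p = 3 are assumptions (see above).
from scipy import stats

n, p = 25, 3
sst, sse = 4800, 1296
ssr = sst - sse                                # 3,504
msr, mse = ssr / p, sse / (n - p - 1)          # 1,168 and about 61.7
f_stat = msr / mse                             # about 18.93 (question 24)
t_crit = stats.t.ppf(0.975, n - p - 1)         # about 2.080 (question 18)
f_crit = stats.f.ppf(0.95, p, n - p - 1)       # about 3.07  (question 23)
p_value = stats.f.sf(f_stat, p, n - p - 1)     # well below 0.01 (question 25)
print(f_stat, t_crit, f_crit, p_value)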
NARRBEGIN: Exhibit 16-2
Exhibit 16-2
A regression model involving 5 independent variables was estimated from a sample of 30 observations.
NARREND
27. Refer to Exhibit 16-2. The degrees of freedom associated with SSR are
a. 24
b. 6
c. 19
d. 5
ANS: D
PTS: 1
28. Refer to Exhibit 16-2. The degrees of freedom associated with SSE are
a. 24
b. 6
c. 19
d. 5
ANS: A
PTS: 1
29. Refer to Exhibit 16-2. The degrees of freedom associated with SST are
a. 24
b. 6
c. 19
d. 29
ANS: D
PTS: 1
32. Refer to Exhibit 16-2. The test statistic F for testing the significance of the above model is
a. 32.12
b. 6.69
c. 4.8
d. 58
ANS: A
PTS: 1
33. Refer to Exhibit 16-2. The p-value for testing the significance of the regression model is
a. less than 0.01
b. between 0.01 and 0.025
c. between 0.025 and 0.05
d. between 0.05 and 0.1
ANS: A
PTS: 1
34. Refer to Exhibit 16-2. The coefficient of determination for this model is
a. 0.6923
b. 0.1494
c. 0.1300
d. 0.8700
ANS: D
PTS: 1
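A short Python check of the Exhibit 16-2 questions, assuming n = 30 observations and p = 5 independent variables (the values implied by the degrees of freedom in questions 27 and 28); the overall F statistic can be obtained directly from the coefficient of determination:

# Exhibit 16-2: assumed n = 30, p = 5; R-square = 0.87 is the answer to question 34.
n, p = 30, 5
r_square = 0.87

df_regression = p          # 5  (question 27)
df_error = n - p - 1       # 24 (question 28)
df_total = n - 1           # 29

# F written in terms of R-square: F = (R^2 / p) / ((1 - R^2) / (n - p - 1))
f_stat = (r_square / p) / ((1 - r_square) / df_error)   # about 32.12 (question 32)
print(df_regression, df_error, df_total, round(f_stat, 2))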
NARRBEGIN: Exhibit 16-3
Exhibit 16-3
Below you are given a partial computer output based on a sample of 25 observations, relating a dependent variable to three independent variables X1, X2, and X3.

                Coefficient    Standard Error
Intercept           145              29
X1                   20               5
X2                  -18               6
X3                    4               4
NARREND
36. Refer to Exhibit 16-3. We want to test whether the parameter β2 is significant. The test statistic equals
a. 4
b. 5
c. 3
d. -3
ANS: D
PTS: 1
37. Refer to Exhibit 16-3. The critical t value obtained from the table to test an individual parameter at the
5% level is
a. 2.06
b. 2.069
c. 2.074
d. 2.080
ANS: D
PTS: 1
NARRBEGIN: Exhibit 16-4
Exhibit 16-4
In a laboratory experiment, data on the life expectancy of rats (Y, in months), their daily protein intake (X1, in units), and whether or not agent X2 (a proposed life-extending agent) was added to their diet (X2 = 0 if the agent was not added and X2 = 1 if it was added) were gathered from a sample of 33 rats. The following estimated regression equation was developed.
ŷ = 36 + 0.8X1 - 1.7X2
NARREND
39. Refer to Exhibit 16-4. The life expectancy of a rat that was given 3 units of protein daily and that took
agent X2 is
a. 36.7
b. 36
c. 49
d. 38.4
ANS: A
PTS: 1
40. Refer to Exhibit 16-4. The life expectancy of a rat that was not given any protein and that did not take
agent X2 is
a. 36.7
b. 34.3
c. 36
d. 38.4
ANS: C
PTS: 1
41. Refer to Exhibit 16-4. The life expectancy of a rat that was given 2 units of agent X2 daily, but was not
given any protein is
a. 32.6
b. 36
c. 38
d. 34.3
ANS: D
PTS: 1
42. Refer to Exhibit 16-4. The degrees of freedom associated with SSR are
a. 2
b. 33
c. 32
d. 30
ANS: A
PTS: 1
43. Refer to Exhibit 16-4. The degrees of freedom associated with SSE are
a. 3
b. 33
c. 32
d. 30
ANS: D
PTS: 1
45. Refer to Exhibit 16-4. If we want to test for the significance of the model, the critical value of F at
95% confidence is
a. 4.17
b. 3.32
c. 2.92
d. 1.96
ANS: B
PTS: 1
46. Refer to Exhibit 16-4. The test statistic for testing the significance of the model is
a. 0.50
b. 5.00
c. 0.25
d. 0.33
ANS: B
PTS: 1
47. Refer to Exhibit 16-4. The p-value for testing the significance of the regression model is
a. less than 0.01
b. between 0.01 and 0.025
c. between 0.025 and 0.05
d. between 0.05 and 0.10
ANS: B
PTS: 1
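A small Python sketch of the Exhibit 16-4 computations for questions 39-47, using the equation reconstructed in the exhibit above (ŷ = 36 + 0.8X1 - 1.7X2) and the degrees of freedom implied by the answers (n = 33 rats, p = 2 independent variables):

# Predictions from the dummy-variable regression model, plus the model degrees of freedom.
def life_expectancy(protein_units, took_agent):
    # took_agent is 1 if agent X2 was added to the diet, 0 otherwise
    return 36 + 0.8 * protein_units - 1.7 * took_agent

print(life_expectancy(3, 1))   # 36.7 (question 39)
print(life_expectancy(0, 0))   # 36.0 (question 40)
print(life_expectancy(0, 1))   # 34.3 (question 41)

n, p = 33, 2
print(p, n - p - 1)            # 2 and 30 (questions 42 and 43)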
PROBLEM
1. In a regression analysis involving 21 observations and 4 independent variables, the following
information was obtained.
R² = 0.80
S = 5.0
Based on the above information, fill in all the blanks in the following ANOVA table.
Hint: SSE = (n - p - 1)MSE, but also R² = 1 - SSE/SST.

Source               DF        SS        MS        F
Regression         _____?    _____?    _____?    _____?
Error (Residual)   _____?    _____?    _____?
Total              _____?    _____?
ANS:

Source of Variation     DF      SS       MS      F
Regression               4    1,600     400     16
Error (Residual)        16      400      25
Total                   20    2,000
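A minimal Python sketch of the arithmetic behind the completed table, using the given values R² = 0.80, S = 5.0, n = 21, and p = 4 independent variables:

# Reconstruct the ANOVA table from R-square, the standard error of estimate,
# the sample size, and the number of independent variables.
r_square = 0.80   # given
s = 5.0           # standard error of the estimate, given
n, p = 21, 4      # observations, independent variables

df_error = n - p - 1          # 16
mse = s ** 2                  # 25
sse = mse * df_error          # 400
sst = sse / (1 - r_square)    # 2,000
ssr = sst - sse               # 1,600
msr = ssr / p                 # 400
f_stat = msr / mse            # 16
print(df_error, sse, sst, ssr, msr, f_stat)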
PTS: 1
2. We are interested in determining what type of model best describes the relationship between two
variables x and y.
a.
For a given data set, an estimated regression equation relating x and y of the form
ŷ = b0 + b1x was developed, using Excel. The results are shown below. Comment on the
adequacy of this equation for predicting y. Let α = .05.
SUMMARY OUTPUT

Regression Statistics
Multiple R             0.5095
R Square               0.2596
Adjusted R Square      0.1362
Standard Error         2.0745
Observations           8

ANOVA
              df        SS        MS        F       Significance F
Regression     1      9.0536    9.0536    2.1037       0.1971
Residual       6     25.8214    4.3036
Total          7     34.875

             Coefficients   Standard Error   t Stat    P-value
Intercept       2.7857          1.6164       1.7234    0.1356
x               0.4643          0.3201       1.4504    0.1971

b.
An estimated regression equation for the same data set (as in part a) of the form
ŷ = b0 + b1x + b2x² was developed. The Excel output is shown below. Comment on the
adequacy of this equation for predicting y. Let α = .05.
SUMMARY OUTPUT

Regression Statistics
Multiple R             0.9680
R Square               0.9370
Adjusted R Square      0.9118
Standard Error         0.6628
Observations           8

ANOVA
              df        SS        MS        F       Significance F
Regression     2     32.6786   16.3392   37.1951       0.0010
Residual       5      2.1964    0.4393
Total          7     34.875

             Coefficients   Standard Error   t Stat    P-value
Intercept      -2.8393          0.9247      -3.0706    0.0278
x               3.8393          0.4714       8.1437    0.0005
x-squared      -0.375           0.0511      -7.3335    0.0007

c.
ANS:
a. ŷ = 2.7857 + 0.4643x, r² = 0.2596. Only 25.96% of the variation in y is explained.
   P-value = 0.1971; no significant relationship exists. The model is not adequate for predicting y.
b. ŷ = -2.8393 + 3.8393x - 0.375x²
   r² = 0.937, which means 93.7% of the variation in y is explained by x and x² together. Both x and x²
   are significant (both p-values < 0.05). The p-value for the analysis of variance (Significance F) is
   0.0010, which is less than 0.05. Therefore, the model is adequate for predicting y.
c. 6.517
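The text of part (c) was not preserved above; assuming it asks for the point estimate of y at x = 4 (the value consistent with the stated answer of 6.517), the computation from the part (b) equation is:

# Point estimate from the estimated quadratic equation of part (b).
# x = 4 is an assumption; it reproduces the stated answer of about 6.517.
b0, b1, b2 = -2.8393, 3.8393, -0.375
x = 4
y_hat = b0 + b1 * x + b2 * x ** 2
print(round(y_hat, 3))   # about 6.518 (6.517 with unrounded coefficients)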
PTS: 1
3. We are interested in determining what type of model best describes the relationship between two
variables x and y. A sample of 6 observations of the dependent variable is shown below.

y:   1   6   9   7   4   3
SUMMARY OUTPUT

Regression Statistics
Multiple R             0.3052
R Square               0.0932
Adjusted R Square     -0.1335
Standard Error         3.0857
Observations           6

ANOVA
              df        SS        MS        F       Significance F
Regression     1      3.9130    3.9130    0.4110       0.5564
Residual       4     38.0870    9.5217
Total          5     42

             Coefficients   Standard Error   t Stat    P-value
Intercept       3.3043          2.9297       1.1279    0.3224
x               0.2609          0.4069       0.6411    0.5564

b.
An estimated regression equation of the form ŷ = b0 + b1x + b2x² was developed for the same data set. The
results are shown below. Comment on the adequacy of this equation for predicting y. Let α = .05.
SUMMARY OUTPUT

Regression Statistics
Multiple R             0.9508
R Square               0.9041
Adjusted R Square      0.8401
Standard Error         1.1588
Observations           6

ANOVA
              df        SS        MS        F       Significance F
Regression     2     37.9713   18.9856   14.1376       0.0297
Residual       3      4.0287    1.343
Total          5     42

             Coefficients   Standard Error   t Stat    P-value
Intercept      -2.6808          1.6196      -1.655     0.1964
x               3.6803          0.6960       5.2879    0.0132
x-squared      -0.3133          0.0622      -5.036     0.0151

c.
ANS:
a. Linear
b. ŷ = 3.3043 + 0.2609x
   r² = 0.0932
   Only 9.32% of the variation in y is explained.
   P-value = 0.5564; no significant relationship exists.
   The model is not adequate for predicting y.
c. Curvilinear
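A sketch of how Excel output like the two tables above can be reproduced in Python with statsmodels. The x values for this problem were not preserved, so the x column below is purely illustrative; with the actual data the statistics would match the output shown:

import numpy as np
import statsmodels.api as sm

y = np.array([1, 6, 9, 7, 4, 3])    # dependent variable from the problem
x = np.array([1, 2, 3, 4, 5, 6])    # hypothetical x values (illustrative only)

# Simple linear model and second-order (quadratic) model.
linear = sm.OLS(y, sm.add_constant(x)).fit()
quadratic = sm.OLS(y, sm.add_constant(np.column_stack([x, x ** 2]))).fit()

# Adequacy is judged from R Square and Significance F (the overall F test p-value).
print(linear.rsquared, linear.f_pvalue)
print(quadratic.rsquared, quadratic.f_pvalue)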
PTS: 1
4. The following estimated regression equation has been developed for the relationship between y, the
dependent variable, and x, the independent variable.
The sample size for this regression model was 23, and SSR = 600 and SSE = 400.
a.
b.
ANS:
a. r² = 0.60
b. F = 15 > 3.49; reject H0; the relationship is significant.
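A short Python check of the two parts, assuming the estimated equation (not preserved above) contains two terms, which is the reading consistent with the critical F of 3.49 on (2, 20) degrees of freedom:

# Problem 4: n = 23, SSR = 600, SSE = 400; p = 2 terms is an assumption (see above).
n, p = 23, 2
ssr, sse = 600, 400

r_square = ssr / (ssr + sse)                 # 0.60 (part a)
f_stat = (ssr / p) / (sse / (n - p - 1))     # 15   (part b)
print(r_square, f_stat)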
PTS: 1
5. A data set consisting of 7 observations of a dependent variable y and two independent variables x1 and
x2 was used in a regression analysis. Using (x1) as the only independent variable, the following
function is provided.
ŷ = 0.408 + 1.338x1
The SSE for the above model is 39.535.
Using both x1 and x2 as independent variables yields the following function.
ŷ = 0.805 + 0.498x1 - 0.477x2
The SSE for this latter function is 1.015.
Use an F test and determine if x2 contributes significantly to the model. Let α = 0.05.
ANS:
F = 151.8 and F.05 = 7.71, since 151.8 > 7.71, reject Ho and conclude x2 contributes significantly to the
model. P-value < 0.005 (almost zero).
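A minimal Python sketch of the partial F test used in the answer (n = 7 observations; the full model has two independent variables):

# Partial F test for the contribution of x2.
n = 7
sse_reduced = 39.535    # model with x1 only
sse_full = 1.015        # model with x1 and x2
added = 1               # number of added variables
df_full = n - 2 - 1     # error degrees of freedom of the full model = 4

f_stat = ((sse_reduced - sse_full) / added) / (sse_full / df_full)
print(round(f_stat, 1))   # about 151.8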
PTS: 1
6. Monthly total production costs and the number of units produced at a local company over a period of
10 months are shown below.
Month:   1   2   3   4   5   6   7   8   9   10

a.
b. best describes the relationship between X and Y. Estimate the parameters of this curvilinear
   regression equation.

ANS:
a.
b. b0 = -0.496
   b1 = 0.10116

PTS: 1
TOP: Regression - Model Building
7.
Xi:   1   4   6   7   8

a. Draw a scatter diagram. Does the relationship between X and Y appear to be linear?
b. Assume the relationship between X and Y can best be given by

ANS:
a.
b. b0 = 1.253
   b1 = 0.131

PTS: 1
TOP: Regression - Model Building
8. Part of an Excel output relating Y (dependent variable) and 4 independent variables, X1 through X4, is
shown below.
Summary Output

Regression Statistics
Multiple R             ?
R Square               ?
Adjusted R Square      ?
Standard Error         72.6093
Observations           20

ANOVA
              df        SS              MS        F       Significance F
Regression     ?    422975.2376         ?         ?           0.0000
Residual       ?        ?               ?
Total          ?        ?

             Coefficients   Standard Error   t Stat    P-value
Intercept      -203.6125       100.2940         ?       0.0605
X1                0.6483         0.1110         ?       0.0000
X2                0.0190         0.0065         ?       0.0101
X3               40.4577         7.5940         ?       0.0001
X4               -0.1032        20.7823         ?       0.9961

a.
b.
ANS:
a.

Summary Output

Regression Statistics
Multiple R             0.9179
R Square               0.8425
Adjusted R Square      0.8005
Standard Error         72.6093
Observations           20

ANOVA
              df        SS              MS             F        Significance F
Regression     4    422975.2376    105743.8094     20.0572         0.0000
Residual      15     79081.7624      5272.1175
Total         19    502057.0000

             Coefficients   Standard Error    t Stat    P-value
Intercept      -203.6125       100.2940       -2.0302    0.0605
X1                0.6483         0.1110        5.8386    0.0000
X2                0.0190         0.0065        2.9437    0.0101
X3               40.4577         7.5940        5.3276    0.0001
X4               -0.1032        20.7823       -0.0050    0.9961
b. X1 through X3 are significant, because their p-values are less than 0.05. X4 is not significant
(p-value = 0.9961 > 0.05).
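A Python sketch of how the blanks in part (a) are filled from what the output does show (n = 20, four independent variables, SSR, and the standard error of the estimate):

# Completing the Excel output from n, p, SSR, and the standard error of estimate.
n, p = 20, 4
s = 72.6093
ssr = 422975.2376

df_reg, df_err, df_tot = p, n - p - 1, n - 1          # 4, 15, 19
mse = s ** 2                                          # about 5272.1
sse = mse * df_err                                    # about 79081.8
sst = ssr + sse                                       # about 502057
f_stat = (ssr / df_reg) / mse                         # about 20.06
r_square = ssr / sst                                  # about 0.8425
multiple_r = r_square ** 0.5                          # about 0.9179
adj_r_square = 1 - (1 - r_square) * (n - 1) / df_err  # about 0.8005
t_x1 = 0.6483 / 0.1110                                # each t stat is coefficient / std error
print(f_stat, r_square, multiple_r, adj_r_square, t_x1)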
PTS: 1
9. In a regression analysis involving 20 observations and five independent variables, the following
information was obtained.
ANALYSIS OF VARIANCE

Source of Variation     Degrees of Freedom    Sum of Squares    Mean Squares       F
Regression                    _____?              _____?           _____?        _____?
Error (Residual)              _____?              _____?              30
Total                         _____?                990

ANS:

Source of Variation     Degrees of Freedom    Sum of Squares    Mean Squares       F
Regression                       5                  570              114          3.8
Error (Residual)                14                  420               30
Total                           19                  990
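A short Python check of the completed table, using the two values given in the problem (MSE = 30 and SST = 990) together with n = 20 and p = 5:

# Completing the ANOVA table from n, p, the error mean square, and the total sum of squares.
n, p = 20, 5
mse, sst = 30, 990

df_reg, df_err, df_tot = p, n - p - 1, n - 1   # 5, 14, 19
sse = mse * df_err                             # 420
ssr = sst - sse                                # 570
msr = ssr / df_reg                             # 114
f_stat = msr / mse                             # 3.8
print(sse, ssr, msr, f_stat)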
PTS: 1
10. A researcher is trying to decide whether or not to add another variable to his model. He has estimated
the following model from a sample of 28 observations.
SSE = 1,425
SSR = 1,326
He has also estimated the model with an additional variable X3. The results are
SSE = 1,300
SSR = 1,451
What advice would you give this researcher? Use a .05 level of significance.
ANS:
F = 2.308; p-value is between .05 and 0.1; do not reject H0; do not include X3 (critical F = 4.26)
PTS: 1
11. We want to test whether or not the addition of 3 variables to a model will be statistically significant.
You are given the following information based on a sample of 25 observations.
SSE = 725
SSR = 526
The equation was also estimated including the 3 variables. The results are
SSE = 520
SSR = 731
a.
b.
ANS:
a. H0: The coefficients of the three additional variables are all equal to zero
   Ha: At least one of the coefficients is not equal to zero
b. F = 2.497; p-value is between .05 and .1; do not reject H0 (critical F = 3.13)
PTS: 1
12. Multiple regression analysis was used to study the relationship between a dependent variable, Y, and
three independent variables X1, X2, and X3. The following is a partial result of the regression analysis
involving 20 observations.
             Coefficient    Standard Error
Intercept       20.00            5.00
X1              15.00            3.00
X2               8.00            5.00
X3             -18.00           10.00

Analysis of Variance
Source            DF        SS        MS
Regression                            80
Error                      320

a.
b.
c.
d.
e.
ANS:
a. 0.42857
b.
c.
d.
e.
PTS: 1
13. Multiple regression analysis was used to study the relationship between a dependent variable, Y, and
four independent variables: X1, X2, X3, and X4. The following is a partial result of the regression
analysis involving 31 observations.
             Coefficient    Standard Error
Intercept       18.00            6.00
X1              12.00            8.00
X2              24.00           48.00
X3             -36.00           36.00
X4              16.00            2.00

Analysis of Variance
Source            df        SS        MS
Regression                           125
Error
Total                      760

a.
b.
c.
d.
ANS:
a. 0.6579
b. t = 1.5; p-value is between 0.1 and 0.2; do not reject H0; not significant (critical t = 2.056)
c. t = 8; p-value < .01; reject H0; significant (critical t = 2.056)
d. F = 12.5; p-value < .01; reject H0; significant (critical F = 2.76)
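A Python sketch of these answers, assuming the two values given in the partial ANOVA table are the regression mean square (125) and the total sum of squares (760), the only reading consistent with the stated results:

# Problem 13: n = 31 observations, 4 independent variables.
n, p = 31, 4
msr, sst = 125, 760       # assumed to be Regression MS and Total SS (see above)

ssr = msr * p                    # 500
sse = sst - ssr                  # 260
mse = sse / (n - p - 1)          # 10
r_square = ssr / sst             # about 0.6579 (part a)
t_x1 = 12.00 / 8.00              # 1.5          (part b)
t_x4 = 16.00 / 2.00              # 8            (part c)
f_stat = msr / mse               # 12.5         (part d)
print(r_square, t_x1, t_x4, f_stat)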
PTS: 1
14. A regression model relating a dependent variable, Y, with one independent variable, X1, resulted in an
SSE of 400. Another regression model with the same dependent variable, Y, and two independent
variables, X1 and X2, resulted in an SSE of 320. At α = .05, determine if X2 contributed significantly to
the model. The sample size for both models was 20.
ANS:
F = 4.25; p-value is between .05 and .1; do not reject H0; X2 does not contribute to the model
significantly (critical F = 4.45)
PTS: 1
15. A regression model with one independent variable, X1, resulted in an SSE of 50. When a second
independent variable, X2, was added to the model, the SSE was reduced to 40. At α = 0.05, determine
if X2 contributes significantly to the model. The sample size for both models was 30.
ANS:
F = 6.75; p-value is between .01 and .025; reject H0; X2 contributes significantly (critical F = 4.21)
PTS: 1
16. When a regression model was developed relating sales (Y) of a company to its product's price (X1), the
SSE was determined to be 495. A second regression model relating sales (Y) to product's price (X1)
and competitor's product price (X2) resulted in an SSE of 396. At α = 0.05, determine if the
competitor's product's price contributed significantly to the model. The sample size for both models
was 33.
ANS:
F = 7.5; p-value is between .01 and .025; reject H0; X2 contributes significantly to the model (critical F
= 4.17)
PTS: 1
17. A regression model relating units sold (Y), price (X1), and whether or not promotion was used (X2 = 1
if promotion was used and 0 if it was not) resulted in the following model.
Sb1 = .01
Sb2 = 0.1
ANS:
a.
b.
t = -3; p-value is between .01 and .02; reject H0; significant (critical t = 2.179)
t = 7; p-value < .01; reject H0; significant (critical t = 2.179)
PTS: 1
18. A regression model relating the yearly income (Y), age (X1), and the gender of the faculty member of a
university (X2 = 1 if female and 0 if male) resulted in the following information.
n = 20    SSE = 500    SSR = 1,500
Sb1 = 0.2    Sb2 = 0.1
a.
b.
ANS:
a.
b.
PTS: 1
19. A regression analysis was applied in order to determine the relationship between a dependent variable
and 8 independent variables. The following information was obtained from the regression analysis.
R Square = 0.80
SSR = 4,280
Total number of observations n = 56
a.
b.
Source of Variation     Degrees of Freedom    Sum of Squares    Mean Squares       F
Regression                    _____?              _____?           _____?        _____?
Error                         _____?              _____?           _____?
Total                         _____?              _____?

ANS:
a.

Source of Variation     Degrees of Freedom    Sum of Squares    Mean Squares       F
Regression                       8                 4,280             535          23.50
Error (Residual)                47                 1,070             22.77
Total                           55                 5,350

b.
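A minimal Python sketch of how the blanks follow from R² = 0.80, SSR = 4,280, n = 56, and p = 8 independent variables:

# ANOVA blanks for problem 19, computed from R-square, SSR, n, and p.
r_square, ssr = 0.80, 4280
n, p = 56, 8

sst = ssr / r_square                             # 5,350
sse = sst - ssr                                  # 1,070
df_reg, df_err, df_tot = p, n - p - 1, n - 1     # 8, 47, 55
msr, mse = ssr / df_reg, sse / df_err            # 535 and about 22.77
f_stat = msr / mse                               # about 23.5
print(sst, sse, msr, round(mse, 2), round(f_stat, 2))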
PTS: 1
20. In a regression analysis involving 18 observations and four independent variables, the following
information was obtained.
Multiple R = 0.6000
R Square = 0.3600
Standard Error = 4.8000
Based on the above information, fill in all the blanks in the following ANOVA table.
ANALYSIS OF VARIANCE

Source of Variation     Degrees of Freedom    Sum of Squares    Mean Squares       F
Regression                    _____?              _____?           _____?        _____?
Error                         _____?              _____?           _____?

ANS:

Source of Variation     Degrees of Freedom    Sum of Squares    Mean Squares       F
Regression                       4                 168.48            42.12        1.828
Error                           13                 299.52            23.04

PTS: 1
21. The following are partial results of a regression analysis involving sales (Y in millions of dollars),
advertising expenditures (X1 in thousands of dollars), and number of salespeople (X2) for a
corporation. The regression was performed on a sample of 10 observations.
             Coefficient    Standard Error
Constant        50.00           20.00
X1               3.60            1.20
X2               0.20            0.20

a.
b.
ANS:
a.
b.
PTS: 1
22. A regression analysis was applied in order to determine the relationship between a dependent variable
and 4 independent variables. The following information was obtained from the regression analysis.
R Square = 0.80
SSR = 680
Total number of observations n = 45
a.
b.
Source of Variation     Degrees of Freedom    Sum of Squares    Mean Squares       F
Regression                    _____?              _____?           _____?        _____?
Error (Residual)              _____?              _____?           _____?
Total                         _____?              _____?

ANS:
a.

Source of Variation     Degrees of Freedom    Sum of Squares    Mean Squares       F
Regression                       4                  680             170.00         40
Error (Residual)                40                  170               4.25
Total                           44                  850
b. F = 40; p-value < .01 (almost zero); reject H 0; the model is significant (critical F = 2.61)
PTS: 1
23. A regression analysis (involving 45 observations) relating a dependent variable (Y) and two
independent variables resulted in the following information.
24. A computer manufacturer has developed a regression model relating Sales (Y in $10,000) with four
independent variables. The four independent variables are Price (in dollars), Competitor's Price (in
dollars), Advertising (in $1000) and Type of computer produced (Type = 0 if desktop, Type = 1 if
laptop). Part of the regression results are shown below.
ANOVA
              df        SS               MS
Regression     4    27641631.121    6910407.780
Residual      35    42277876.624    1207939.332

                        t Stat
Intercept
Price
Competitor's Price
Advertising
Type

a.
b.
c.
d.
e.
f.
ANS:
a. 40
b. R Square = 0.3953
c.
Variable                  t Stat
Price                     -2.540
Competitor's Price         3.058
Advertising                2.727
Type                       1.521

d.
Variable                  p-value
Price                     between .01 and .02
Competitor's Price        < .01
Advertising               < .01
Type                      between .1 and .2

e. Price, Competitor's Price, and Advertising are significant variables, because their p-values
are less than 0.05. Type is not significant; its p-value is greater than 0.05. (critical t = 2.030)
f. F = 5.721; p-value < 0.01; reject H0; the model is significant.
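A short Python check of parts (a), (b), and (f), computed from the ANOVA sums of squares shown in the output:

# Problem 24: observations, R-square, and the overall F test from the ANOVA block.
from scipy import stats

ssr, sse = 27641631.121, 42277876.624
df_reg, df_err = 4, 35

n = df_reg + df_err + 1                              # 40      (part a)
r_square = ssr / (ssr + sse)                         # 0.3953  (part b)
f_stat = (ssr / df_reg) / (sse / df_err)             # 5.721   (part f)
p_value = stats.f.sf(f_stat, df_reg, df_err)         # < 0.01
print(n, round(r_square, 4), round(f_stat, 3), p_value)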
PTS: 1
25. Thirty-four observations of a dependent variable (Y) and two independent variables resulted in an SSE
of 300. When a third independent variable was added to the model, the SSE was reduced to 250. At
95% confidence, determine whether or not the third independent variable contributes significantly to
the model.
ANS:
F = 6; p-value is between .01 and .025; reject H0; the added variable contributes significantly (critical
F = 4.17)
PTS: 1
26. Forty-eight observations of a dependent variable (Y) and five independent variables resulted in an SSE
of 438. When two additional independent variables were added to the model, the SSE was reduced to
375. At 95% confidence, determine whether or not the two additional independent variables contribute
significantly to the model.
ANS:
F = 3.36; p-value is between .025 and .05; reject H0; the two added variables contribute significantly
(critical F = 3.23)
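A minimal Python sketch of the partial F test for the two added variables (the full model has 5 + 2 = 7 independent variables and n = 48 observations):

# Partial F test for adding two variables to a five-variable model.
n = 48
sse_reduced, sse_full = 438, 375
q = 2                                   # number of added variables
df_full = n - 7 - 1                     # 40
f_stat = ((sse_reduced - sse_full) / q) / (sse_full / df_full)
print(round(f_stat, 2))                 # 3.36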
PTS: 1
27. A regression analysis was applied in order to determine the relationship between a dependent variable
and 4 independent variables. The following information was obtained from the regression analysis.
R Square = 0.60
SSR = 4,800
Total number of observations n = 35
a.
b.
Source of Variation     Degrees of Freedom    Sum of Squares    Mean Squares       F
Regression                    _____?              _____?           _____?        _____?
Error (Residual)              _____?              _____?           _____?
Total                         _____?              _____?
ANS:
a.

Source of Variation     Degrees of Freedom    Sum of Squares    Mean Squares       F
Regression                       4                 4,800          1,200.00       11.25
Error (Residual)                30                 3,200            106.67
Total                           34                 8,000

b. F = 11.25; p-value < .01; reject H0; the model is significant (critical F = 2.69)

PTS: 1
28. A regression analysis relating a company's sales, their advertising expenditure, price, and time resulted
in the following.
Regression Statistics
Multiple R             0.8800
R Square               0.7744
Adjusted R Square      0.7560
Standard Error         232.29
Observations           25

ANOVA
              df        SS              MS             F        Significance F
Regression     3    53184931.86    17728310.62      328.56         0.0000
Residual      21     1133108.30       53957.54
Total         24    54318040.16

                     Coefficients   Standard Error   t Stat    P-value
Intercept               927.23         1229.86        0.75      0.4593
Advertising (X1)          1.02            3.09        0.33      0.7450
Price (X2)               15.61            5.62        2.78      0.0112
Time (X3)               170.53           28.18        6.05      0.0000

a. At 95% confidence, determine whether or not the regression model is significant. Fully
   explain how you arrived at your conclusion (give numerical reasoning) and what your answer
   indicates.
b. At 95% confidence, determine which variables are significant and which are not. Explain how
   you arrived at your conclusion (give numerical reasoning).
c. Fully explain the meaning of R-square, which is given in this model. Be very specific and give
   a numerical explanation.
ANS:
a. Since Significance F = 0.0000 < α = 0.05, the model is significant.
b. The p-values for Price and Time are less than α = 0.05; therefore, those are the significant variables.
   Advertising is not significant (p-value = 0.7450 > 0.05).
c. R-Square = 0.7744, indicating that 77.44% of the variation in Sales is explained by the variation in
   Advertising, Price, and Time. The remaining 22.56% of the variation is unexplained.
PTS: 1
29. Ziba, Inc. has provided the following information regarding their sales for January through December
of 2009. (Part of the data file is shown below.)
Year 2009        Sales               Advertising        Time
                 (Y in $100,000)     (X1 in $10,000)    (X2)
January                .                   .              1
February               .                   .              2
.                      .                   .              .
.                      .                   .              .
November              36                  26             11
December              37                  28             12
The results of the regression analysis relating these variables are shown below.
                              Coefficients   Standard Error    t Stat     P-value
Intercept                        13.01           0.6334       20.5379     0.0000
Advertising (X1 in $10,000)       0.31           0.1293        2.3784     0.0413
Time (X2)                         1.39           0.2863        4.8390     0.0009

a. The company is planning to increase their advertising by 5% per month for the months of
   January and February of 2010. What would be the advertising for January and February of
   2010? Give your answers in dollars.
b. Use the regression model that is provided above and forecast sales for January and February of
   2010, assuming the company increases their advertising by 5% per month for the months of
   January and February of 2010. Show your computations and write your answers in dollars
   below.
ANS:
a. Advertising
   January 2010:  1.05 x $280,000 = $294,000
   February 2010: 1.05 x $294,000 = $308,700
b. Sales
   January 2010:  ŷ = 13.01 + 0.31(29.4) + 1.39(13) = 40.194, or $4,019,400
   February 2010: ŷ = 13.01 + 0.31(30.87) + 1.39(14) = 42.04, or $4,203,970

PTS: 1
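A small Python sketch of the forecasts, using the estimated equation above; the answers are stated in the units of the data (advertising in $10,000 and sales in $100,000):

# Problem 29: forecasting January and February 2010 sales.
# December 2009 is time period 12 with advertising = 28 (in $10,000).
def sales_forecast(advertising, time):
    return 13.01 + 0.31 * advertising + 1.39 * time   # in $100,000

adv_jan = 28 * 1.05            # 29.4   -> $294,000
adv_feb = adv_jan * 1.05       # 30.87  -> $308,700
jan = sales_forecast(adv_jan, 13)   # about 40.19 -> about $4.02 million
feb = sales_forecast(adv_feb, 14)   # about 42.04 -> about $4.20 million
print(adv_jan, adv_feb, jan, feb)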
TOP: Regression - Model Building