
1. Consider the following regression model: log(y) = β0 + β1x1 + β2x1² + β3x3 + u.

This model will suffer from functional form misspecification if _____.
a. β0 is omitted from the model
b. u is heteroskedastic
c. x1² is omitted from the model
d. x3 is a binary variable
Answer: c
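The point in question 1 is easy to see by simulation: if the data are generated with the x1² term but that term is left out of the estimating equation, the omitted curvature shows up in the residuals. The sketch below is illustrative only, with made-up parameter values and simulated data (numpy and statsmodels).

# Illustrative simulation (hypothetical parameters): omitting x1^2 from a
# model that truly contains it leaves curvature in the residuals.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
x1 = rng.normal(size=n)
x3 = rng.binomial(1, 0.5, size=n)
u = rng.normal(scale=0.3, size=n)
logy = 1.0 + 0.5 * x1 + 0.8 * x1**2 + 0.4 * x3 + u    # true model contains x1^2

X_wrong = sm.add_constant(np.column_stack([x1, x3]))   # x1^2 omitted
res = sm.OLS(logy, X_wrong).fit()

# The residuals are systematically related to x1^2, which is the hallmark of
# functional form misspecification (a binary x3, by contrast, causes no such
# problem, which is why answer d is not correct).
print(np.corrcoef(res.resid, x1**2)[0, 1])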

2. A regression model suffers from functional form misspecification if _____.
a. a key variable is binary.
b. the dependent variable is binary.
c. an interaction term is omitted.
d. the coefficient of a key variable is zero.
Answer: c

3. Which of the following is true of the Regression Specification Error Test (RESET)?
a. It tests if the functional form of a regression model is misspecified.
b. It detects the presence of dummy variables in a regression model.
c. It helps in the detection of heteroskedasticity when the functional form of the model is
correctly specified.
d. It helps in the detection of multicollinearity among the independent variables in a
regression model.
Answer: a
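RESET makes the misspecification idea operational: re-estimate the model with powers of the OLS fitted values (typically ŷ² and ŷ³) added as extra regressors and test their joint significance with an F test; rejection signals functional form misspecification. Below is a minimal hand-rolled sketch of that procedure (recent versions of statsmodels also ship a ready-made RESET in their diagnostics module); the commented call reuses the simulated data from the note after question 1.

# Hand-rolled RESET sketch: add powers of the fitted values (y_hat^2, y_hat^3)
# to the original regressors and F-test their joint significance.
import numpy as np
import statsmodels.api as sm

def reset_test(y, X):
    """X must already contain a constant column."""
    restricted = sm.OLS(y, X).fit()
    yhat = restricted.fittedvalues
    X_aug = np.column_stack([X, yhat**2, yhat**3])
    unrestricted = sm.OLS(y, X_aug).fit()
    f_stat, p_value, df_diff = unrestricted.compare_f_test(restricted)
    return f_stat, p_value

# Applied to the simulated data above, the test rejects and flags the
# omitted x1**2 term:
# f, p = reset_test(logy, X_wrong)   # p is essentially zero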

4. A proxy variable _____.
a. increases the error variance of a regression model
b. cannot contain binary information
c. is used when data on a key independent variable is unavailable
d. is detected by running the Davidson-MacKinnon test
Answer: c
5. Which of the following is a drawback of including proxy variables in a regression model?
a. It leads to misspecification analysis.
b. It reduces the error variance.
c. It increases the error variance.
d. It exacerbates multicollinearity.
Answer: d
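Questions 4 and 5 are the textbook IQ-for-ability case: the proxy is brought in because the key variable (ability) is unobservable, and the cost is that the proxy is correlated with the other regressors. A hedged sketch with simulated data follows; all variable names and numbers are hypothetical.

# Hedged sketch: IQ as a proxy for unobserved ability in a log(wage) equation.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 2000
abil = rng.normal(size=n)                              # unobserved ability
educ = 12 + 2 * abil + rng.normal(size=n)              # ability raises schooling
exper = rng.uniform(0, 20, size=n)
IQ = 100 + 15 * abil + rng.normal(scale=10, size=n)    # noisy proxy for ability
lwage = 1.5 + 0.06 * educ + 0.01 * exper + 0.10 * abil + rng.normal(scale=0.3, size=n)
df = pd.DataFrame(dict(lwage=lwage, educ=educ, exper=exper, IQ=IQ))

no_proxy   = smf.ols("lwage ~ educ + exper", data=df).fit()
with_proxy = smf.ols("lwage ~ educ + exper + IQ", data=df).fit()

# The return to education falls toward its true value (0.06) once the proxy
# is included, but educ and IQ are strongly correlated, which is the
# multicollinearity drawback asked about in question 5.
print(no_proxy.params["educ"], with_proxy.params["educ"])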

6. The classical errors-in-variables (CEV) assumption is that _____.
a. the error term in a regression model is correlated with all observed explanatory variables
b. the error term in a regression model is uncorrelated with all observed explanatory variables

c. the measurement error is correlated with the unobserved explanatory variable
d. the measurement error is uncorrelated with the unobserved explanatory variable
Answer: d

7. Which of the following is true of measurement error?
a. If measurement error in a dependent variable is correlated with the independent variables, the ordinary least squares estimators are unbiased.
b. If measurement error in a dependent variable has zero mean, the ordinary least squares estimators are unbiased.
c. If measurement error in an independent variable is uncorrelated with the variable, the ordinary least squares estimators for the intercept are biased and inconsistent.
d. If measurement error in an independent variable is uncorrelated with other independent variables, all estimators are biased.
Answer: b

8. Which of the following is a difference between least absolute deviations (LAD) and ordinary least squares (OLS) estimation?
a. OLS is more sensitive to outlying observations than LAD.
b. OLS is designed to estimate the conditional median of the dependent variable while LAD is designed to estimate the conditional mean.
c. OLS is justified for very large sample sizes while LAD is justified for smaller sample sizes.
Answer: a

9. The Least Absolute Deviations (LAD) estimators in a linear model minimize the sum of squared residuals.
a. true
b. false
Answer: b
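Two of the claims above are easy to verify numerically: under the CEV setup, measurement error in a regressor attenuates the OLS slope toward zero, and OLS reacts far more to a single outlier than LAD, which statsmodels estimates as median regression (QuantReg with q = 0.5). The simulation below is illustrative only, with made-up parameters.

# Illustrative simulation: (1) attenuation bias under classical
# errors-in-variables; (2) OLS vs. LAD sensitivity to an outlier.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 1000

# (1) CEV: we observe x = x_star + e1, with e1 uncorrelated with x_star.
x_star = rng.normal(size=n)
y = 2.0 + 1.0 * x_star + rng.normal(scale=0.5, size=n)   # true slope = 1
x_obs = x_star + rng.normal(scale=1.0, size=n)           # measurement error

slope_cev = sm.OLS(y, sm.add_constant(x_obs)).fit().params[1]
print(slope_cev)   # roughly Var(x*)/(Var(x*)+Var(e1)) = 0.5: biased toward zero

# (2) OLS vs. LAD (median regression) with one gross outlier.
x = rng.normal(size=50)
y2 = 1.0 + 1.0 * x + rng.normal(scale=0.2, size=50)
y2[0] += 25.0                                             # a single gross outlier
X = sm.add_constant(x)
ols_slope = sm.OLS(y2, X).fit().params[1]
lad_slope = sm.QuantReg(y2, X).fit(q=0.5).params[1]
print(ols_slope, lad_slope)   # the OLS estimates move much more than the LAD estimates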

11. Answer: (i) For the CEV assumptions to hold, we must be able to write tvhours = tvhours* + e0, where the measurement error e0 has zero mean and is uncorrelated with tvhours* and each explanatory variable in the equation. (Note that for OLS to consistently estimate the parameters we do not need e0 to be uncorrelated with tvhours*.)

(ii) The CEV assumptions are unlikely to hold in this example. For children who do not watch TV at all, tvhours* = 0, and it is very likely that reported TV hours is zero. So if tvhours* = 0, then e0 = 0 with high probability. If tvhours* > 0, the measurement error can be positive or negative, but, since tvhours ≥ 0, e0 must satisfy e0 ≥ −tvhours*. So e0 and tvhours* are likely to be correlated. As mentioned in part (i), because it is the dependent variable that is measured with error, what is important is that e0 is uncorrelated with the explanatory variables. But this is unlikely to be the case, because tvhours* depends directly on the explanatory variables. Or, we might argue directly that more highly educated parents tend to underreport how much television their children watch, which means e0 and the education variables are negatively correlated.
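A toy simulation (hypothetical numbers, not the author's data) makes the part (ii) argument concrete: because reported hours cannot be negative, e0 = tvhours − tvhours* is forced to satisfy e0 ≥ −tvhours*, so it ends up correlated with tvhours* and with anything tvhours* depends on, such as parents' education.

# Toy illustration of part (ii): the non-negativity of reported TV hours
# mechanically ties the reporting error e0 to tvhours*, and hence to the
# regressors that tvhours* depends on.
import numpy as np

rng = np.random.default_rng(2)
n = 5000
educ = rng.normal(12, 2, size=n)                                  # parents' education (hypothetical)
tvhours_star = np.maximum(0, 4 - 0.3 * educ + rng.normal(scale=1.5, size=n))
tvhours = np.maximum(0, tvhours_star + rng.normal(scale=1.0, size=n))   # reports cannot be negative
e0 = tvhours - tvhours_star                                        # so e0 >= -tvhours*

print(np.corrcoef(e0, tvhours_star)[0, 1])   # clearly nonzero: the CEV-style assumption fails
print(np.corrcoef(e0, educ)[0, 1])           # nonzero as well: e0 is related to a regressor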

Answer: (i) Eligibility for the federally funded school lunch program is very tightly linked to being economically disadvantaged. Therefore, the percentage of students eligible for the lunch program is very similar to the percentage of students living in poverty.

(ii) We can use our usual reasoning on omitting important variables from a regression equation. The variables log(expend) and lnchprg are negatively correlated: school districts with poorer children spend, on average, less on schools. Further, β3 < 0. Therefore, from Table 3.2, omitting lnchprg (the proxy for poverty) from the regression produces an upward biased estimator of β1 [ignoring the presence of log(enroll) in the model]. So when we control for the poverty rate, the effect of spending falls.

(iii) Once we control for lnchprg, the coefficient on log(enroll) becomes negative and has a t statistic of about −2.17, which is significant at the 5% level against a two-sided alternative. The coefficient implies that Δmath10 ≈ −(1.26/100)(%Δenroll) = −0.0126(%Δenroll). Therefore, a 10% increase in enrollment leads to a drop in math10 of about 0.126 percentage points.

(iv) Both math10 and lnchprg are percentages. Therefore, a ten percentage point increase in lnchprg leads to about a 3.23 percentage point fall in math10, a sizeable effect.

(v) In column (1) we are explaining very little of the variation in pass rates on the MEAP math test: less than 3%. In column (2), we are explaining almost 19% (which still leaves much variation unexplained). Clearly, most of the variation in math10 is explained by variation in lnchprg. This is a common finding in studies of school performance: family income (or related factors, such as living in poverty) is much more important in explaining student performance than spending per student or other school characteristics.
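The effect sizes in parts (iii) and (iv) follow mechanically from the two slope coefficients reported in the text, roughly −1.26 on log(enroll) and about −0.323 on lnchprg (the latter implied by the 3.23-point figure); the arithmetic is reproduced below as a quick check.

# Reproducing the effect sizes quoted in parts (iii) and (iv).
# Coefficient values are those reported or implied in the text above.
b_lenroll = -1.26    # coefficient on log(enroll) in column (2)
b_lnchprg = -0.323   # coefficient on lnchprg implied by the 3.23-per-10-points figure

# (iii) A 10% increase in enrollment changes log(enroll) by about 0.10,
# so math10 changes by roughly b_lenroll * 0.10 percentage points:
print(b_lenroll * 0.10)    # about -0.126

# (iv) lnchprg is itself measured in percentage points, so a 10-point
# increase changes math10 by b_lnchprg * 10:
print(b_lnchprg * 10)      # about -3.23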