Linear regression

Continuous outcome (means)

Are the observations independent or correlated?

Outcome variable: Continuous (e.g. pain scale, cognitive function)

- Independent:
  - Pearson's correlation coefficient (linear correlation): shows linear relationship between two continuous variables
  - Linear regression: multivariate regression technique used when the outcome is continuous
- Correlated:
  - Repeated-measures ANOVA: compares changes over time in the means of two or more groups
  - Mixed models/GEE modeling: multivariate regression techniques to compare changes over time between two or more groups
- Alternatives if the normality assumption is violated (and small sample size): non-parametric statistics
  - Spearman rank correlation coefficient: non-parametric alternative to Pearson's correlation
  - Wilcoxon sign-rank test: non-parametric alternative to the ttest
Recall: Covariance

$$\mathrm{cov}(x,y)=\frac{\sum_{i=1}^{n}(x_i-\bar{x})(y_i-\bar{y})}{n-1}$$
Interpreting Covariance

- cov(X,Y) > 0: X and Y tend to move in the same direction
- cov(X,Y) < 0: X and Y are inversely related
- cov(X,Y) = 0: X and Y have no linear relationship
Correlation coefficient

$$r=\frac{\mathrm{covariance}(x,y)}{\sqrt{\mathrm{var}(x)\,\mathrm{var}(y)}}$$
Correlation

- Unit-less
- Ranges from r = -1 (perfect inverse linear relationship) through r = 0 (no linear relationship) to r = +1 (perfect positive linear relationship)
- Intermediate values such as r = -.6 or r = +.3 indicate weaker linear relationships

Slide from: Statistics for Managers Using Microsoft Excel 4th Edition, 2004 Prentice-Hall
Linear Correlation

[Scatterplots of Y vs. X contrasting: linear vs. curvilinear relationships; strong vs. weak relationships; and no relationship]
Calculating by hand

$$r=\frac{\mathrm{covariance}(x,y)}{\sqrt{\mathrm{var}(x)\,\mathrm{var}(y)}}=\frac{\dfrac{\sum_{i=1}^{n}(x_i-\bar{x})(y_i-\bar{y})}{n-1}}{\sqrt{\dfrac{\sum_{i=1}^{n}(x_i-\bar{x})^2}{n-1}\;\dfrac{\sum_{i=1}^{n}(y_i-\bar{y})^2}{n-1}}}$$

Simpler calculation formula (the n-1 terms cancel):

$$r=\frac{\sum_{i=1}^{n}(x_i-\bar{x})(y_i-\bar{y})}{\sqrt{\sum_{i=1}^{n}(x_i-\bar{x})^2\;\sum_{i=1}^{n}(y_i-\bar{y})^2}}=\frac{SS_{xy}}{\sqrt{SS_x\,SS_y}}$$

where $SS_{xy}$ is the numerator of the covariance, and $SS_x$ and $SS_y$ are the numerators of the variances.
Distribution of the correlation coefficient:

$$SE(r)=\sqrt{\frac{1-r^2}{n-2}}$$

The sample correlation coefficient follows a T-distribution with n-2 degrees of freedom (since you have to estimate the standard error).
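The by-hand recipe above can be sketched in a few lines of numpy. This is a minimal illustration on made-up paired measurements (the x and y values below are hypothetical, not from the lecture's data):

```python
import numpy as np

# Hypothetical paired continuous measurements
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 2.9, 4.2, 3.8, 5.5, 6.0])
n = len(x)

# Covariance: sum of cross-products of deviations, divided by n-1
cov_xy = np.sum((x - x.mean()) * (y - y.mean())) / (n - 1)

# Correlation: covariance scaled by the two standard deviations (unit-less)
r = cov_xy / (x.std(ddof=1) * y.std(ddof=1))

# Standard error of r, used in a t-test with n-2 degrees of freedom
se_r = np.sqrt((1 - r**2) / (n - 2))

print(cov_xy, r, se_r)
```

As a check, `np.corrcoef(x, y)[0, 1]` returns the same r, since it implements the same formula.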
Linear regression

In correlation, the two variables are treated as equals. In regression, one variable is considered the independent (=predictor) variable (X) and the other the dependent (=outcome) variable (Y).
What is "Linear"?

Remember this: Y = mX + B?

What's slope? A slope of m = 2 means that every 1-unit change in X yields a 2-unit change in Y.
Prediction

If you know something about X, this knowledge helps you predict something about Y. (Sound familiar? Think conditional probabilities.)
Regression equation

Expected value of y at a given level of x:

$$E(y_i \mid x_i)=\alpha+\beta x_i$$

Predicted value for an individual:

$$y_i=\alpha+\beta x_i+\text{random error}_i$$

The fixed part, $\alpha+\beta x_i$, lies exactly on the line; the random error follows a normal distribution, with spread $s_{y/x}$ around the line.
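The model above can be simulated directly, which makes the "fixed part plus normal error" structure concrete. This is a sketch with hypothetical parameter values (alpha, beta, and sigma below are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical true parameters: fixed part is alpha + beta*x,
# and each observation adds normally distributed random error
alpha, beta, sigma = 10.0, 2.0, 3.0
x = rng.uniform(0, 10, size=500)
y = alpha + beta * x + rng.normal(0, sigma, size=500)

# The average of y near a given level of x should track the fixed part:
band = (x > 4.5) & (x < 5.5)
print(y[band].mean())  # should be close to E(y|x=5) = 10 + 2*5 = 20
```

The point of the simulation: individual y values scatter around the line, but their conditional mean at a given x follows the fixed part of the equation.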
Regression Picture

[Scatterplot: each observation $y_i$ is decomposed relative to the naïve mean $\bar{y}$ and the fitted line $\hat{y}_i=\alpha+\beta x_i$]

$$\sum_{i=1}^{n}(y_i-\bar{y})^2=\sum_{i=1}^{n}(\hat{y}_i-\bar{y})^2+\sum_{i=1}^{n}(y_i-\hat{y}_i)^2$$

SS_total = SS_reg + SS_residual

- SS_total: total squared distance of observations from the naïve mean of y (total variation)
- SS_reg: variability due to x (regression)
- SS_residual: additional variability not explained by x; what the least squares method aims to minimize

R² = SS_reg/SS_total
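The sums-of-squares decomposition can be verified numerically. A minimal sketch on simulated data (the data-generating values are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 100)
y = 3.0 + 0.5 * x + rng.normal(0, 1, 100)  # hypothetical example data

beta, alpha = np.polyfit(x, y, 1)  # least-squares slope and intercept
y_hat = alpha + beta * x

ss_total = np.sum((y - y.mean()) ** 2)    # total variation around the mean
ss_reg = np.sum((y_hat - y.mean()) ** 2)  # variability due to x (regression)
ss_res = np.sum((y - y_hat) ** 2)         # left-over unexplained variability

r_squared = ss_reg / ss_total
print(r_squared)
```

For simple linear regression the decomposition is exact (SS_total = SS_reg + SS_residual), and this R² equals the squared Pearson correlation between x and y.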
1. Lee DM, Tajar A, Ulubaev A, et al. Association between 25-hydroxyvitamin D levels and cognitive performance
in middle-aged and older European men. J Neurol Neurosurg Psychiatry. 2009 Jul;80(7):722-9.
Distribution of vitamin D: mean = 63 nmol/L, standard deviation = 33 nmol/L.

Distribution of DSST: normally distributed, mean = 28 points, standard deviation = 10 points.

Four simulated datasets, with slopes of 0, 0.5, 1.0, and 1.5 points per 10 nmol/L:

- Dataset 1: no relationship (slope = 0)
- Dataset 2: weak relationship. Regression equation: E(Yi) = 26 + 0.5*vit Di (in 10 nmol/L)
- Dataset 3: weak to moderate relationship. Regression equation: E(Yi) = 22 + 1.0*vit Di (in 10 nmol/L)
- Dataset 4: moderate relationship. Regression equation: E(Yi) = 20 + 1.5*vit Di (in 10 nmol/L)
What's the constraint? We are trying to minimize the squared distance (hence the "least squares") between the observations themselves and the predicted values (the differences are also called the residuals, or left-over unexplained variability):

$$\text{Difference}_i = y_i-(\alpha+\beta x_i)$$

$$\text{Difference}_i^2 = (y_i-(\alpha+\beta x_i))^2$$

Find the $\alpha$ and $\beta$ that give the minimum sum of the squared differences. How do you minimize a function? Take the derivative; set it equal to zero; and solve. Typical max/min problem from calculus.

$$\frac{d}{d\beta}\sum_{i=1}^{n}(y_i-(\alpha+\beta x_i))^2
=\sum_{i=1}^{n}2(y_i-\alpha-\beta x_i)(-x_i)
=-2\sum_{i=1}^{n}(x_i y_i-\alpha x_i-\beta x_i^2)=0\;\ldots$$
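The closed-form solution that falls out of this calculus can be checked against a library fit. A minimal sketch on simulated data (the true parameter values below are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(0, 10, 200)
y = 5.0 + 1.2 * x + rng.normal(0, 2, 200)  # hypothetical example data

# Closed-form least-squares estimates from setting the derivative to zero:
# slope = Cov(x,y)/Var(x), intercept = ybar - slope*xbar
beta_hat = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
alpha_hat = y.mean() - beta_hat * x.mean()

# np.polyfit minimizes the same sum of squared differences
slope, intercept = np.polyfit(x, y, 1)
print(beta_hat, alpha_hat)
```

Both approaches minimize the identical objective, so the estimates agree to machine precision.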
Resulting formulas

Slope (beta coefficient):

$$\hat{\beta}=\frac{\mathrm{Cov}(x,y)}{\mathrm{Var}(x)}$$

Intercept: calculate $\hat{\alpha}=\bar{y}-\hat{\beta}\bar{x}$. The regression line always goes through the point $(\bar{x},\bar{y})$.

Relationship with correlation:

$$\hat{\beta}=r\,\frac{SD_y}{SD_x}$$
Example: dataset 4

- SDx = 33 nmol/L
- SDy = 10 points
- Cov(X,Y) = 163 points*nmol/L

$$r=\frac{SS_{xy}}{\sqrt{SS_x\,SS_y}}=\frac{163}{10\times 33}=0.49$$

Or, using the slope (1.5 points per 10 nmol/L = 0.15 points per nmol/L):

$$r=0.15\times\frac{33}{10}=0.49$$
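The dataset 4 arithmetic can be replayed directly from the three summary numbers on the slide, which also confirms that the slope and the correlation differ only by the ratio of standard deviations:

```python
# Summary numbers from the dataset 4 example (vitamin D vs. DSST)
cov_xy = 163.0       # points*nmol/L
sd_x, sd_y = 33.0, 10.0

r = cov_xy / (sd_x * sd_y)          # correlation from covariance
beta = cov_xy / sd_x**2             # slope = Cov(x,y)/Var(x), per nmol/L
r_from_beta = beta * (sd_x / sd_y)  # recover r from the slope

print(round(r, 2), round(beta, 2), round(r_from_beta, 2))  # 0.49 0.15 0.49
```

Both routes give r = 0.49, matching the slide; the slope of 0.15 points per nmol/L is the 1.5 points per 10 nmol/L quoted for dataset 4.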
Significance testing

Slope: the distribution of the estimated slope is $\hat{\beta}\sim T_{n-2}(\beta,\,s.e.(\hat{\beta}))$

- H0: β1 = 0 (no linear relationship)
- H1: β1 ≠ 0 (a linear relationship does exist)

$$T_{n-2}=\frac{\hat{\beta}-0}{s.e.(\hat{\beta})}$$

where

$$s.e.(\hat{\beta})=\frac{s_{y/x}}{\sqrt{SS_x}},\qquad
s_{y/x}^2=\frac{\sum_{i=1}^{n}(y_i-\hat{y}_i)^2}{n-2},\qquad
SS_x=\sum_{i=1}^{n}(x_i-\bar{x})^2,\qquad
\hat{y}_i=\hat{\alpha}+\hat{\beta}x_i$$
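The slope test above can be computed from scratch and compared against `scipy.stats.linregress`, which performs the same t-test. A minimal sketch on hypothetical simulated data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
x = rng.uniform(0, 10, 30)
y = 2.0 + 0.8 * x + rng.normal(0, 2, 30)  # hypothetical example data
n = len(x)

beta, alpha = np.polyfit(x, y, 1)
y_hat = alpha + beta * x

# s_{y/x}: residual standard deviation, with n-2 degrees of freedom
s_yx = np.sqrt(np.sum((y - y_hat) ** 2) / (n - 2))
ss_x = np.sum((x - x.mean()) ** 2)

se_beta = s_yx / np.sqrt(ss_x)          # standard error of the slope
t = beta / se_beta                      # T statistic under H0: beta = 0
p = 2 * stats.t.sf(abs(t), df=n - 2)    # two-sided p-value, n-2 df

res = stats.linregress(x, y)            # same test, library version
print(t, p)
```

The hand-computed t and p agree with `res.slope / res.stderr` and `res.pvalue`, confirming the formulas.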
Example: dataset 4

Predicted values: $\hat{y}_i=20+1.5x_i$

For vitamin D = 95 nmol/L (or 9.5 in 10 nmol/L): $\hat{y}_i=20+1.5(9.5)\approx 34$

Residual = observed - predicted. At X = 95 nmol/L, if the observed value is $y_i=48$ and the predicted value is $\hat{y}_i=34$, then the residual is $y_i-\hat{y}_i=14$.
Checking assumptions with residual plots:

[Residual plots vs. x: "Not Linear" (patterned residuals) vs. "Linear" (patternless residuals); "Non-constant variance" (residuals fan out) vs. "Constant variance" (even band); residuals should also appear "Independent"]
Multiple linear regression

[Different 3D view: the regression plane]

$$E(y)=\alpha+\beta_1 X+\beta_2 W+\beta_3 Z$$

Each regression coefficient is the amount of change in the outcome variable that would be expected per one-unit change of the predictor, if all other variables in the model were held constant.
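The "holding all other variables constant" interpretation can be seen in a fit of the three-predictor equation above. A minimal sketch with hypothetical coefficients, using ordinary least squares via `np.linalg.lstsq`:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 300
X, W, Z = rng.normal(size=(3, n))

# Hypothetical true model: E(y) = 1.0 + 2.0*X - 0.5*W + 3.0*Z
y = 1.0 + 2.0 * X - 0.5 * W + 3.0 * Z + rng.normal(0, 0.5, n)

# Design matrix with an intercept column; lstsq gives the least-squares fit
A = np.column_stack([np.ones(n), X, W, Z])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print(np.round(coef, 2))  # close to [1.0, 2.0, -0.5, 3.0]
```

Each fitted coefficient recovers the per-unit effect of its own predictor with the others held fixed, because the least-squares solution adjusts each estimate for the other columns of the design matrix.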
Functions of multivariate analysis:

Recall the two-sample t-test comparing mean DSST between the vitamin D groups (means 40 and 32.5; SD 10.8; n = 54 and 46):

$$t=\frac{40-32.5}{\sqrt{\dfrac{10.8^2}{54}+\dfrac{10.8^2}{46}}}=\frac{7.5}{2.17}=3.46;\quad p=.0008$$
As a linear regression

- Intercept represents the mean value in the sufficient group.
- Slope represents the difference in means between the groups. The difference is significant.

Variable    Parameter Estimate   Standard Error   t Value   Pr > |t|
Intercept   40.07407             1.47511          27.17     <.0001
insuff      -7.53060             2.17493          -3.46     0.0008
ANOVA is linear regression!

[The picture: dummy variables compare Sufficient vs. Insufficient and Sufficient vs. Deficient]
Results

Parameter Estimates

Variable       DF   Parameter Estimate   Standard Error   t Value   Pr > |t|
Intercept      1    40.07407             1.47817          27.11     <.0001
deficient      1    -9.87407             3.73950          -2.64     0.0096
insufficient   1    -6.87963             2.33719          -2.94     0.0041
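"ANOVA is linear regression" can be demonstrated with dummy-coded groups: the intercept recovers the reference-group mean and each coefficient recovers a difference in means. A minimal sketch with hypothetical group data (the group sizes, means, and SDs below are invented):

```python
import numpy as np

# Hypothetical data: three vitamin D groups with different mean DSST scores
rng = np.random.default_rng(5)
sufficient = rng.normal(40, 10, 50)
insufficient = rng.normal(33, 10, 40)
deficient = rng.normal(30, 10, 20)

y = np.concatenate([sufficient, insufficient, deficient])
n = len(y)

# Dummy variables with "sufficient" as the reference group
d_insuff = np.r_[np.zeros(50), np.ones(40), np.zeros(20)]
d_def = np.r_[np.zeros(50), np.zeros(40), np.ones(20)]

A = np.column_stack([np.ones(n), d_insuff, d_def])
intercept, b_insuff, b_def = np.linalg.lstsq(A, y, rcond=None)[0]

# Intercept = mean of the reference group; slopes = differences in means
print(intercept, b_insuff, b_def)
```

With dummy coding these identities are exact, which is why the regression output tables above reproduce the group means and mean differences of an ANOVA.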
Interpretation: the intercept is the mean outcome in the reference (sufficient) group; each coefficient is the difference in means relative to that group.

Choosing the appropriate multivariate regression model by outcome (dependent) variable:

Outcome (dependent variable)            Appropriate multivariate regression model
Continuous (e.g. blood pressure)        Linear regression
Binary (high blood pressure, yes/no)    Logistic regression
Time-to-event (e.g. time to death)      Cox regression

Example equation (Cox regression):

ln (rate of death) = α + β_salt*salt consumption (tsp/day) + β_age*age (years) + β_smoker*ever smoker (yes=1/no=0)
Multivariate regression pitfalls

- Multi-collinearity
- Residual confounding
- Overfitting

Residual confounding

Sinha R, Cross AJ, Graubard BI, Leitzmann MF, Schatzkin A. Meat intake and mortality: a prospective study of over half a million people. Arch Intern Med

Mortality risks

[Figure: mortality risks by meat intake, from Sinha et al.]
Overfitting

Parameter   Estimate   Standard Error   Type II SS   F Value   Pr > F
Intercept   11.80175   2.98341          11.96067     15.65     0.0019
exercise    -0.29106   0.09798          6.74569      8.83      0.0117
sleep       -1.91592   0.39494          17.98818     23.53     0.0004
obama       1.73993    0.24352          39.01944     51.05     <.0001
Clinton     -0.83128   0.17066          18.13489     23.73     0.0004
mathLove    0.45653    0.10668          13.99925     18.32     0.0011
Exercise, sleep, and high ratings for Clinton are negatively related to optimism (highly significant!), and high ratings for Obama and high love of math are positively related to optimism (highly significant!).

If something seems too good to be true...
Clinton, univariate:

Variable    Label       DF   Parameter Estimate   Standard Error   t Value   Pr > |t|
Intercept   Intercept   1    5.43688              2.13476          2.55      0.0188
Clinton     Clinton     1    0.24973              0.27111          0.92      0.3675
Sleep, univariate:

Variable    Label       DF   Parameter Estimate   Standard Error   t Value   Pr > |t|
Intercept   Intercept   1    8.30817              4.36984          1.90      0.0711
sleep       sleep       1    -0.14484             0.65451          -0.22     0.8270
Exercise, univariate:

Variable    Label       DF   Parameter Estimate   Standard Error   t Value   Pr > |t|
Intercept   Intercept   1    6.65189              0.89153          7.46      <.0001
exercise    exercise    1    0.19161              0.20709          0.93      0.3658
Obama, univariate:

Variable    Label       DF   Parameter Estimate   Standard Error   t Value   Pr > |t|
Intercept   Intercept   1    0.82107              2.43137          0.34      0.7389
obama       obama       1    0.87276              0.31973          2.73      0.0126
mathLove, univariate:

Variable    Label       DF   Parameter Estimate   Standard Error   t Value   Pr > |t|
Intercept   Intercept   1    3.70270              1.25302          2.96      0.0076
mathLove    mathLove    1    0.59459              0.19225          3.09      0.0055

Compare with the multivariate results: p<.0001 (obama) and p=.0011 (mathLove).
Overfitting

Rule of thumb: you need at least 10 subjects for each additional predictor variable in the multivariate regression model.
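The danger the rule of thumb guards against can be shown with a simulation: fit far too many predictors for the sample size, on pure noise, and the model still appears to explain the outcome. This sketch uses made-up dimensions (20 subjects, 15 noise predictors, all hypothetical):

```python
import numpy as np

# 20 subjects and 15 random predictors: far beyond the
# 10-subjects-per-predictor rule of thumb
rng = np.random.default_rng(6)
n, k = 20, 15
X = rng.normal(size=(n, k))
y = rng.normal(size=n)  # outcome unrelated to every predictor

A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
y_hat = A @ coef

# R^2 is spuriously high even though there are no real relationships
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
print(r2)
```

With this many parameters the expected R² on noise is roughly k/(n-1), i.e. most of the "variance explained" is overfitting, which is exactly the pattern in the optimism example above.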
Statistical procedure or measure of association, by predictor and outcome variable:

Predictor variable(s)                       Outcome variable          Statistical procedure or measure of association
Binary                                      Continuous                T-test
Binary                                      Ranks/ordinal             Wilcoxon rank-sum test
Categorical                                 Continuous                ANOVA
Continuous                                  Continuous                Simple linear regression
Binary                                      Binary                    Risk ratio (cross-sectional/case-control studies)
Multivariate (categorical and continuous)   Binary                    Logistic regression
Binary                                      Time-to-event             Kaplan-Meier/log-rank test
Multivariate                                Time-to-event             Cox-proportional hazards regression, hazard ratio
Categorical                                 Continuous (correlated)   Repeated measures ANOVA
Multivariate                                Continuous (correlated)   Mixed models; GEE modeling
Alternative summary: statistics for various types of outcome

Are the observations independent or correlated?

Outcome variable: Continuous
- Independent: Ttest; ANOVA; Linear correlation; Linear regression
- Correlated: Paired ttest; Repeated-measures ANOVA; Mixed models/GEE modeling
- Assumptions: outcome is normally distributed (important for small samples); outcome and predictor have a linear relationship

Outcome variable: Binary or categorical (e.g. fracture, yes/no)
- Independent: Difference in proportions; Relative risks; Chi-square test
- Correlated: McNemar's test; Conditional logistic regression; GEE modeling
- Assumptions: chi-square test assumes sufficient numbers in each cell (>=5)
Continuous outcome (means); HRP 259/HRP 262

Are the observations independent or correlated?

Outcome variable: Continuous (e.g. pain scale, cognitive function)

- Independent:
  - Pearson's correlation coefficient (linear correlation): shows linear relationship between two continuous variables
  - Linear regression: multivariate regression technique used when the outcome is continuous
- Correlated:
  - Repeated-measures ANOVA: compares changes over time in the means of two or more groups
  - Mixed models/GEE modeling: multivariate regression techniques to compare changes over time between two or more groups
- Alternatives if the normality assumption is violated (and small sample size): non-parametric statistics
  - Spearman rank correlation coefficient: non-parametric alternative to Pearson's correlation
  - Wilcoxon sign-rank test: non-parametric alternative to the ttest
Binary or categorical outcomes (proportions); HRP 259/HRP 261

Are the observations correlated?

Outcome variable: Binary or categorical (e.g. fracture, yes/no)

- Independent:
  - Chi-square test: compares proportions between two or more groups
  - Logistic regression: multivariate technique used when outcome is binary; gives multivariate-adjusted odds ratios
- Correlated:
  - McNemar's chi-square test: compares binary outcomes between correlated observations
  - Conditional logistic regression: multivariate regression technique for correlated binary outcomes
- Alternative to the chi-square test if sparse cells: e.g. Fisher's exact test
Time-to-event outcome (survival data); HRP 262

Are the observation groups independent or correlated?

Outcome variable: Time-to-event (e.g., time to fracture)

- Independent: Cox regression: multivariate technique for time-to-event data; gives multivariate-adjusted hazard ratios
- Correlated: n/a (the outcome is already measured over time)
- Modifications to Cox regression if the proportional-hazards assumption is violated: time-dependent predictors or time-dependent hazard ratios (tricky!)