
Forecasting Methods

Chapter 18
Time Series Analysis

Time Series and Time Series Methods
Components of a Time Series
Smoothing Methods
Trend Projection
Trend and Seasonal Components
Regression Analysis
ARIMA Model
Qualitative Approaches to Forecasting

Slide 1

Time Series and Time Series Methods

(Overview diagram: Forecasting Methods split into Quantitative and
Qualitative approaches. Quantitative methods split into Time Series
methods, namely Smoothing (used when there is no trend, cyclical, or
seasonal component), Trend Projection, and Trend Projection Adjusted
for Seasonal Influence, and Causal methods, namely Regression Analysis.)

Slide 2

Time Series Methods

By reviewing historical data over time, we can better


understand the pattern of past behavior of a variable
and better predict the future behavior.
A time series is a set of observations on a variable
measured over successive points in time or over
successive periods of time.
The objective of time series methods is to discover a
pattern in the historical data and then extrapolate the
pattern into the future.
The forecast is based solely on past values of the
variable and/or past forecast errors.

Slide 3

Three time series methods are:


smoothing
trend projection
trend projection adjusted for seasonal influence
The pattern or behavior of the data in a time series
has several components.
Trend Component
Cyclical Component
Seasonal Component
Irregular Component

Slide 4

The Components of a Time Series

The Components of a Time Series

Trend Component
It represents a gradual shifting of a time series to
relatively higher or lower values over time.
Trend is usually the result of changes in the
population, demographics, technology, and/or
consumer preferences.

Cyclical Component
Any regular pattern of sequences of values above
and below the trend line lasting more than one
year can be attributed to the cyclical component.
Usually, this component is due to multiyear
cyclical movements in the economy.
For example, periods of moderate inflation followed
by periods of rapid inflation can lead to time
series that alternate below and above a generally
increasing trend line (e.g., a time series for
housing costs).

Slide 5

Slide 6

The Components of a Time Series

The Components of a Time Series

Trend and Cyclical Components of a Time Series


with Data Points One Year Apart

Slide 7

Seasonal Component
The seasonal component accounts for regular
patterns of variability within certain time periods,
such as a year.
The variability does not always correspond with
the seasons of the year (i.e. winter, spring,
summer, fall). There can be, for example, within-week or within-day seasonal behavior.
For example, daily traffic volume data show within-the-day seasonal behavior: peak levels (rush
hours), moderate flow, light flow.

Slide 8

Forecast Accuracy

The Components of a Time Series

Irregular Component
The irregular component is caused by short-term,
unanticipated and non-recurring factors that affect
the values of the time series.
This component is the residual, or catch-all,
factor that accounts for unexpected data values.
It is unpredictable.

Mean Squared Error (MSE)

It is the average of the sum of all the squared
forecast errors:

MSE = [ Σ t=1..T (Yt - Ft)² ] / T

Mean Absolute Deviation (MAD)

It is the average of the absolute values of all the
forecast errors:

MAD = [ Σ t=1..T |Yt - Ft| ] / T

One major difference between MSE and MAD is that


the MSE measure is influenced much more by large
forecast errors than by small errors.
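
To make the two measures concrete, here is a small Python sketch (an
illustration added here, not part of the original slides) that computes
both from paired actual and forecast values:

# Illustrative sketch: MSE and MAD for paired actual/forecast values.
def mse(actuals, forecasts):
    errors = [a - f for a, f in zip(actuals, forecasts)]
    return sum(e ** 2 for e in errors) / len(errors)

def mad(actuals, forecasts):
    errors = [a - f for a, f in zip(actuals, forecasts)]
    return sum(abs(e) for e in errors) / len(errors)

# A single large error inflates MSE much more than MAD:
print(mse([10, 10, 10], [10, 10, 4]))   # 12.0
print(mad([10, 10, 10], [10, 10, 4]))   # 2.0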
Slide 9

Slide 10

Using Smoothing Methods in Forecasting

Smoothing Methods

In cases in which the time series is fairly stable and


has no significant trend, seasonal, or cyclical effects,
one can use smoothing methods to average out the
irregular component of the time series.

Three common smoothing methods are:

Moving Averages
We use the average of the most recent n data
values in the time series as the forecast for the next
period.
The moving average calculation is
Moving Average = Σ(most recent n data values)/n

Moving Averages
Weighted Moving Averages
Exponential Smoothing

Slide 11

Slide 12

Example: Gasoline Sales

Example: Gasoline Sales

Data and Time Series Plot

Summary of Three-Week Moving Average


Calculations

Slide 13

Slide 14

Example: Gasoline Sales

Example: Rosco Drugs

Sales of Comfort brand headache medicine for the


past ten weeks at Rosco Drugs are shown below. If
Rosco Drugs uses a 3-period moving average to
forecast sales, what is the forecast for Week 11?
Week   Sales        Week   Sales
  1     110           6     120
  2     115           7     130
  3     125           8     115
  4     120           9     110
  5     125          10     130

Slide 15

Slide 16

Example: Rosco Drugs

Using Smoothing Methods in Forecasting

Week   Sales    3MA                          Forecast
  1     110
  2     115
  3     125    116.7  = (110 + 115 + 125)/3
  4     120    120.0                          116.7
  5     125    123.3                          120.0
  6     120    121.7                          123.3
  7     130    125.0                          121.7
  8     115    121.7                          125.0
  9     110    118.3                          121.7
 10     130    118.3                          118.3
 11                                           118.3
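
For reference, a short Python sketch (an added illustration, not in the
original slides) that reproduces the 3-week moving average forecasts
from the sales figures above, including the Week 11 forecast of 118.3:

# Illustrative sketch: 3-week moving average forecasts for the data above.
sales = [110, 115, 125, 120, 125, 120, 130, 115, 110, 130]   # weeks 1-10
n = 3

forecasts = {}                               # keyed by the week being forecast
for week in range(n + 1, len(sales) + 2):    # first forecast is for week 4
    forecasts[week] = sum(sales[week - 1 - n:week - 1]) / n

print(round(forecasts[11], 1))               # 118.3, the Week 11 forecast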

Weighted Moving Averages


To use this method we must first select the
number of data values to be included in the
average.
Next, we must choose the weight for each of the
data values.
The more recent observations are typically
given more weight than older observations.
For convenience, the weights usually sum to 1.

Slide 17

Slide 18

Using Smoothing Methods in Forecasting

Using Smoothing Methods in Forecasting

For example, a 3-period weighted moving average


would be computed as follows.
Ft + 1 = .5(125) + .3(115) + .2(110) = 119

In general,
Ft + 1 = w1(Yt ) + w2(Yt - 1) + w3(Yt - 2) + ...

Exponential Smoothing
It is a special case of the weighted moving
averages method in which we select only the
weight for the most recent observation.
The weight placed on the most recent observation
is the value of the smoothing constant, α.
The weights for the other data values are
computed automatically and become smaller at an
exponential rate as the observations become older.

where the sum of the weights (w values) is 1.
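
A minimal Python sketch of a 3-period weighted moving average (an added
illustration), using the weights .5, .3, and .2 from the example above:

# Illustrative sketch: 3-period weighted moving average with weights .5/.3/.2.
def weighted_ma_forecast(history, weights=(0.5, 0.3, 0.2)):
    recent = history[-len(weights):][::-1]          # most recent value first
    return sum(w * y for w, y in zip(weights, recent))

print(weighted_ma_forecast([110, 115, 125]))        # .5(125)+.3(115)+.2(110) = 119.0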

Slide 19

Slide 20

Using Smoothing Methods in Forecasting

Using Smoothing Methods in Forecasting

Exponential Smoothing

Ft+1 = αYt + (1 - α)Ft
where

Ft + 1 = forecast value for period t + 1


Yt = actual value for period t
Ft = forecast value for period t
(To start the calculations, we let F1 = Y1)
α = smoothing constant (0 < α < 1)

Exponential Smoothing
With some algebraic manipulation, we can rewrite
Ft+1 = αYt + (1 - α)Ft as:
Ft+1 = Ft + α(Yt - Ft)

We see that the new forecast Ft+1 is equal to the
previous forecast Ft plus an adjustment, which is
α times the most recent forecast error, Yt - Ft.
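
As an added illustration (not from the slides), the recursion can be
coded directly; applying it to the Rosco Drugs sales used in the next
example also reproduces the MSE comparison that follows:

# Illustrative sketch: the exponential smoothing recursion and its MSE.
def exp_smooth_forecasts(y, alpha):
    f = [y[0]]                                   # F1 = Y1
    for t in range(1, len(y)):
        f.append(alpha * y[t - 1] + (1 - alpha) * f[t - 1])
    return f                                     # [F1, F2, ..., Fn]

sales = [110, 115, 125, 120, 125, 120, 130, 115, 110, 130]
for a in (0.1, 0.8):
    f = exp_smooth_forecasts(sales, a)
    mse = sum((y - fc) ** 2 for y, fc in zip(sales[1:], f[1:])) / (len(sales) - 1)
    print(a, round(mse, 2))                      # roughly 108.25 and 94.17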

Slide 21

Slide 22

Example: Rosco Drugs

Example: Rosco Drugs

Sales of Comfort brand headache medicine for the


past ten weeks at Rosco Drugs are shown on the next
slide. If Rosco Drugs uses exponential smoothing to
forecast sales, which value for the smoothing
constant, .1 or .8, gives better forecasts?
Week   Sales        Week   Sales
  1     110           6     120
  2     115           7     130
  3     125           8     115
  4     120           9     110
  5     125          10     130
Slide 23

Exponential Smoothing (α = .1, 1 - α = .9)

F1                                          = 110
F2  = .1Y1 + .9F1 = .1(110) + .9(110)       = 110
F3  = .1Y2 + .9F2 = .1(115) + .9(110)       = 110.5
F4  = .1Y3 + .9F3 = .1(125) + .9(110.5)     = 111.95
F5  = .1Y4 + .9F4 = .1(120) + .9(111.95)    = 112.76
F6  = .1Y5 + .9F5 = .1(125) + .9(112.76)    = 113.98
F7  = .1Y6 + .9F6 = .1(120) + .9(113.98)    = 114.58
F8  = .1Y7 + .9F7 = .1(130) + .9(114.58)    = 116.12
F9  = .1Y8 + .9F8 = .1(115) + .9(116.12)    = 116.01
F10 = .1Y9 + .9F9 = .1(110) + .9(116.01)    = 115.41

Slide 24

Example: Rosco Drugs

Example: Rosco Drugs

Exponential Smoothing (α = .8, 1 - α = .2)

F1
= 110
F2 = .8(110) + .2(110) = 110
F3 = .8(115) + .2(110) = 114
F4 = .8(125) + .2(114) = 122.80
F5 = .8(120) + .2(122.80) = 120.56
F6 = .8(125) + .2(120.56) = 124.11
F7 = .8(120) + .2(124.11) = 120.82
F8 = .8(130) + .2(120.82) = 128.16
F9 = .8(115) + .2(128.16) = 117.63
F10= .8(110) + .2(117.63) = 111.53

Mean Squared Error


In order to determine which smoothing constant
gives the better performance, we calculate, for
each, the mean squared error for the nine weeks of
forecasts, weeks 2 through 10.
[(Y2-F2)2 + (Y3-F3)2 + (Y4-F4)2 + . . . + (Y10-F10)2]/9

Slide 25

Slide 26

Example: Rosco Drugs

             α = .1                     α = .8
Week   Yt    Ft       (Yt - Ft)²        Ft       (Yt - Ft)²
  2    115   110.00      25.00          110.00      25.00
  3    125   110.50     210.25          114.00     121.00
  4    120   111.95      64.80          122.80       7.84
  5    125   112.76     149.94          120.56      19.71
  6    120   113.98      36.25          124.11      16.91
  7    130   114.58     237.73          120.82      84.23
  8    115   116.12       1.26          128.16     173.30
  9    110   116.01      36.12          117.63      58.26
 10    130   115.41     212.87          111.53     341.27
             Sum        974.22          Sum        847.52
             MSE = Sum/9 = 108.25       MSE = Sum/9 = 94.17

Since 94.17 < 108.25, the smoothing constant α = .8 provides the better
forecasts for this series.

Example: Gasoline Sales

Summary of the Exponential Smoothing Forecasts for Gasoline Sales
with Smoothing Constant α = .2
Slide 27

Slide 28

Example: Gasoline Sales

Example: Gasoline Sales

Actual and Forecast Gasoline Sales Time Series with


Smoothing Constant α = .2

MSE Computations for Forecasting Gasoline Sales


with α = .2

Slide 29

Example: Gasoline Sales

Slide 30

SPSS(Moving Average)

MSE Computations for Forecasting Gasoline Sales


with α = .3

Transform
Create Time Series
New Variable(s): sales
Function: Prior moving average
Span: 3
Change
OK

Slide 31

Slide 32

SPSS(Exponential Smoothing)

Questions

Analyze
Time Series
Exponential Smoothing
Variables: sales
Parameters
General (Alpha)
Value: 0.2
Initial Values:
Custom:
Starting: 17
Trend: 0
Continue
OK

1. The time series component which reflects a regular, multi-year
   pattern of being above and below the trend line is
   a. a trend
   b. seasonal
   c. cyclical
   d. irregular
2. Below you are given the first three values of a time series.
   Time Period:        1   2   3
   Time Series Value:  18  20  25
   Using a 3-period moving average, the forecasted value for
   period 4 is
   a. 19.72
   b. 20
   c. 21
   d. 25

Smoothing Parameters
Series   Alpha (Level)   Sums of Squared Errors   df error
sales    .20000          98.80454                 11

Slide 33

Slide 34

Questions

Using Trend Projection in Forecasting

3. Below you are given the first three values of a time series.
   Time Period:        1   2   3
   Time Series Value:  18  20  25
   Using exponential smoothing (α = 0.2), the forecasted value for
   period 4 is
   a. 19.72
   b. 20
   c. 21
   d. 25

Slide 35

If a time series exhibits a linear trend, the method of


least squares may be used to determine a trend line
(projection) for future forecasts.

Equation for Linear Trend


Tt = b0 + b1t
where
Tt = trend value in period t
b0 = intercept of the trend line
b1 = slope of the trend line
t = time
Note: t is the independent variable.
Slide 36

Using Trend Projection in Forecasting


Computing the Slope (b1) and Intercept (b0)

Some Possible Forms of Nonlinear Trend Patterns

b1 = [ΣtYt - (Σt)(ΣYt)/n] / [Σt² - (Σt)²/n]

b0 = ΣYt /n - b1(Σt /n)
   (i.e. the mean of Yt minus b1 times the mean of t)
where
Yt = actual value in period t
n = number of periods in time series

Slide 37

Slide 38

Example: Sailboat Sales, Inc.

Example: Sailboat Sales, Inc.

Sailboat Sales is a major marine dealer in Chicago.


The firm has experienced tremendous sales growth in
the past several years. Management would like to
develop a forecasting method that would enable
them to better control inventories.
The annual sales, in number of boats, for one
particular sailboat model for the past five years are:

Year    1    2    3    4    5
Sales  11   14   20   26   34

(Time series plot of SALES versus TIME for the five years)

Slide 39

Slide 40

Example: Sailboat Sales, Inc.

Example: Sailboat Sales, Inc.

Linear Trend Equation

          t     Yt     tYt     t²
          1     11      11      1
          2     14      28      4
          3     20      60      9
          4     26     104     16
          5     34     170     25
Total    15    105     373     55

Trend Projection

b1 = [373 - (15)(105)/5] / [55 - (15)²/5] = 5.8

b0 = 105/5 - 5.8(15/5) = 3.6

Tt = 3.6 + 5.8t
T6 = 3.6 + 5.8(6) = 38.4
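
A short Python sketch (an added illustration) reproducing the slope,
intercept, and year-6 projection from the sums above:

# Illustrative sketch: least-squares trend line for the Sailboat Sales data.
t = [1, 2, 3, 4, 5]
y = [11, 14, 20, 26, 34]
n = len(t)

b1 = ((sum(ti * yi for ti, yi in zip(t, y)) - sum(t) * sum(y) / n)
      / (sum(ti ** 2 for ti in t) - sum(t) ** 2 / n))
b0 = sum(y) / n - b1 * sum(t) / n

print(b1, b0)          # 5.8 3.6
print(b0 + b1 * 6)     # 38.4, the year-6 trend projection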

Slide 41

Slide 42

Example: Bicycle Sales

Example: Bicycle Sales

Slide 43

Trend Represented by a Linear Function for Bicycle


Sales

Slide 44

SPSS

SPSS

Analyze
Regression
Linear
Dependent: Sales
Independent (s): Week
OK

Model Summary
Model   R       R Square   Adjusted R Square   Std. Error of the Estimate
1       .875a   .765       .735                1.95895
a. Predictors: (Constant), week

ANOVA(b)
Model 1        Sum of Squares   df   Mean Square   F        Sig.
  Regression    99.825           1   99.825        26.013   .001a
  Residual      30.700           8    3.838
  Total        130.525           9
a. Predictors: (Constant), week
b. Dependent Variable: sales

Slide 45

Coefficients(a)
               Unstandardized Coefficients   Standardized Coefficients
Model 1        B        Std. Error           Beta      t        Sig.
  (Constant)   20.400   1.338                          15.244   .000
  week          1.100    .216                .875       5.100   .001
a. Dependent Variable: sales
Slide 46

Trend and Seasonal Components


in Forecasting

Sig.
.001a

Multiplicative Model

Multiplicative Model
Calculating the Seasonal Indexes
Deseasonalizing the Time Series
Using the Deseasonalizing Time Series to Identify
Trend
Seasonal Adjustments
Cyclical Component

Using Tt , St , and It to identify the trend, seasonal,


and irregular components at time t, we describe the
time series value Yt by the following multiplicative
time series model:
Yt = Tt x St x It

Slide 47

Tt is measured in units of the item being forecast.


St and It are measured in relative terms, with values
above 1.00 indicating effects above the trend and
values below 1.00 indicating effects below the trend.

Slide 48

Steps of Multiplicative Time Series Model

1. Calculate the centered moving averages (CMAs).
2. Center the CMAs on integer-valued periods.
3. Determine the seasonal and irregular factors (St It).
4. Determine the average seasonal factors.
5. Scale the seasonal factors (St).
6. Determine the deseasonalized data.
7. Determine a trend line of the deseasonalized data.
8. Determine the deseasonalized predictions.
9. Take into account the seasonality.

Deseasonalizing the Time Series

The purpose of finding seasonal indexes is to remove


the seasonal effects from the time series.
This process is called deseasonalizing the time series.
By dividing each time series observation by the
corresponding seasonal index, the result is a
deseasonalized time series.
With deseasonalized data, relevant comparisons can
be made between observations in successive periods.

Slide 49

Slide 50

Using the Deseasonalizing Time Series


to Identify Trend

Seasonal Adjustments

To identify the linear trend, we use the linear


regression procedure covered earlier; in this case, the
data are the deseasonalized time series values.
In other words, Yt now refers to the deseasonalized
time series value at time t and not to the actual value
of the time series.
The resulting line equation is used to make trend
projections, as it was earlier.

Slide 51

The final step in developing the forecast is to use the


seasonal index to adjust the trend projection.
The forecast for period t, season s, is obtained by
multiplying the trend projection for period t by the
seasonal index for season s.
Yt,s = Is[b0 + b1(t)]

Slide 52

Example: Terry's Tie Shop

Example: Terry's Tie Shop

Business at Terry's Tie Shop can be viewed as falling


into three distinct seasons: (1) Christmas (November
and December); (2) Father's Day (late May to mid
June); and (3) all other times. Average weekly sales
($) during each of the three seasons during the past
four years are shown on the next slide.

                  Season
Year        1        2        3
  1      1856     2012      985
  2      1995     2168     1072
  3      2241     2306     1105
  4      2280     2408     1120

Determine a forecast for the average weekly sales in


year 5 for each of the three seasons.

Slide 53

Slide 54

Example: Terry's Tie Shop

Example: Terry's Tie Shop

Step 1. Calculate the centered moving averages


There are three distinct seasons in each year.
Hence, take a three-season moving average to
eliminate seasonal and irregular factors. For
example:

1st CMA = (1856 + 2012 + 985)/3 = 1617.67


2nd CMA = (2012 + 985 + 1995)/3 = 1664.00
Etc.

Slide 55

Step 2. Center the CMAs on integer-valued periods.


The first centered moving average computed in
step 1 (1617.67) will be centered on season 2 of
year 1.
Note that the moving averages from step 1 center
themselves on integer-valued periods because n is
an odd number.

Slide 56

Example: Terry's Tie Shop

Year  Season  Dollar Sales (Yt)  Moving Average
 1      1         1856
        2         2012           1617.67  = (1856 + 2012 + 985)/3
        3          985           1664.00
 2      1         1995           1716.00
        2         2168           1745.00
        3         1072           1827.00
 3      1         2241           1873.00
        2         2306           1884.00
        3         1105           1897.00
 4      1         2280           1931.00
        2         2408           1936.00
        3         1120

Example: Terry's Tie Shop

Step 3. Determine the seasonal and irregular factors (St It).
The moving average contains the trend (and any cyclical) component,
so dividing each observation by the corresponding moving average
isolates the combined seasonal-irregular factor. For each period t:
St It = Yt /(Moving Average for period t)
(since Yt = Tt x St x It)

Slide 57

Slide 58

Example: Terry's Tie Shop

Year  Season  Dollar Sales (Yt)  Moving Average   St It
 1      1         1856
        2         2012           1617.67          1.244  = 2012/1617.67
        3          985           1664.00           .592
 2      1         1995           1716.00          1.163
        2         2168           1745.00          1.242
        3         1072           1827.00           .587
 3      1         2241           1873.00          1.196
        2         2306           1884.00          1.224
        3         1105           1897.00           .582
 4      1         2280           1931.00          1.181
        2         2408           1936.00          1.244
        3         1120

Example: Terry's Tie Shop

Step 4. Determine the average seasonal factors.


Averaging all St It values corresponding to that
season:
Season 1: (1.163 + 1.196 + 1.181)/3          = 1.180
Season 2: (1.244 + 1.242 + 1.224 + 1.244)/4  = 1.238
Season 3: (.592 + .587 + .582)/3             =  .587
                                       Total = 3.005

Slide 59

Slide 60

Example: Terry's Tie Shop

Example: Terry's Tie Shop

Step 5. Scale the seasonal factors (St ).


Average the seasonal factors = (1.180 + 1.238 +
.587)/3 = 1.002. Then, divide each seasonal factor
by the average of the seasonal factors.
Season 1: 1.180/1.002 = 1.178
Season 2: 1.238/1.002 = 1.236
Season 3: .587/1.002 = .586

Year  Season  Dollar Sales (Yt)  Moving Average   St It   Scaled St
 1      1         1856                                      1.178
        2         2012           1617.67          1.244     1.236
        3          985           1664.00           .592      .586
 2      1         1995           1716.00          1.163     1.178
        2         2168           1745.00          1.242     1.236
        3         1072           1827.00           .587      .586
 3      1         2241           1873.00          1.196     1.178
        2         2306           1884.00          1.224     1.236
        3         1105           1897.00           .582      .586
 4      1         2280           1931.00          1.181     1.178
        2         2408           1936.00          1.244     1.236
        3         1120                                       .586

(The scaled seasonal factors sum to 3.000)

Slide 61

Slide 62

Example: Terry's Tie Shop

Example: Terry's Tie Shop

Step 6. Determine the deseasonalized data.


Divide the data point values, Yt , by St .

Slide 63

Year  Season  Dollar Sales (Yt)  Moving Average   St It   Scaled St   Yt/St
 1      1         1856                                      1.178      1576
        2         2012           1617.67          1.244     1.236      1628
        3          985           1664.00           .592      .586      1681
 2      1         1995           1716.00          1.163     1.178      1694
        2         2168           1745.00          1.242     1.236      1754
        3         1072           1827.00           .587      .586      1829
 3      1         2241           1873.00          1.196     1.178      1902
        2         2306           1884.00          1.224     1.236      1866
        3         1105           1897.00           .582      .586      1886
 4      1         2280           1931.00          1.181     1.178      1935
        2         2408           1936.00          1.244     1.236      1948
        3         1120                                       .586      1911
Slide 64

Example: Terry's Tie Shop

Example: Terry's Tie Shop

Step 7. Determine a trend line of the deseasonalized data.
Using the least squares method for t = 1, 2, ..., 12 gives:
Tt = 1580.11 + 33.96t

Step 8. Determine the deseasonalized predictions.
Substitute t = 13, 14, and 15 into the least squares equation:
T13 = 1580.11 + (33.96)(13) = 2022
T14 = 1580.11 + (33.96)(14) = 2056
T15 = 1580.11 + (33.96)(15) = 2090

Slide 65

Slide 66

Example: Terry's Tie Shop

Example: Television Set Sales

Step 9. Take into account the seasonality.


Multiply each deseasonalized prediction by its
seasonal factor to give the following forecasts for
year 5:

Example: Television Set Sales (in thousands of units)

Season 1: (1.178)(2022) = 2382


Season 2: (1.236)(2056) = 2541
Season 3: ( .586)(2090) = 1225
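
As an added illustration, the nine steps can be scripted end to end; the
following Python sketch reproduces the Terry's Tie Shop results to within
rounding:

# Illustrative sketch: the nine steps scripted for the Terry's Tie Shop data
# (3 seasons per year, 4 years); results match the slides to within rounding.
sales = [1856, 2012, 985, 1995, 2168, 1072,
         2241, 2306, 1105, 2280, 2408, 1120]
L = 3                                            # seasons per year

# Steps 1-2: centered moving averages (odd span, so already centered)
cma = {t: sum(sales[t - 1:t + 2]) / L for t in range(1, len(sales) - 1)}

# Step 3: seasonal-irregular factors St*It = Yt / CMA
si = {t: sales[t] / cma[t] for t in cma}

# Steps 4-5: average the factors by season, then rescale them to sum to L
raw = [sum(v for t, v in si.items() if t % L == s)
       / len([t for t in si if t % L == s]) for s in range(L)]
seasonal = [f * L / sum(raw) for f in raw]       # scaled seasonal indexes

# Step 6: deseasonalize
deseason = [yv / seasonal[i % L] for i, yv in enumerate(sales)]

# Step 7: least-squares trend on the deseasonalized series
n = len(sales)
t_vals = list(range(1, n + 1))
b1 = ((sum(t * d for t, d in zip(t_vals, deseason)) - sum(t_vals) * sum(deseason) / n)
      / (sum(t * t for t in t_vals) - sum(t_vals) ** 2 / n))
b0 = sum(deseason) / n - b1 * sum(t_vals) / n

# Steps 8-9: project the trend for periods 13-15 and reseasonalize (year 5)
for t in (13, 14, 15):
    print(round(seasonal[(t - 1) % L] * (b0 + b1 * t)))   # close to 2382, 2541, 1225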

Slide 67

Slide 68

Example: Television Set Sales

Example: Television Set Sales

Calculate the centered moving averages (CMAs)

Quarterly Television Set Sales Time Series and


Centered Moving Average

Slide 69

Slide 70

Example: Television Set Sales

Example: Television Set Sales

Seasonal Irregular Values: The Television Set Sales

Slide 71

The purpose of finding seasonal indexes is to remove


the seasonal effects from a time series. This process is
referred to as deseasonalizing the time series.
Seasonal Index Calculations: The Television Set Sales

Slide 72

Example: Television Set Sales

Example: Television Set Sales

Deseasonalized Values for The Television Set Sales


Time Series

Deseasonalized Television Set Sales Time Series

Slide 73

Slide 74

Example: Television Set Sales

SPSS

The linear trend component of the time series is


Tt = 5.101 + 0.148t
Quarterly Forecasts for The Television Set Sales

Step 1 & 2: Calculate the Centered Moving Average

Transform
Create Time Series
New Variable(s): Sales
Function: Centered moving average
Span: 4
Change
OK
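
Equivalently (an added illustration rather than part of the SPSS
procedure), the centered moving average can be computed directly in
Python, assuming sales holds the 16 quarterly values in order:

# Illustrative sketch: 4-quarter centered moving average in plain Python,
# assuming `sales` is a list of the quarterly values, oldest first.
def centered_ma(sales, span=4):
    ma = [sum(sales[i:i + span]) / span for i in range(len(sales) - span + 1)]
    # For an even span, average consecutive moving averages to center them.
    return [(a + b) / 2 for a, b in zip(ma, ma[1:])]

# centered_ma(sales)[0] lines up with quarter 3, the next with quarter 4, etc.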

Slide 75

Slide 76

SPSS

SPSS

Step 3: Determine the seasonal & irregular factors (St


It ). (SIV)
Transform
Compute
Target Variable: SIV
Numeric Expression: Sales / Sales_1
OK

Step 4: Determine the average seasonal factors.

Analyze
Descriptive Statistics
Explore
Dependent List: SIV
Factor List: Quarter
OK
Quarter   Mean SIV
 1.00       .93
 2.00       .84
 3.00      1.09
 4.00      1.14

Slide 77

Slide 78

SPSS

SPSS

Step 5. Scale the seasonal factors (SIS)

Insert a new variable SI containing the quarterly means repeated
for each year:
SI: 0.93  0.84  1.09  1.14  .....

Transform
Compute
Target Variable: SIS
Numeric Expression: SI/((0.93+0.84+1.09+1.14)/4)
OK
Slide 79

Step 6. Determine the deseasonalized data (DS)

Transform
Compute
Target Variable: DS
Numeric Expression: Sales / SIS
OK

Slide 80

SPSS

SPSS

Step 7. Determine a trend line of the deseasonalized data.

Insert a new variable t with values 1, 2, 3, ..., 16.

Analyze
Regression
Linear
Dependent: DS
Independent (s): t
OK

Coefficients(a)
               Unstandardized Coefficients   Standardized Coefficients
Model 1        B        Std. Error           Beta      t        Sig.
  (Constant)   5.105    .113                           45.072   .000
  t             .148    .012                 .959      12.601   .000
a. Dependent Variable: DS

The linear trend component of the time series is
Tt = 5.101 + 0.148t
Slide 81

Slide 82

Models Based on Monthly Data

Cyclical Component

Many businesses use monthly rather than quarterly


forecasts.
The preceding procedures can be applied with minor
modifications:
A 12-month moving average replaces the 4-quarter moving average.
12 monthly, rather than 4 quarterly, seasonal
indexes must be computed.
Otherwise, the procedures are identical.

Slide 83

The multiplicative model can be expanded to include


a cyclical component that is expressed as a
percentage of trend.

Yt = Tt x Ct x St x It

However, there are difficulties in including a cyclical


component:
A cycle can span several (many) years and enough
data must be obtained to estimate the cyclical
component.
Cycles usually vary in length.

Slide 84

Regression Analysis

Regression Analysis

One or more independent variables can be used to


predict the value of a single dependent variable.
The time series value that we want to forecast is the
dependent variable.
The independent variable(s) might include any
combination of the following:
Previous values of the time series variable itself
Economic/demographic variables
Time variables

An autoregressive model is a regression model in


which the independent variables are previous values
of the time series being forecast.
A causal forecasting model uses other time series
related to the one being forecast in an effort to
explain the cause of a time series behavior.

Slide 85

Slide 86

Regression Analysis

Regression Analysis

For a function involving k independent variables, we


use the following notation:
Yt = value of the time series in period t
x1t = value of independent variable 1 in period t
x2t = value of independent variable 2 in period t

xkt = value of independent variable k in period t

Slide 87

In forecasting sales of refrigerators, we might select


the following five independent variables:
x1t = price of refrigerator in period t
x2t = total industry sales in period t - 1
x3t = number of new-house building permits
in period t - 1
x4t = population forecast for period t
x5t = advertising budget for period t
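
As an added illustration of fitting such a causal regression, the sketch
below uses numpy least squares; the variable values are made up for
demonstration and are not from the slides:

# Illustrative sketch: a causal regression fitted by least squares. The
# numbers are hypothetical illustration data.
import numpy as np

price   = np.array([499, 479, 512, 530, 505, 498])   # x1t (hypothetical)
permits = np.array([120, 135, 110, 100, 125, 130])   # x3,t-1 (hypothetical)
sales   = np.array([210, 225, 200, 190, 215, 222])   # Yt (hypothetical)

X = np.column_stack([np.ones(len(sales)), price, permits])
coef, *_ = np.linalg.lstsq(X, sales, rcond=None)
print(coef)                                           # intercept and slope estimates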

Slide 88

Regression Analysis

Univariate Time Series Models

The n periods of data necessary to develop the


estimated regression equation would appear as:

Period    Time Series         Value of Independent Variables
  (t)        (Yt)           (x1t)   (x2t)   (x3t)   . .   (xkt)

   1          Y1             x11     x21     x31    . .    xk1
   2          Y2             x12     x22     x32    . .    xk2
   .          .               .       .       .             .
   .          .               .       .       .             .
   n          Yn             x1n     x2n     x3n    . .    xkn

Where we attempt to predict returns using only information
contained in their past values.

A Strictly Stationary Process
A strictly stationary process is one where
P{y(t1) ≤ b1, ..., y(tn) ≤ bn} = P{y(t1+m) ≤ b1, ..., y(tn+m) ≤ bn}
i.e. the probability measure for the sequence {yt} is the same as that
for {yt+m} for all m.

A Weakly Stationary Process
If a series satisfies the next three equations, it is said to be weakly
or covariance stationary:
1. E(yt) = μ,  t = 1, 2, ..., ∞
2. E[(yt - μ)(yt - μ)] = σ² < ∞
3. E[(yt1 - μ)(yt2 - μ)] = γ(t2 - t1),  for all t1, t2

Slide 89

Slide 90

Moving Average Processes

Let ut (t = 1, 2, 3, ...) be a sequence of independently and
identically distributed (iid) random variables with E(ut) = 0 and
Var(ut) = σ². Then

yt = μ + ut + θ1 u(t-1) + θ2 u(t-2) + ... + θq u(t-q)

is a qth-order moving average model, MA(q).

Its properties are
E(yt) = μ;  Var(yt) = γ0 = (1 + θ1² + θ2² + ... + θq²) σ²
Covariances:
γs = (θs + θ(s+1) θ1 + θ(s+2) θ2 + ... + θq θ(q-s)) σ²   for s = 1, 2, ..., q
γs = 0                                                   for s > q

Slide 91

Autoregressive Processes

An autoregressive model of order p, an AR(p), can be expressed as

yt = μ + φ1 y(t-1) + φ2 y(t-2) + ... + φp y(t-p) + ut

Or, using the lag operator notation Lyt = y(t-1) and Li yt = y(t-i):

yt = μ + Σ(i=1..p) φi y(t-i) + ut
or  yt = μ + Σ(i=1..p) φi Li yt + ut
or  φ(L) yt = μ + ut,  where φ(L) = 1 - (φ1 L + φ2 L² + ... + φp L^p)
Slide 92
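
As an added illustration, the following Python sketch simulates an MA(1)
and an AR(1) series from iid shocks and checks the MA(1) variance formula
above (the parameter values theta1 and phi1 are arbitrary choices):

# Illustrative sketch: simulating MA(1) and AR(1) series from iid shocks
# (mu = 0 for simplicity).
import numpy as np

rng = np.random.default_rng(0)
u = rng.normal(0.0, 1.0, size=5000)       # iid shocks: E(u)=0, Var(u)=1

theta1, phi1 = 0.6, 0.7
ma1 = np.empty_like(u)
ar1 = np.empty_like(u)
ma1[0] = u[0]
ar1[0] = u[0]
for t in range(1, len(u)):
    ma1[t] = u[t] + theta1 * u[t - 1]     # yt = ut + theta1*u(t-1)
    ar1[t] = phi1 * ar1[t - 1] + u[t]     # yt = phi1*y(t-1) + ut

print(ma1.var(), 1 + theta1 ** 2)         # sample variance vs (1 + theta1^2)*sigma^2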

Building ARMA Models


- The Box Jenkins Approach

ARMA Processes

By combining the AR(p) and MA(q) models, we can obtain an


ARMA(p,q) model:

φ(L) yt = μ + θ(L) ut

where
φ(L) = 1 - φ1 L - φ2 L² - ... - φp L^p
and
θ(L) = 1 + θ1 L + θ2 L² + ... + θq L^q

or

yt = μ + φ1 y(t-1) + φ2 y(t-2) + ... + φp y(t-p)
       + θ1 u(t-1) + θ2 u(t-2) + ... + θq u(t-q) + ut

with

E(ut) = 0;  E(ut²) = σ²;  E(ut us) = 0 for t ≠ s

Box and Jenkins (1970) were the first to approach the task of
estimating an ARMA model in a systematic manner. There are
3 steps to their approach:
1. Identification
2. Estimation
3. Model diagnostic checking

Step 1:
- Involves determining the order of the model.
- Use of graphical procedures
- A better procedure is now available

Slide 93

Slide 94

Building ARMA Models


- The Box Jenkins Approach (contd)

ARMAX Models

Step 2:
- Estimation of the parameters
- Can be done using least squares or maximum likelihood
depending on the model.

Step 3:
- Model checking
Box and Jenkins suggest 2 methods:
- deliberate overfitting
- residual diagnostics

An ARMAX model augments an ARMA model with an exogenous explanatory
variable x, included either contemporaneously or with a lag:

yt = μ + φ1 y(t-1) + θ1 u(t-1) + ut + β xt
yt = μ + φ1 y(t-1) + θ1 u(t-1) + ut + β x(t-1)

Slide 95

Slide 96

Example: (Regression)

ARIMA Models

As distinct from ARMA models. The I stands for


integrated.
An integrated autoregressive process is one with a
characteristic root on the unit circle.
Typically researchers difference the variable as necessary
and then build an ARMA model on those differenced
variables.
An ARMA(p,q) model in the variable differenced d times
is equivalent to an ARIMA(p,d,q) model on the original
data.
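
As an added illustration (assuming the statsmodels package is available),
an ARIMA(p,d,q) model can be fitted in a few lines of Python; the series
below is simulated purely for demonstration:

# Illustrative sketch: fitting an ARIMA(1,1,1) model with statsmodels.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(1)
y = np.cumsum(rng.normal(0.1, 1.0, size=200))   # a random walk with drift

model = ARIMA(y, order=(1, 1, 1))               # ARMA(1,1) fitted to the first difference
result = model.fit()
print(result.summary())
print(result.forecast(steps=3))                 # forecasts for the next 3 periods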

Model Summary
Model   R       R Square   Adjusted R Square   Std. Error of the Estimate
1       .465a   .216       .212                .09221
a. Predictors: (Constant), DOW_R
b. Dependent Variable: MIC_R

Coefficients(a)
               Unstandardized Coefficients   Standardized Coefficients
Model 1        B           Std. Error        Beta      t        Sig.
  (Constant)   1.192E-02   .007                        1.805    .073
  DOW_R        1.144       .154              .465      7.443    .000
a. Dependent Variable: MIC_R

Slide 97

Slide 98

Example: (Regression + ARIMA(0,0,1))

FINAL PARAMETERS:

Number of residuals   203
Standard error        .09152457
Log likelihood        198.84691
AIC                   -391.69383
SBC                   -381.75421

Analysis of Variance:
             DF   Adj. Sum of Squares   Residual Variance
Residuals   200             1.6755434          .00837675

Variables in the Model:
              B           SEB          T-RATIO     APPROX. PROB.
MA1           .1524335    .06975689    2.1852102   .03003531
DOW_R        1.1308487    .15280021    7.4008322   .00000000
CONSTANT      .0120568    .00559740    2.1539993   .03243638

(Sequence plot of MIC_R residuals and unstandardized predicted values
against sequence number)

Slide 99

Slide 100

(Sequence plot of MIC_R and the fit for MIC_R from ARIMA, MOD_13 CON
against sequence number)

Forecasting in Econometrics

Forecasting = prediction.
An important test of the adequacy of a model.
e.g.
- Forecasting tomorrow's return on a particular share
- Forecasting the price of a house given its characteristics
- Forecasting the riskiness of a portfolio over the next year
- Forecasting the volatility of bond returns

We can distinguish two approaches:


- Econometric (structural) forecasting
- Time series forecasting

Slide 101

Slide 102

Qualitative Approaches to Forecasting

In-Sample Versus Out-of-Sample

Expect the forecast of the model to be good in-sample.

Say we have some data - e.g. monthly FTSE returns for 120
months: 1990M1 to 1999M12. We could use all of it to build
the model, or keep some observations back:

Slide 103

Delphi Method
It is an attempt to develop forecasts through group
consensus.
The goal is to produce a relatively narrow spread of
opinions within which the majority of the panel of experts
concur.
Three Steps:
1. A panel of experts, each of whom is physically
separated from the others and is anonymous, is asked
to respond to a sequential series of questionnaires.
2. After each questionnaire, the responses are tabulated
and the information and opinions of the entire group
are made known to each of the other panel members so
that they may revise their previous forecast response.
3. The process continues until some degree of consensus
is achieved.
Slide 104

Qualitative Approaches to Forecasting

Qualitative Approaches to Forecasting

Expert Judgment
Experts individually consider information that
they believe will influence the variable; then they
combine their conclusions into a forecast.
No two experts are likely to consider the same
information in the same way.

Slide 105

Assignments of Chapter 18

#32, #41, #44

Slide 107

Scenario Writing
This procedure involves developing several
conceptual scenarios, each based on a well-defined
set of assumptions.
After several different scenarios have been
developed, the decision maker determines which
is most likely to occur in the future and makes
decisions accordingly.
Intuitive Approaches
A committee or panel seeks to develop new ideas
or solve complex problems through a series of
brainstorming sessions.
Individuals are free to present any idea without
being concerned about criticism or relevancy.
Slide 106
