With 1 text-figure
Printed in Great Britain
SUMMARY
Time series which are encountered in meteorology exhibit non-stationary behaviour.
If a variable is observed at the same time each day, it appears to be stationary over a period
of a few weeks, but there is a seasonal variation in structure (Monin, 1963). For example,
1. INTRODUCTION
In the strict sense, a stochastic process, X(t), has periodic structure if the joint probability
distribution of X(t1), X(t2), ..., X(tn) is the same as the joint probability distribution of
X(t1 + kT), X(t2 + kT), ..., X(tn + kT) for any n time points ti, any integer k, and some
positive number T called the period. A strictly stationary process is a special case since the
joint probability distributions are the same for any value of T. Gladyshev (1961, 1963)
defines a periodically correlated random sequence as a process which has periodic structure
in the usual wide sense. Assuming that E{X²(t)} is finite for all t, let m(t) = E{X(t)}. Then
m(t) = m(t + kT) for all t and any integer k. The covariance function
R(s,t) = E[{X(s)-m(s)}{X(t)-m(t)}]
satisfies R(s, t) = R(s + kT, t + kT) for all s and t and any integer k. If t is a continuous
parameter it will be assumed that X(t) is continuous in quadratic mean. Gladyshev (1963)
also shows that a stochastic process with periodic structure is harmonizable (Loeve, 1963,
p. 474) since it has the representation
X(t) − m(t) = ∫ e^{2πift} dZ(f).

Here Z(f) is a random function with zero mean, and the limits of integration are taken as
(−½, ½) if t = 0, ±1, ±2, ..., and (−∞, ∞) if t is a continuous parameter. The covariance
function of a harmonizable process has the representation

R(s, t) = ∫∫ e^{2πi(fs − gt)} dF(f, g),

where F(f, g) is a covariance function of bounded variation.
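As a concrete illustration of these definitions, a periodically correlated sequence can be simulated and its periodic mean recovered by averaging observations taken at the same phase of the cycle. The following Python sketch is not from the paper; the mean and scale functions are invented for illustration.

```python
import numpy as np

# Hypothetical example: X(t) = m(t) + s(t) Z(t), where m and s have period T
# and Z(t) is white noise, so m(t) = m(t + T) and R(s, t) = R(s + T, t + T).
rng = np.random.default_rng(0)
T = 12                       # period
n_periods = 2000
N = T * n_periods
t = np.arange(N)
m = 10.0 + 3.0 * np.cos(2 * np.pi * t / T)    # periodic mean
s = 1.0 + 0.5 * np.sin(2 * np.pi * t / T)     # periodic scale
X = m + s * rng.standard_normal(N)

# Estimate m(t) by averaging observations at the same phase of the cycle
m_hat = X.reshape(n_periods, T).mean(axis=0)
```

Averaging over cycles recovers one period of the mean function because, by the periodicity of the structure, observations one period apart are identically distributed.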
2. AUTOREGRESSIVE REPRESENTATION
A practical method of predicting time series is to fit a finite autoregression by least squares.
If a discrete-time stationary process has a spectral density bounded away from zero, which
will always be true for observed series because of observational error or rounding to a
finite number of decimals, the spectrum can be approximated by a finite autoregression. If
the order of the autoregression is not decided before the data are analysed, this is, in a sense,
a non-parametric procedure for linear prediction. Using the step-wise procedure suggested
by Whittle (1963), autoregressions of increasing order can be estimated using a minimum of
calculation.
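The least-squares fitting of autoregressions of increasing order can be sketched as follows. The simulated AR(2) series is illustrative, and the fit here simply refits from scratch at each order; Whittle's stepwise recursion, which the paper actually uses, avoids that recomputation.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 500
X = np.zeros(N)
for t in range(2, N):                        # simulate an AR(2) for illustration
    X[t] = 0.6 * X[t - 1] - 0.3 * X[t - 2] + rng.standard_normal()

def fit_ar(X, p):
    """Least-squares fit of X(t) = sum_m a_m X(t - m) + e(t)."""
    Y = X[p:]
    Z = np.column_stack([X[p - m: len(X) - m] for m in range(1, p + 1)])
    a, *_ = np.linalg.lstsq(Z, Y, rcond=None)
    resid = Y - Z @ a
    return a, resid.var()

# Fit autoregressions of increasing order and inspect the residual variance
fits = {p: fit_ar(X, p) for p in (1, 2, 3)}
```

Since the order is chosen after inspecting the fits rather than fixed in advance, this is, as the text notes, a non-parametric procedure for linear prediction.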
For a process with periodic structure, the autoregressive coefficients can vary throughout
the period; however, the coefficients must vary in a periodic manner. The use of time-varying
If the structure of the process varies slowly with time, a_m(t − m) can be approximated by
a Fourier series consisting of only a few terms,

X(t) + Σ_{m=1}^{p} Σ_{n=0}^{K} X(t − m) [a_{mn} cos{2πn(t − m)/T} + b_{mn} sin{2πn(t − m)/T}] = e(t).   (2·4)
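A minimal sketch of the regression implied by (2·4): each lagged value, multiplied by sines and cosines of harmonics of the period, is treated as an ordinary regressor. The period, number of lags, number of harmonics, and the simulated periodic coefficient are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
T, p, K, N = 24, 2, 1, 4800
X = np.zeros(N)
for t in range(p, N):
    phi = 0.5 + 0.2 * np.cos(2 * np.pi * t / T)   # slowly varying periodic coefficient
    X[t] = phi * X[t - 1] + rng.standard_normal()

# Build the regressors X(t - m) cos{2*pi*n*(t - m)/T} and the sine analogues
Y = X[p:]
t_idx = np.arange(p, N)
cols = []
for m in range(1, p + 1):
    lag = X[t_idx - m]
    for n in range(K + 1):
        cols.append(lag * np.cos(2 * np.pi * n * (t_idx - m) / T))
        if n > 0:
            cols.append(lag * np.sin(2 * np.pi * n * (t_idx - m) / T))
Z = np.column_stack(cols)
# Fitted values correspond to -a_mn and -b_mn in the sign convention of (2.4)
coef, *_ = np.linalg.lstsq(Z, Y, rcond=None)
```

The first column (m = 1, n = 0) carries the constant part of the lag-one coefficient, so its estimate should be near 0.5 here; the harmonic columns absorb the periodic variation.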
3. A MULTIVARIATE EXAMPLE
Estimation of the autoregressive parameters has been carried out on bivariate temperature
data consisting of daily maximum and minimum temperatures at New York Central Park.
In order to remove the periodic mean so we could concentrate on the second-order properties,
the data were divided into 4-year segments consisting of 1460 time points starting 1 March
The matrix Γ in (3·6) was inverted by a stage-wise partitioning procedure, which required
that only the matrices Γ(0), Γ(1), ..., Γ(p − 1) and the inverse be stored. The inverse was
generated using the well-known partitioning formula for a symmetric matrix. At a typical
stage,

    ⎡ Γ(0)       Γ(1)       ...  Γ(r − 2) ⎤
    ⎢ Γ′(1)      Γ(0)       ...  Γ(r − 3) ⎥      (3·7)
    ⎣ Γ′(r − 2)  Γ′(r − 3)  ...  Γ(0)     ⎦
is available from the previous step.
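The partitioning formula itself can be sketched as follows. This hypothetical helper shows a single Schur-complement bordering step for a symmetric matrix, without the block-Toeplitz economies that let the paper store only Γ(0), ..., Γ(p − 1).

```python
import numpy as np

def bordered_inverse(A_inv, b, c):
    """Inverse of [[A, b], [b.T, c]] given A_inv, via the Schur complement."""
    s = c - b.T @ A_inv @ b                   # Schur complement (symmetric)
    s_inv = np.linalg.inv(s)
    Ab = A_inv @ b
    top_left = A_inv + Ab @ s_inv @ Ab.T
    top_right = -Ab @ s_inv
    return np.block([[top_left, top_right], [top_right.T, s_inv]])

# Check against a direct inverse on a random symmetric positive definite matrix
rng = np.random.default_rng(3)
M = rng.standard_normal((6, 6))
G = M @ M.T + 6 * np.eye(6)
A_inv = np.linalg.inv(G[:4, :4])              # inverse from the "previous stage"
full_inv = bordered_inverse(A_inv, G[:4, 4:], G[4:, 4:])
```

Each stage reuses the previous inverse, so the whole sequence of inverses is obtained without ever inverting the full matrix from scratch.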
The estimated regression coefficients were obtained from the normal equations (3·8).
The Fourier coefficients of the residual covariance matrix as a function of time within the
period were estimated,

    V̂_0 = (1/(N − p)) Σ_t e(t) e′(t),                      (3·9)
    V̂_n = (2/(N − p)) Σ_t e(t) e′(t) cos(2πnt/T),          (3·10)
    Ŵ_n = (2/(N − p)) Σ_t e(t) e′(t) sin(2πnt/T),

the sums being over the N − p time points at which residuals are available.
To obtain a weight function in time with small side lobes these coefficients were weighted
by ½{1 + cos(πn/K)},

    V_n = ½ V̂_n {1 + cos(πn/K)},                           (3·11)
    W_n = ½ Ŵ_n {1 + cos(πn/K)},

and the weights calculated from

    V(t) = V̂_0 + Σ_{n=1}^{K} [V_n cos(2πnt/T) + W_n sin(2πnt/T)].
The regression was then repeated using the estimated error variance for weighted least
squares.
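The variance-weighting step can be sketched as follows: the Fourier coefficients of the squared residuals over the period are tapered to reduce side lobes, and the resulting smooth periodic variance supplies weights for a second, weighted least-squares pass. The raised-cosine taper form ½{1 + cos(πn/K)} is an assumption consistent with the "small side lobes" remark, and the simulated residuals are illustrative.

```python
import numpy as np

def periodic_variance(resid_sq, t, T, K):
    """Tapered Fourier-series estimate of a T-periodic residual variance."""
    N = len(resid_sq)
    v = np.full(N, resid_sq.mean())
    for n in range(1, K + 1):
        w = 0.5 * (1 + np.cos(np.pi * n / K))   # hypothetical raised-cosine taper
        c = np.cos(2 * np.pi * n * t / T)
        s = np.sin(2 * np.pi * n * t / T)
        v += w * (2 / N) * ((resid_sq @ c) * c + (resid_sq @ s) * s)
    return v

rng = np.random.default_rng(4)
T, N = 50, 50_000
t = np.arange(N)
true_v = 2.0 + np.cos(2 * np.pi * t / T)        # periodic error variance
resid = np.sqrt(true_v) * rng.standard_normal(N)
v_hat = periodic_variance(resid**2, t, T, K=3)
# weights for the second, weighted least-squares pass would be 1 / v_hat
```

The taper biases the harmonic amplitudes downward slightly, trading a little fidelity for a smoother, better-behaved weight function in time.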
4. RESULTS
From past analyses of the data, it was decided that three lags add significantly to a
prediction. For the first trial in this study, U(t) consisted of ten components involving two
harmonics. The use of three lags gave thirty predictors. An approximate t-statistic was
calculated for each estimated regression coefficient by dividing it by its estimated standard
deviation. After examining the results, it was decided that for lags two and three the second
harmonic did not contribute significantly to the reduction of the residual sum of squares;
the data were reanalysed using only one harmonic frequency for these lags. This gave
twenty-two predictors.
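The approximate t-statistics described above can be sketched as follows; the predictors and true coefficients are invented for illustration, with one genuinely zero coefficient playing the role of a non-contributing harmonic.

```python
import numpy as np

rng = np.random.default_rng(5)
N, k = 400, 3
Z = rng.standard_normal((N, k))               # illustrative predictors
beta = np.array([1.0, 0.0, -0.5])             # the middle predictor is irrelevant
Y = Z @ beta + rng.standard_normal(N)

coef, *_ = np.linalg.lstsq(Z, Y, rcond=None)
resid = Y - Z @ coef
sigma2 = resid @ resid / (N - k)              # residual variance estimate
se = np.sqrt(sigma2 * np.diag(np.linalg.inv(Z.T @ Z)))
t_stats = coef / se                           # coefficient / estimated std. dev.
```

Predictors whose true coefficient is zero give |t| values of order one, which is the basis for dropping the second harmonic at lags two and three.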
By assuming periodically varying autoregressive coefficients, there was a significant
reduction in the residual sum of squares for the minimum temperature, compared to fitting
constant autoregressive coefficients. The weighted least-squares analysis changed the
estimated regression coefficients slightly and decreased the error of estimation. This, of
course, would give the minimum variance linear unbiased estimate of the regression coeffi-
cients if the weights had been based on the true rather than the estimated error variance.
Figure 1 shows the estimated error variance for the maximum and minimum temperatures
as a function of time.
[Fig. 1: estimated error variance plotted against time; curve labelled 'Maximum'; vertical scale 10 to 60.]
From the regression coefficients, B(1), B(2) and B(3), it is possible to calculate the auto-
regressive coefficients, a_1(t), a_2(t) and a_3(t), in the equation

    X(t) + a_1(t − 1) X(t − 1) + a_2(t − 2) X(t − 2) + a_3(t − 3) X(t − 3) = e(t),

which are 2 × 2 matrices and periodic functions of time. This, in turn, defines a time-varying
spectrum from the usual formula for stationary processes

    s(f, t) = [A(f, t)]⁻¹ V(t) [A*(f, t)]⁻¹,

where * denotes the conjugate transposed matrix and

    A(f, t) = Σ_{m=0}^{3} a_m(t − m) e^{−2πifm},
a_0(t) being defined as the 2 × 2 identity matrix. A discussion of instantaneous spectra for
non-stationary processes is given by Priestley (1965).
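The time-varying spectrum formula can be evaluated directly at each time point; this sketch fixes one time point, with an illustrative 2 × 2 AR(1)-type coefficient matrix and residual covariance rather than estimates from the temperature data.

```python
import numpy as np

def tv_spectrum(a, V, f):
    """s(f, t) = A^{-1} V (A*)^{-1}, with A(f, t) = sum_m a[m] e^{-2*pi*i*f*m}."""
    p = len(a) - 1                            # a[0] is the identity matrix
    A = sum(a[m] * np.exp(-2j * np.pi * f * m) for m in range(p + 1))
    A_inv = np.linalg.inv(A)
    return A_inv @ V @ A_inv.conj().T

# Illustrative coefficient matrices at one time point, and residual covariance
a = [np.eye(2), np.array([[-0.5, 0.1], [0.0, -0.4]])]
V = np.array([[1.0, 0.3], [0.3, 1.0]])
s = tv_spectrum(a, V, f=0.1)
# s is Hermitian with real, positive diagonal (the two instantaneous spectra)
```

Sweeping f over (−½, ½) and t over one period traces out the full instantaneous spectral surface for each component.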
R. H. JONES AND W. M. BRELSFORD
5. CONCLUSION
Linear prediction of time series with periodic structure can be achieved by regressing
the present on the past of the time series and on the past of time series constructed by
multiplying the data by sines and cosines of harmonic frequencies of the period. This reduces
the problem to the prediction of a subset of the components of a multivariate stationary
process using the past of all components as predictors. In the example presented here it was
assumed that the structure varied slowly with respect to the spacing of observations so that
the periodic structure could be approximated using Fourier series of only a few terms.
However, in some cases this assumption is not necessary. Often in meteorology, observations
are taken twice a day and there is a daily variation in structure. Considering only this daily
Support by the U.S. Air Force Office of Scientific Research is gratefully acknowledged.
REFERENCES
GLADYSHEV, E. G. (1961). Periodically correlated random sequences. Soviet Math. 2, 385-8.
GLADYSHEV, E. G. (1963). Periodically and almost periodically correlated random processes with
continuous time parameter. Theory Prob. Appl. 8, 173-7.
HARRISON, P. J. (1965). Short-term sales forecasting. Appl. Statist. 14, 102-39.
HERBST, L. J. (1963). Periodogram analysis and variance fluctuations. J. R. Statist. Soc. B 25, 422-50.
JONES, R. H. (1964). Spectral analysis and linear prediction of meteorological time series. J. Appl.
Meteor. 3, 45-52.
LOEVE, M. (1963). Probability Theory (third edition). Princeton: D. Van Nostrand.
MONIN, A. S. (1963). Stationary and periodic time series in the general circulation of the atmosphere.
Proceedings of the Symposium on Time Series Analysis, pp. 144-51, edited by M. Rosenblatt. New
York: Wiley.
PRIESTLEY, M. B. (1965). Evolutionary spectra and non-stationary processes. J. R. Statist. Soc. B 27,
204-37.
WHITTLE, P. (1963). On the fitting of multivariate autoregressions, and the approximate canonical
factorization of a spectral density matrix. Biometrika 50, 129-34.
WHITTLE, P. (1965). Recursive relations for predictors of non-stationary processes. J. R. Statist. Soc.
B 27, 523-32.