Spectral Analysis Principles and Tools

In this chapter

This chapter provides guidance on spectral analysis and explains the principles behind the various spectral analysis tools available in JDemetra+, both in the Graphical User Interface and in the R packages.

  • explanation of the spectral analysis concepts and of the spectral graphs (their description in the GUI can be found here);

  • description of the tests used to identify spectral peaks and of their outputs.

Spectral analysis concepts

A time series $x_t$ with stationary covariance, mean $\mu$ and $k$-th autocovariance $E\left((x_t-\mu)(x_{t-k}-\mu)\right)=\gamma(k)$ can be described as a weighted sum of periodic trigonometric functions, $\sin(\omega t)$ and $\cos(\omega t)$, where $\omega=\frac{2\pi}{T}$ denotes the frequency. Spectral analysis investigates this frequency domain representation of $x_t$ to determine how important cycles of different frequencies are in accounting for the behaviour of $x_t$.

Assuming that the autocovariances $\gamma(k)$ are absolutely summable ($\sum_{k=-\infty}^{\infty}|\gamma(k)|<\infty$), the autocovariance generating function, which summarises these autocovariances through a scalar-valued function, is given by Equation 1 (HAMILTON, J.D. (1994)).

$$acgf(z)=\sum_{k=-\infty}^{\infty}z^{k}\gamma(k) \tag{1}$$

where $z$ denotes a complex scalar.

Once Equation 1 is divided by $\pi$ and evaluated at $z=e^{-i\omega}=\cos\omega-i\sin\omega$, where $i=\sqrt{-1}$ and $\omega$ is a real scalar, $-\infty<\omega<\infty$, the result of this transformation is called the population spectrum $f(\omega)$ of $x_t$, given in Equation 2 (HAMILTON, J.D. (1994)).

$$f(\omega)=\frac{1}{\pi}\sum_{k=-\infty}^{\infty}e^{-i\omega k}\gamma(k) \tag{2}$$

Therefore, the analysis of the population spectrum in the frequency domain is equivalent to the examination of the autocovariance function in the time domain; it simply provides an alternative way of inspecting the process. Because $f(\omega)$ is interpreted as the contribution to the variance of the components with frequencies in the range $(\omega,\omega+d\omega)$, a peak in the spectrum indicates an important contribution to the variance from the frequencies near the value that corresponds to this peak.

As $e^{-i\omega}=\cos\omega-i\sin\omega$, the spectrum can also be expressed as in Equation 3.

$$f(\omega)=\frac{1}{\pi}\sum_{k=-\infty}^{\infty}\left(\cos\omega k-i\sin\omega k\right)\gamma(k) \tag{3}$$

Equation 3 can be presented as:

$$f(\omega)=\frac{1}{\pi}\left[\gamma(0)+2\sum_{k=1}^{\infty}\gamma(k)\cos\omega k\right] \tag{4}$$

This implies that if the autocovariances are absolutely summable, the population spectrum exists and is a continuous, real-valued function of $\omega$. Due to the properties of the trigonometric functions ($\cos(-\omega k)=\cos(\omega k)$ and $\cos\left((\omega+2\pi j)k\right)=\cos(\omega k)$), the spectrum is a periodic, even function of $\omega$, symmetric around $\omega=0$. Therefore, the analysis of the spectrum can be reduced to the interval $(-\pi,\pi)$. The spectrum is non-negative for all $\omega\in(-\pi,\pi)$.

The shortest cycle that can be distinguished in a time series lasts two periods. The frequency which corresponds to this cycle is $\omega=\pi$ and is called the Nyquist frequency. The frequency of the longest cycle that can be observed in a time series with $n$ observations is $\omega=\frac{2\pi}{n}$ and is called the fundamental (Fourier) frequency.

Note that if $x_t$ is a white noise process with zero mean and variance $\sigma^2$, then $\gamma(k)=0$ for all $|k|>0$ and the spectrum of $x_t$ is constant ($f(\omega)=\frac{\sigma^2}{\pi}$), since each frequency in the spectrum contributes equally to the variance of the process (BROCKWELL, P.J., and DAVIS, R.A. (2002)).
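
As a simple illustration of Equation 4 (and of the flat white noise spectrum), the following base-R sketch evaluates the spectrum of a white noise process and of an MA(1) process from their autocovariances; the values of sigma2 and theta are arbitrary illustrative choices.

```r
# Illustrative sketch: evaluating Equation 4 (1/pi normalisation) from known autocovariances.
omega <- seq(0, pi, length.out = 201)

# White noise: gamma(0) = sigma2, gamma(k) = 0 for k > 0  ->  flat spectrum sigma2 / pi.
sigma2 <- 1
f_wn <- rep(sigma2 / pi, length(omega))

# MA(1), x_t = e_t + theta * e_{t-1}: gamma(0) = (1 + theta^2) * sigma2,
# gamma(1) = theta * sigma2, gamma(k) = 0 for k > 1.
theta <- -0.8
f_ma1 <- (1 / pi) * ((1 + theta^2) * sigma2 + 2 * theta * sigma2 * cos(omega))

plot(omega, f_ma1, type = "l", xlab = "frequency (radians)", ylab = "spectrum")
lines(omega, f_wn, lty = 2)   # white noise spectrum is constant across frequencies
```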

The aim of spectral analysis is to determine how important cycles of different frequencies are in accounting for the behaviour of a time series. Since spectral analysis can be used to detect the presence of periodic components, it is a natural diagnostic tool for detecting trading day effects as well as seasonal effects.

Among the tools used for spectral analysis are the autoregressive spectrum and the periodogram.

The explanations given in the subsections of this node derive mainly from DE ANTONIO, D., and PALATE, J. (2015) and BROCKWELL, P.J., and DAVIS, R.A. (2006).

Spectral density of an ARIMA model

Estimation

Method 1: Periodogram

For any given frequency $\omega$, the sample periodogram is the sample analogue of the population spectrum. In general, the periodogram is used to identify periodic components of unknown frequency in a time series. X-13ARIMA-SEATS and TRAMO-SEATS use this tool to detect seasonality in both the raw and the seasonally adjusted series. It is also applied to check the randomness of the residuals from the ARIMA model.

To define a periodogram, first consider a vector of complex numbers

$$x=\begin{bmatrix}x_1\\x_2\\\vdots\\x_n\end{bmatrix}\in\mathbb{C}^n \tag{5}$$

where $\mathbb{C}^n$ is the set of all column vectors with complex-valued components.

The Fourier frequencies associated with the sample size $n$ are defined as the set of values $\omega_j=\frac{2\pi j}{n}$, $j=-\left[\frac{n-1}{2}\right],\dots,\left[\frac{n}{2}\right]$, $-\pi<\omega_j\leq\pi$, $j\in F_n$, where $[n]$ denotes the largest integer less than or equal to $n$. The Fourier frequencies, which are called harmonics, are given by integer multiples of the fundamental frequency $\frac{2\pi}{n}$.

Now the $n$ vectors $e_j=n^{-\frac{1}{2}}\left(e^{i\omega_j},e^{i2\omega_j},\dots,e^{in\omega_j}\right)$ can be defined. The vectors $e_1,\dots,e_n$ are orthonormal in the sense that:

$$e_j^{*}e_k=n^{-1}\sum_{r=1}^{n}e^{ir(\omega_j-\omega_k)}=\begin{cases}1, & \text{if } j=k\\0, & \text{if } j\neq k\end{cases} \tag{6}$$

where $e_j^{*}$ denotes the row vector whose $k$-th component is the complex conjugate of the $k$-th component of $e_j$. These vectors form a basis of $\mathbb{C}^n$, so that any $x\in\mathbb{C}^n$ can be expressed as a sum of $n$ components:

$$x=\sum_{j=-\left[\frac{n-1}{2}\right]}^{\left[\frac{n}{2}\right]}a_je_j \tag{7}$$

where the coefficients $a_j=e_j^{*}x=n^{-\frac{1}{2}}\sum_{t=1}^{n}x_te^{-it\omega_j}$ are derived from Equation 7 by multiplying the equation on the left by $e_j^{*}$ and using Equation 6.

The sequence $\{a_j,\ j\in F_n\}$ is referred to as the discrete Fourier transform of $x\in\mathbb{C}^n$, and the periodogram $I(\omega_j)$ of $x$ at the Fourier frequency $\omega_j=\frac{2\pi j}{n}$ is defined as the squared modulus of the Fourier coefficient $a_j$:

$$I(\omega_j)=\left|a_j\right|^2=n^{-1}\left|\sum_{t=1}^{n}x_te^{-it\omega_j}\right|^2 \tag{8}$$

From Equation 6 and Equation 7 it can be shown that the periodogram decomposes the total sum of squares $\sum_{t=1}^{n}|x_t|^2$ into a sum of components associated with the Fourier frequencies $\omega_j$:

$$\sum_{t=1}^{n}|x_t|^2=\sum_{j=-\left[\frac{n-1}{2}\right]}^{\left[\frac{n}{2}\right]}|a_j|^2=\sum_{j=-\left[\frac{n-1}{2}\right]}^{\left[\frac{n}{2}\right]}I(\omega_j) \tag{9}$$
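
As a hedged, base-R sketch of Equations 8 and 9, the following code computes the coefficients $a_j$ with the fast Fourier transform and checks that the periodogram ordinates add up to the total sum of squares (the simulated series is an arbitrary example).

```r
set.seed(1)
x <- rnorm(120)                 # any series of length n
n <- length(x)

# fft(x)[j + 1] equals sum_t x_t * exp(-i * t * omega_j) up to a phase factor
# that does not affect the modulus, so |fft(x)|^2 / n gives the periodogram.
a <- fft(x) / sqrt(n)           # coefficients a_j of Equation 7
I <- Mod(a)^2                   # periodogram ordinates I(omega_j), Equation 8

all.equal(sum(x^2), sum(I))     # Equation 9: TRUE up to numerical precision
```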

If $x\in\mathbb{R}^n$, $\omega_j$ and $-\omega_j$ are both in $[-\pi,\pi]$, and $a_j$ is presented in its polar form (i.e. $a_j=r_j\exp(i\theta_j)$, where $r_j$ is the modulus of $a_j$), then Equation 7 can be rewritten in the form:

$$x=a_0e_0+\sum_{j=1}^{\left[\frac{n-1}{2}\right]}2^{1/2}r_j\left(c_j\cos\theta_j-s_j\sin\theta_j\right)+a_{n/2}e_{n/2} \tag{10}$$

The orthonormal basis for $\mathbb{R}^n$ is $\left\{e_0,c_1,s_1,\dots,c_{\left[\frac{n-1}{2}\right]},s_{\left[\frac{n-1}{2}\right]},e_{n/2}\ \text{(excluded if } n \text{ is odd)}\right\}$, where:

$e_0$ is a vector composed of $n$ elements equal to $n^{-1/2}$, which implies that $a_0e_0=\left(n^{-1}\sum_{t=1}^{n}x_t,\dots,n^{-1}\sum_{t=1}^{n}x_t\right)$;

$c_j=\left(\frac{n}{2}\right)^{-1/2}\left(\cos\omega_j,\cos2\omega_j,\dots,\cos n\omega_j\right)$, for $1\leq j\leq\left[\frac{n-1}{2}\right]$;

$s_j=\left(\frac{n}{2}\right)^{-1/2}\left(\sin\omega_j,\sin2\omega_j,\dots,\sin n\omega_j\right)$, for $1\leq j\leq\left[\frac{n-1}{2}\right]$;

$e_{n/2}=\left(-n^{-1/2},n^{-1/2},\dots,-n^{-1/2},n^{-1/2}\right)$ (only when $n$ is even).

Equation 10 can be seen as an OLS regression of $x_t$ on a constant and the trigonometric terms. As the vector of explanatory variables includes $n$ elements, the number of explanatory variables in Equation 10 is equal to the number of observations. HAMILTON, J.D. (1994) shows that the explanatory variables are linearly independent, which implies that the OLS regression yields a perfect fit (i.e. without an error term). The coefficients have the form of a simple OLS projection of the data on the orthonormal basis:

$$\hat{a}_0=\frac{1}{\sqrt{n}}\sum_{t=1}^{n}x_t \tag{11}$$

$$\hat{a}_{n/2}=\frac{1}{\sqrt{n}}\sum_{t=1}^{n}(-1)^{t}x_t\quad\text{(only when } n \text{ is even)} \tag{12}$$

$$\hat{\alpha}_j=2^{1/2}r_j\cos\theta_j=\left(\frac{n}{2}\right)^{-1/2}\sum_{t=1}^{n}x_t\cos\left(t\frac{2\pi j}{n}\right),\qquad j=1,\dots,\left[\frac{n-1}{2}\right] \tag{14}$$

$$\hat{\beta}_j=-2^{1/2}r_j\sin\theta_j=\left(\frac{n}{2}\right)^{-1/2}\sum_{t=1}^{n}x_t\sin\left(t\frac{2\pi j}{n}\right),\qquad j=1,\dots,\left[\frac{n-1}{2}\right] \tag{15}$$
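
The "perfect fit" property can be checked directly: the base-R sketch below (on an arbitrary simulated series) builds the orthonormal trigonometric basis explicitly, computes the OLS coefficients as simple projections and verifies that they reproduce the series exactly.

```r
set.seed(1)
x <- rnorm(12); n <- length(x)
J <- (n - 1) %/% 2

# Columns: e_0, the c_j's, the s_j's and (n even) e_{n/2} -- an orthonormal basis of R^n.
M <- cbind(rep(1 / sqrt(n), n),
           sapply(1:J, function(j) sqrt(2 / n) * cos((1:n) * 2 * pi * j / n)),
           sapply(1:J, function(j) sqrt(2 / n) * sin((1:n) * 2 * pi * j / n)),
           if (n %% 2 == 0) (-1)^(1:n) / sqrt(n))

coef <- drop(crossprod(M, x))   # OLS on an orthonormal basis reduces to beta = M'x
all.equal(drop(M %*% coef), x)  # perfect fit: the regression has no error term
```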

With Equation 9, the total sum of squares $\sum_{t=1}^{n}|x_t|^2$ can be decomposed into $2\times\left[\frac{n-1}{2}\right]$ components corresponding to $c_j$ and $s_j$, which are grouped to produce the "frequency $\omega_j$" component for $1\leq j\leq\left[\frac{n-1}{2}\right]$. As shown in the table below, the value of the periodogram at the frequency $\omega_j$ is the contribution of the $j$-th harmonic to the total sum of squares $\sum_{t=1}^{n}|x_t|^2$.

Decomposition of sum of squares into components corresponding to the harmonics

| Frequency | Degrees of freedom | Sum of squares decomposition |
|-----------|--------------------|------------------------------|
| $\omega_0$ (mean) | 1 | $a_0^2=n^{-1}\left(\sum_{t=1}^{n}x_t\right)^2=I(0)$ |
| $\omega_1$ | 2 | $2r_1^2=2\lvert a_1\rvert^2=2I(\omega_1)$ |
| $\omega_k$ | 2 | $2r_k^2=2\lvert a_k\rvert^2=2I(\omega_k)$ |
| $\omega_{n/2}=\pi$ (excluded if $n$ is odd) | 1 | $a_{n/2}^2=I(\pi)$ |
| Total | $n$ | $\sum_{t=1}^{n}x_t^2$ |

Source: DE ANTONIO, D., and PALATE, J. (2015).

Obviously, if the series were random, each component $I(\omega_j)$ would have the same expectation. On the contrary, when the series contains a systematic sine component with frequency $\omega$ and amplitude $A$, the sum of squares $I(\omega_j)$ increases with $A$. In practice, it is unlikely that the frequency of an unknown systematic sine component would exactly match any of the frequencies for which the periodogram has been calculated. Therefore, the periodogram shows an increase in intensity in the immediate vicinity of that frequency (BOX, G.E.P., JENKINS, G.M., and REINSEL, G.C. (2007)).

Note that in JDemetra+ the periodogram object corresponds exactly to the contribution to the sum of squares of the standardised data, since the series are divided by their standard deviation for computational reasons.

Using the decomposition presented in the table above, the periodogram can be expressed as:

$$I(\omega_j)=r_j^2=\frac{1}{2}\left(\alpha_j^2+\beta_j^2\right)=\frac{1}{n}\left(\sum_{t=1}^{n}x_t\cos\left(t\frac{2\pi j}{n}\right)\right)^2+\frac{1}{n}\left(\sum_{t=1}^{n}x_t\sin\left(t\frac{2\pi j}{n}\right)\right)^2 \tag{16}$$

where $j=0,\dots,\left[\frac{n}{2}\right]$.
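
A quick numerical check of Equation 16 (base R, arbitrary simulated data): the periodogram ordinate obtained from the cosine and sine sums coincides with the $|a_j|^2$ definition computed via the FFT.

```r
set.seed(1)
x <- rnorm(120); n <- length(x)
j <- 10                                    # any harmonic with 1 <= j <= [n/2]
omega_j <- 2 * pi * j / n

I_trig <- (sum(x * cos((1:n) * omega_j)))^2 / n +
          (sum(x * sin((1:n) * omega_j)))^2 / n   # right-hand side of Equation 16
I_fft  <- Mod(fft(x)[j + 1])^2 / n                # definition of Equation 8

all.equal(I_trig, I_fft)                   # TRUE
```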

Since $x$ is generated by an orthonormal basis, and $\bar{x}=a_0e_0$, Equation 10 can be rearranged to show that the sum of squares about the mean is equal to the sum of the squared coefficients:

$$x-a_0e_0=\sum_{j=1}^{\left[(n-1)/2\right]}\left(\alpha_jc_j+\beta_js_j\right)+a_{n/2}e_{n/2} \tag{17}$$

Thus the sample variance of $x_t$ can be expressed as:

$$n^{-1}\sum_{t=1}^{n}\left(x_t-\bar{x}\right)^2=n^{-1}\left(\sum_{j=1}^{\left[(n-1)/2\right]}2r_j^2+a_{n/2}^2\right) \tag{18}$$

where $a_{n/2}^2$ is excluded if $n$ is odd.

The term $2r_j^2$ in Equation 18 is thus the contribution of the $j$-th harmonic to the variance, and Equation 18 shows how the total variance is partitioned.

The periodogram ordinate $I(\omega_j)$ and the autocovariance coefficient $\gamma(k)$ are both quadratic forms of $x_t$. It can be shown that the two are related and that the periodogram can be written in terms of the sample autocovariance function for any non-zero Fourier frequency $\omega_j$ (the proof is given in BROCKWELL, P.J., and DAVIS, R.A. (2006)):

$$I(\omega_j)=\sum_{|k|<n}\hat{\gamma}(k)e^{-ik\omega_j}=\hat{\gamma}(0)+2\sum_{k=1}^{n-1}\hat{\gamma}(k)\cos(k\omega_j) \tag{19}$$

and, for the zero frequency, $I(0)=n|\bar{x}|^2$.
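
Equation 19 can also be verified numerically; the base-R sketch below compares the FFT-based periodogram with the autocovariance-based expression at a non-zero Fourier frequency (the simulated series is an arbitrary example).

```r
set.seed(1)
x <- rnorm(120); n <- length(x)
j <- 10; omega_j <- 2 * pi * j / n

# Sample autocovariances gamma_hat(0), ..., gamma_hat(n-1): divisor n, mean removed.
g <- drop(acf(x, lag.max = n - 1, type = "covariance", plot = FALSE)$acf)

I_acov <- g[1] + 2 * sum(g[-1] * cos((1:(n - 1)) * omega_j))   # Equation 19
I_fft  <- Mod(fft(x)[j + 1])^2 / n   # Equation 8; the mean drops out at omega_j != 0

all.equal(I_acov, I_fft)             # TRUE
```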

Comparing Equation 19 with the expression of the spectral density of a stationary process:

$$f(\omega)=\frac{1}{2\pi}\sum_{|k|<\infty}\gamma(k)e^{-ik\omega}=\frac{1}{2\pi}\left[\gamma(0)+2\sum_{k=1}^{\infty}\gamma(k)\cos(k\omega)\right] \tag{20}$$

it can be noticed that the periodogram is a sample analogue of the population spectrum. In fact, it can be shown that the periodogram is an asymptotically unbiased but inconsistent estimator of the population spectrum $f(\omega)$. Therefore, the periodogram is a wildly fluctuating estimate of the spectrum, with a high variance. However, a consistent estimator can be obtained by applying linear smoothing filters to the periodogram, the so-called lag-window estimators. The lag-window estimators implemented in JDemetra+ include the square, Welch, Tukey, Bartlett, Hanning and Parzen windows. They are described in DE ANTONIO, D., and PALATE, J. (2015). Alternatively, a model-based consistent estimation procedure, resulting in the autoregressive spectrum estimator, can be applied.

Method 2: Autoregressive spectrum estimation

BROCKWELL, P.J., and DAVIS, R.A. (2006) point out that for any real-valued stationary process $(x_t)$ with continuous spectral density $f(\omega)$, it is possible to find both an AR(p) and an MA(q) process whose spectral densities are arbitrarily close to $f(\omega)$. In this sense, $(x_t)$ can be approximated by either an AR(p) or an MA(q) process. This fact is the basis of one of the methods of achieving a consistent estimator of the spectrum, called autoregressive spectrum estimation. It is based on the approximation of the stochastic process $(x_t)$ by an autoregressive process of sufficiently high order $p$:

$$x_t=\mu+\left(\phi_1B+\dots+\phi_pB^p\right)x_t+\varepsilon_t \tag{21}$$

where $\varepsilon_t$ is a white-noise variable with mean zero and a constant variance.

The autoregressive spectrum estimator for the series $x_t$ is defined as follows (definition from the 'X-12-ARIMA Reference Manual' (2011)):

$$\hat{s}(\omega)=10\times\log_{10}\left(\frac{\sigma_x^2}{2\pi\left|1-\sum_{k=1}^{p}\hat{\phi}_ke^{-ik\omega}\right|^2}\right) \tag{22}$$

where:

  • $\omega$ – the frequency, $0\leq\omega\leq\pi$;

  • $\sigma_x^2$ – the innovation variance of the sample residuals;

  • $\hat{\phi}_k$ – the AR coefficient estimates of the linear regression of $x_t-\bar{x}$ on $x_{t-k}-\bar{x}$, $1\leq k\leq p$.

The autoregressive spectrum estimator is used in the visual spectral analysis tool for detecting significant peaks in the spectrum. The criterion of visual significance implemented in JDemetra+ is based on the range $\hat{s}_{max}-\hat{s}_{min}$ of the $\hat{s}(\omega)$ values, where $\hat{s}_{max}=\max_k\hat{s}(\omega_k)$, $\hat{s}_{min}=\min_k\hat{s}(\omega_k)$ and $\hat{s}(\omega_k)$ is the $k$-th value of the autoregressive spectrum estimator.

A particular value is considered to be visually significant if, at a trading day or at a seasonal frequency $\omega_k$ (other than the seasonal frequency $\omega_{60}=\pi$), $\hat{s}(\omega_k)$ is above the median of the plotted values of $\hat{s}(\omega_k)$ and is larger than both neighbouring values, $\hat{s}(\omega_{k-1})$ and $\hat{s}(\omega_{k+1})$, by at least $\frac{6}{52}$ times the range $\hat{s}_{max}-\hat{s}_{min}$.

Following the suggestion of SOUKUP, R.J., and FINDLEY, D.F. (1999), JDemetra+ uses an autoregressive model spectral estimator of order 30. This order yields a high resolution of strong components, meaning peaks that are sharply defined in the plot of $\hat{s}(\omega)$ with 61 frequencies. The minimum number of observations needed to compute the spectrum is set to $n=80$ for monthly data and to $n=60$ for quarterly series, while the maximum number of observations considered for the estimation is 121. Consequently, with these settings it is possible to identify up to 30 peaks in the plot of 61 frequencies. By choosing $\omega_k=\frac{\pi k}{60}$ for $k=0,1,\dots,60$, the density estimates are calculated at the exact seasonal frequencies (1, 2, 3, 4, 5 and 6 cycles per year).

The model order can also be selected with the AIC criterion (in practice the resulting order is much lower than 30). A lower order produces a smoother spectrum, but the contrast between the spectral amplitudes at the trading day frequencies and at the neighbouring frequencies is weaker, and therefore less suitable for automatic detection.
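
The sketch below is a hedged base-R illustration of the AR(30) spectrum of Equation 22, evaluated at the 61 frequencies $\omega_k=\pi k/60$. It uses the Yule-Walker fit provided by `ar()` as a convenient stand-in for the OLS regression on the mean-corrected series described above; the simulated AR(1) input is an arbitrary example.

```r
set.seed(1)
x <- as.numeric(arima.sim(list(ar = 0.5), n = 120))   # illustrative monthly-length series

fit   <- ar(x, aic = FALSE, order.max = 30)           # AR(30) with a fixed order
phi   <- fit$ar                                       # estimated AR coefficients
sig2  <- fit$var.pred                                 # innovation variance estimate
omega <- pi * (0:60) / 60                             # 61 frequencies, exact seasonal ones included

s_hat <- sapply(omega, function(w) {
  transfer <- 1 - sum(phi * exp(-1i * (1:30) * w))    # 1 - sum_k phi_k * e^{-ikw}
  10 * log10(sig2 / (2 * pi) / Mod(transfer)^2)       # decibel scale, as in Equation 22
})

plot(omega, s_hat, type = "l",
     xlab = "frequency (radians)", ylab = "AR(30) spectrum (10*log10 scale)")
```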

SOUKUP, R.J., and FINDLEY, D.F. (1999) also explain that the periodogram can be used in the visual significance test, as its ability to detect trading day effects is as good as that of the AR(30) spectrum; however, it also has a greater false alarm rate, defined as the fraction of the 50 replicates for which a visually significant spectral peak occurred at one of the trading day frequencies considered in the designated output spectra.

Method 3: Tukey spectrum

The Tukey spectrum belongs to the class of lag-window estimators. A lag-window estimator of the spectral density $f(\omega)=\frac{1}{2\pi}\sum_{|k|<\infty}\gamma(k)e^{-ik\omega}$ is defined as follows:

$$\hat{f}_L(\omega)=\frac{1}{2\pi}\sum_{|h|\leq r}w(h/r)\hat{\gamma}(h)e^{-ih\omega}$$

where $\hat{\gamma}(\cdot)$ is the sample autocovariance function, $w(\cdot)$ is the lag window and $r$ is the truncation lag. $|w(x)|$ is always less than or equal to one, $w(0)=1$ and $w(x)=0$ for $|x|>1$. The simple idea behind this formula is to down-weight the autocovariances at high lags, where $\hat{\gamma}(h)$ is more unreliable. This estimator requires choosing $r$ as a function of the sample size such that $r/n\rightarrow0$ and $r\rightarrow\infty$ as $n\rightarrow\infty$. These conditions guarantee that the estimator converges to the true density.

JDemetra+ implements the so-called Blackman-Tukey (or Tukey-Hanning) estimator, which is given by $w(h/r)=0.5\left(1+\cos(\pi h/r)\right)$ if $|h/r|\leq1$ and $0$ otherwise.
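
A minimal base-R sketch of this Blackman-Tukey estimator, with the truncation lag $r$ left as a user choice (the simulated input series is an arbitrary example):

```r
blackman_tukey <- function(x, omega, r) {
  g <- drop(acf(x, lag.max = r, type = "covariance", plot = FALSE)$acf)  # gamma_hat(0..r)
  w <- 0.5 * (1 + cos(pi * (0:r) / r))                                   # Tukey-Hanning lag window
  sapply(omega, function(om)
    (w[1] * g[1] + 2 * sum(w[-1] * g[-1] * cos((1:r) * om))) / (2 * pi))
}

set.seed(1)
x <- as.numeric(arima.sim(list(ar = 0.5), n = 200))
omega <- pi * (0:60) / 60
plot(omega, blackman_tukey(x, omega, r = 44), type = "l",
     xlab = "frequency (radians)", ylab = "Blackman-Tukey spectral estimate")
```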

The choice of large truncation lags r decreases the bias, of course, but it also increases the variance of the spectral estimate and decreases the bandwidth.

JDemetra+ allows the user to modify all the parameters of this estimator, including the window function.

Identification of spectral peaks

The sections below describe the tests; their practical implementation in the Graphical User Interface can be found here.

In order to decide whether a series has a seasonal component that is predictable (stable) enough, these tests use visual criteria and formal tests for the periodogram. The periodogram is calculated using complete years, so that the set of Fourier frequencies contains exactly all seasonal frequencies.

The tests rely on two basic principles:

  • The peaks associated with the seasonal frequencies should be larger than the median of the spectrum over all frequencies; and

  • The peaks should exceed the spectrum values at the two adjacent frequencies by more than a critical value.

JDemetra+ performs this test on the original series. If these two requirements are met, the test results are displayed in green. The statistical significance of each of the seasonal peaks (i.e. the frequencies $\frac{\pi}{6}$, $\frac{\pi}{3}$, $\frac{\pi}{2}$, $\frac{2\pi}{3}$ and $\frac{5\pi}{6}$, corresponding to 1, 2, 3, 4 and 5 cycles per year) is also displayed. The seasonal and trading day frequencies depend on the frequency of the time series. They are shown in the table below. The symbol $d$ denotes a default frequency and is described below the table.

The seasonal and trading day frequencies by time series frequency

| Number of months per period (year) | Seasonal frequencies | Trading day frequencies (radians) |
|------------------------------------|----------------------|-----------------------------------|
| 12 | $\frac{\pi}{6}, \frac{\pi}{3}, \frac{\pi}{2}, \frac{2\pi}{3}, \frac{5\pi}{6}, \pi$ | $d$, 2.714 |
| 4 | $\frac{\pi}{2}, \pi$ | $d$, 1.292, 1.850, 2.128 |
| 3 | $\pi$ | $d$ |
| 2 | $\pi$ | $d$ |

The calendar (trading day or working day) effects, related to the variation in the number of different days of the week per period, can induce periodic patterns in the data that can be similar to those resulting from pure seasonal effects. From the theoretical point of view, trading day variability is mainly due to the fact that the average number of days in the months or quarters is not equal to a multiple of 7 (the average number of days in a month in a year of 365.25 days is equal to $\frac{365.25}{12}=30.4375$ days). This effect occurs $\frac{365.25}{12}\times\frac{1}{7}=4.3482$ times per month: once for each of the four complete weeks of each month, plus a residual of 0.3482 cycles per month, i.e. $0.3482\times2\pi=2.1878$ radians. This turns out to be a fundamental frequency for the effects associated with monthly data. In JDemetra+ the fundamental frequency corresponding to 0.3482 cycles per month is used in place of the closest frequency $\frac{\pi k}{60}$. Thus, the quantity $\frac{\pi\times42}{60}$ is replaced by $\omega_{42}=0.3482\times2\pi=2.1878$. The frequencies neighbouring $\omega_{42}$, i.e. $\omega_{41}$ and $\omega_{43}$, are set to $2.1865-\frac{1}{60}$ and $2.1865+\frac{1}{60}$, respectively.

The default frequencies ($d$) for the calendar effect are 2.188 (monthly series) and 0.280 (quarterly series). They are computed as:

$$\omega_{ce}=\frac{2\pi}{7}\left(n-7\times\left[\frac{n}{7}\right]\right) \tag{23}$$

where $n=\frac{365.25}{s}$, with $s=4$ for quarterly series and $s=12$ for monthly series.
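
A quick base-R check of Equation 23, reproducing the default calendar frequencies quoted above:

```r
omega_ce <- function(s) {
  n <- 365.25 / s                        # average number of days per period
  (2 * pi / 7) * (n - 7 * floor(n / 7))  # Equation 23
}
omega_ce(12)   # approx. 2.188 (monthly)
omega_ce(4)    # approx. 0.280 (quarterly)
```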

Other frequencies that correspond to trading day frequencies are: 2.714 (monthly series) and 1.292, 1.850, 2.128 (quarterly series).

In particular, the calendar frequency in monthly data (marked in red in the figure below) is very close to the seasonal frequency corresponding to 4 cycles per year, $\omega_{40}=\frac{2}{3}\pi=2.0944$.

Periodogram with seasonal (grey) and calendar (red) frequencies highlighted

This implies that it may be hard to disentangle both effects using the frequency domain techniques.

In a Tukey spectrum

The current JDemetra+ implementation of the seasonality test is based on an $F(d_1,d_2)$ approximation originally proposed by Maravall (2012) for TRAMO-SEATS. This test has been designed for a Blackman-Tukey window with particular choices of the truncation lag $r$ and of the sample size. Following this approach, a peak at frequency $\omega_j$ is considered visually significant when

$$2f_x(\omega_j)-\left[f_x(\omega_{j+1})+f_x(\omega_{j-1})\right]\geq CV(\omega_j)$$

where $CV(\omega_j)$ is the critical value of an $F(d_1,d_2)$ distribution, with degrees of freedom determined by simulation. For $\omega_j=\pi$, there is a significant peak when $f_x(\omega_{[n/2]})-f_x(\omega_{[(n-1)/2]})\geq CV(\omega_j)$.

Two significance levels are considered for this test: $\alpha=0.05$ (code "t") and $\alpha=0.01$ (code "T").

As opposed to the AR spectrum test, which is computed on the basis of the last 120 data points, all available observations are used here. The critical values have been calculated for the recommended truncation lag $r=79$ for sample sizes within the interval $[80,119]$ and for $r=112$ for $n\in[120,300]$. The F approximation is less accurate for sample sizes larger than 300. For quarterly data, $r=44$, but there are no recommendations regarding the required sample size.
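
The decision rule itself is easy to express in code. The base-R sketch below is only schematic: the critical values come from the simulated $F(d_1,d_2)$ approximation discussed above and are therefore passed in by the user rather than reproduced here.

```r
# f:  vector of Tukey spectrum values at the ordered frequencies omega_1, ..., omega_m
# j:  index of the frequency being tested
# cv: critical value CV(omega_j), taken from the simulated F(d1, d2) approximation
tukey_peak_significant <- function(f, j, cv) {
  m <- length(f)
  if (j < m) 2 * f[j] - (f[j - 1] + f[j + 1]) >= cv   # interior frequency
  else       f[m] - f[m - 1] >= cv                    # omega_j = pi (last ordinate)
}
```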

Practical implementation in GUI is detailed here

In AR Spectrum definition

The estimator of the spectral density at frequency $\lambda\in[0,\pi]$ is obtained under the assumption that the series follows an AR(p) process with a large $p$. The spectral density of such a model, with an innovation variance $\sigma_x^2$, is expressed as follows:

$$10\times\log_{10}f_x(\lambda)=10\times\log_{10}\frac{\sigma_x^2}{2\pi\left|\phi\left(e^{-i\lambda}\right)\right|^2}=10\times\log_{10}\frac{\sigma_x^2}{2\pi\left|1-\sum_{k=1}^{p}\phi_ke^{-ik\lambda}\right|^2}$$

where:

  • $\phi_k$ denotes the $k$-th AR coefficient;

  • $e^{-ik\lambda}=\cos(k\lambda)-i\sin(k\lambda)$.

Soukup and Findley (1999) suggest the use of $p=30$, which in practice is much larger than the order that would result from the AIC criterion. The minimum number of observations needed to compute the spectrum is set to $n=80$ for monthly data (or $n=60$ for quarterly series). In turn, the maximum number of observations considered for the estimation is $n=121$. This choice offers enough resolution, making it possible to identify a maximum of 30 peaks in a plot of 61 frequencies: by choosing $\lambda_j=\pi j/60$, for $j=0,1,\dots,60$, the density estimates are calculated at the exact seasonal frequencies (1, 2, 3, 4, 5 and 6 cycles per year). Note that $x$ cycles per year can be converted into cycles per month by simply dividing by twelve, $x/12$, and into radians by applying the transformation $2\pi(x/12)$.

The traditional trading day frequency corresponding to 0.348 cycles per month is used in place of the closest frequency $\pi j/60$. Thus, $\pi\times42/60$ is replaced by $\lambda_{42}=0.348\times2\pi=2.1865$. The frequencies neighbouring $\lambda_{42}$ are set to $\lambda_{41}=2.1865-1/60$ and $\lambda_{43}=2.1865+1/60$. The periodogram below illustrates the proximity of this trading day frequency $\lambda_{42}$ (red shade) and the frequency corresponding to 4 cycles per year, $\lambda_{40}=2.0944$. This proximity is precisely what poses the identification problems: the AR spectrum boils down to a smoothed version of the periodogram, and the contribution of the trading day frequency may be obscured by the leakage resulting from a potential seasonal peak at $\lambda_{40}$, and vice versa.

Periodogram with seasonal (grey) and calendar (red) frequencies highlighted

JDemetra+ allows the user to modify the number of lags of this estimator and to change the number of observations used to determine the AR parameters. These two options can improve the resolution of this estimator.

The statistical significance of the peaks associated with a given frequency can be informally tested using a visual criterion, which has proved to perform well in simulation experiments. Visually significant peaks at a frequency $\lambda_j$ satisfy both of the following conditions:

  • $f_x(\lambda_j)-\max\{f_x(\lambda_{j+1}),f_x(\lambda_{j-1})\}\geq\left[\max_kf_x(\lambda_k)-\min_if_x(\lambda_i)\right]CV(\lambda_j)$, where $CV(\lambda_j)$ can be set equal to $6/52$ for all $j$;

  • $f_x(\lambda_j)>\operatorname{median}_j\{f_x(\lambda_j)\}$, which guarantees that $f_x(\lambda_j)$ is not merely a local peak.

The first condition implies that if the range $\max_kf_x(\lambda_k)-\min_if_x(\lambda_i)$ is divided into 52 parts (traditionally represented by stars), the height of each peak should be at least 6 stars.
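
The two conditions translate directly into code; the base-R sketch below checks visual significance for an interior frequency of a given spectrum vector, with $CV=6/52$ as suggested above.

```r
# f: vector of AR spectrum values at lambda_0, ..., lambda_60
# j: index of an interior frequency (2 <= j <= length(f) - 1, in R's 1-based indexing)
visually_significant <- function(f, j, cv = 6 / 52) {
  rng   <- max(f) - min(f)
  cond1 <- f[j] - max(f[j - 1], f[j + 1]) >= cv * rng   # exceeds both neighbours by 6/52 of the range
  cond2 <- f[j] > median(f)                             # lies above the median of the spectrum
  cond1 && cond2
}
```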

Seasonal and trading day frequencies by time series frequency

| Number of months per full period | Seasonal frequencies | Trading day frequencies (radians) |
|----------------------------------|----------------------|-----------------------------------|
| 12 | $\frac{\pi}{6}, \frac{\pi}{3}, \frac{\pi}{2}, \frac{2\pi}{3}, \frac{5\pi}{6}, \pi$ | $d$, 2.714 |
| 6 | $\frac{\pi}{3}, \frac{2\pi}{3}, \pi$ | $d$ |
| 4 | $\frac{\pi}{2}, \pi$ | $d$, 1.292, 1.850, 2.128 |
| 3 | $\pi$ | $d$ |
| 2 | $\pi$ | $d$ |

Currently, only the seasonal frequencies are tested, but the program allows the user to plot the AR spectrum manually and to focus on both seasonal and trading day frequencies. Agustin Maravall has conducted a simulation experiment to calculate $CV(\lambda_{42})$ (the trading day frequency) and proposes to set $CV(\lambda_j)$ for all $j$ equal to the critical value associated with the trading day frequency, but this is not part of the current automatic testing procedure of JDemetra+.

Practical implementation in GUI is detailed here

In a Periodogram

The periodogram $I(\omega_j)$ of $X\in\mathbb{C}^n$ is defined as the squared modulus of its Fourier transform:

$$I(\omega_j)=|a_j|^2=n^{-1}\left|\sum_{t=1}^{n}X_te^{-it\omega_j}\right|^2,$$

where the Fourier frequencies $\omega_j$ are given by integer multiples of the fundamental frequency $\frac{2\pi}{n}$:

$$\omega_j=\frac{2\pi j}{n},\qquad -\pi<\omega_j\leq\pi$$

An orthonormal basis in $\mathbb{R}^n$,

$$\left\{e_0,\ c_1,s_1,\ \dots,\ c_{[(n-1)/2]},s_{[(n-1)/2]},\ e_{n/2}\right\},$$

where $e_{n/2}$ is excluded if $n$ is odd, can be used to project the data and obtain the spectral decomposition.

Thus, the periodogram is given by the projection coefficients and represents the contribution of the $j$-th harmonic to the total sum of squares, as illustrated by Brockwell and Davis (1991):

| Source | Degrees of freedom |
|--------|--------------------|
| Frequency $\omega_0$ | 1 |
| Frequency $\omega_1$ | 2 |
| Frequency $\omega_k$ | 2 |
| Frequency $\omega_{n/2}=\pi$ (excluded if $n$ is odd) | 1 |
| Total | $n$ |

In JDemetra+, the periodogram of $X\in\mathbb{R}^n$ is computed for the standardized time series.

Defining an F-test

Brockwell and Davis (1991, section 10.2) exploit the fact that the periodogram can be expressed as the projection on the orthonormal basis defined above to derive a test. Thus, under the null hypothesis:

  • $2I(\omega_k)=\left\|P_{\overline{sp}\{c_k,s_k\}}X\right\|^2\sim\sigma^2\chi^2(2)$, for Fourier frequencies $0<\omega_k=2\pi k/n<\pi$
  • $I(\pi)=\left\|P_{\overline{sp}\{e_{n/2}\}}X\right\|^2\sim\sigma^2\chi^2(1)$, for $\omega=\pi$

Because $I(\omega_k)$ is independent of the projection error sum of squares, the F-test statistic can be defined as follows:

  • $\dfrac{2I(\omega_k)}{\left\|X-P_{\overline{sp}\{e_0,c_k,s_k\}}X\right\|^2}\cdot\dfrac{n-3}{2}\sim F(2,n-3)$, for Fourier frequencies $0<\omega_k=2\pi k/n<\pi$
  • $\dfrac{I(\pi)}{\left\|X-P_{\overline{sp}\{e_0,e_{n/2}\}}X\right\|^2}\cdot\dfrac{n-2}{1}\sim F(1,n-2)$, for $\omega=\pi$

where:

- $\left\|X-P_{\overline{sp}\{e_0,c_k,s_k\}}X\right\|^2=\sum_{i=1}^{n}X_i^2-I(0)-2I(\omega_k)\sim\sigma^2\chi^2(n-3)$, for Fourier frequencies $0<\omega_k=2\pi k/n<\pi$

- $\left\|X-P_{\overline{sp}\{e_0,e_{n/2}\}}X\right\|^2=\sum_{i=1}^{n}X_i^2-I(0)-I(\pi)\sim\sigma^2\chi^2(n-2)$, for $\omega=\pi$

Thus, the null is rejected if the F-test statistic computed at a given seasonal frequency (different from $\pi$) is larger than $F_{1-\alpha}(2,n-3)$. If $\pi$ is considered, the test statistic is compared with the $F_{1-\alpha}(1,n-2)$ critical value.

The implementation of JDemetra+ considers simultaneously the whole set of seasonal frequencies (1, 2, 3, 4, 5 and 6 cycles per year). Thus, the resulting test-statistic is:

$$\frac{2I(\pi/6)+2I(\pi/3)+2I(\pi/2)+2I(2\pi/3)+2I(5\pi/6)+\delta I(\pi)}{\left\|X-P_{\overline{sp}\{e_0,c_1,s_1,c_2,s_2,c_3,s_3,c_4,s_4,c_5,s_5,\delta e_{n/2}\}}X\right\|^2}\cdot\frac{n-12}{11}\sim F(11-\delta,\ n-12+\delta)$$

where $\delta=1$ if $n$ is even and $0$ otherwise.

In small samples, the test performs better when the periodogram is evaluated at the exact seasonal frequencies. JDemetra+ modifies the sample size to ensure that the seasonal frequencies belong to the set of Fourier frequencies. This strategy provides a very simple and effective way to eliminate the leakage problem.
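
Putting the pieces together, the following base-R sketch computes the combined seasonality F-test for a monthly series, following the test statistic given above; the truncation to complete years mirrors the strategy just described, and the example call on a differenced, logged built-in series is purely illustrative.

```r
seasonal_ftest <- function(x, alpha = 0.01) {
  n <- 12 * (length(x) %/% 12)            # keep complete years so that seasonal
  x <- x[seq_len(n)]                      # frequencies are Fourier frequencies
  I <- Mod(fft(x))^2 / n                  # periodogram at omega_j = 2*pi*j/n, j = 0, ..., n-1
  js <- (n / 12) * (1:5)                  # harmonics of 1, ..., 5 cycles per year
  delta <- 1                              # n is a multiple of 12, hence even: I(pi) is included
  num <- 2 * sum(I[js + 1]) + delta * I[n / 2 + 1]
  den <- sum(x^2) - I[1] - num            # squared norm of the projection error
  c(statistic = (num / den) * (n - 12) / 11,
    critical  = qf(1 - alpha, 11 - delta, n - 12 + delta))
}

# Illustrative call on a stationary transformation of a built-in monthly series:
# seasonal_ftest(diff(log(AirPassengers)))
```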

Practical implementation in GUI is detailed here

Spectral graphs

The section below provides guidance on the interpretation of spectral graphs; their display in the Graphical User Interface can be found here.

The interpretation of a spectral graph is rather straightforward. When its values at low frequencies (i.e. cycles of one year and more) are large in relation to its other values, long-term movements dominate the series. When its values at high frequencies (i.e. cycles shorter than one year) are large in relation to its other values, the series is rather trendless and contains a lot of noise. When the values are distributed randomly around a constant without any visible peaks, it is highly probable that the series is a random process. The presence of seasonality in a time series is manifested in the spectral graph by peaks at the seasonal frequencies.