
<b>Introduction to Time Series Analysis. Lecture 20.</b>



1. Review: The periodogram

2. Nonparametric spectral estimation



<b>Review: Periodogram</b>



The periodogram is defined as


$$I(\nu) = |X(\nu)|^2 = \frac{1}{n}\left|\sum_{t=1}^{n} e^{-2\pi i t\nu} x_t\right|^2 = X_c^2(\nu) + X_s^2(\nu),$$

where

$$X_c(\nu) = \frac{1}{\sqrt{n}}\sum_{t=1}^{n} \cos(2\pi t\nu)\, x_t, \qquad
X_s(\nu) = \frac{1}{\sqrt{n}}\sum_{t=1}^{n} \sin(2\pi t\nu)\, x_t.$$
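As a quick illustration of this definition, here is a minimal sketch (Python with NumPy; not part of the original lecture) that computes I(ν_j) at the Fourier frequencies ν_j = j/n from the cosine and sine transforms, and checks that it agrees with the squared modulus of the discrete Fourier transform divided by n.

```python
import numpy as np

def periodogram_from_definition(x):
    """Periodogram I(nu_j) at the Fourier frequencies nu_j = j/n."""
    n = len(x)
    t = np.arange(1, n + 1)
    freqs = np.arange(n) / n
    I = np.empty(n)
    for j, nu in enumerate(freqs):
        Xc = np.sum(np.cos(2 * np.pi * t * nu) * x) / np.sqrt(n)
        Xs = np.sum(np.sin(2 * np.pi * t * nu) * x) / np.sqrt(n)
        I[j] = Xc**2 + Xs**2
    return freqs, I

rng = np.random.default_rng(0)
x = rng.normal(size=128)
freqs, I_def = periodogram_from_definition(x)
I_fft = np.abs(np.fft.fft(x))**2 / len(x)   # |sum_t e^{-2 pi i t nu_j} x_t|^2 / n
print(np.allclose(I_def, I_fft))            # True, up to floating-point error
```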



<b>Asymptotic properties of the periodogram</b>



We want to understand the asymptotic behavior of the periodogram I(ν) at a particular frequency ν, as n increases. We'll see that its expectation converges to f(ν).

We'll start with a simple example: Suppose that X_1, ..., X_n are i.i.d. N(0, σ²) (Gaussian white noise). From the definitions,


$$X_c(\nu_j) = \frac{1}{\sqrt{n}}\sum_{t=1}^{n} \cos(2\pi t\nu_j)\, x_t, \qquad
X_s(\nu_j) = \frac{1}{\sqrt{n}}\sum_{t=1}^{n} \sin(2\pi t\nu_j)\, x_t,$$

and each of these has mean zero, since E x_t = 0.



<b>Asymptotic properties of the periodogram</b>



Also,


$$\mathrm{Var}(X_c(\nu_j)) = \frac{\sigma^2}{n}\sum_{t=1}^{n} \cos^2(2\pi t\nu_j)
= \frac{\sigma^2}{2n}\sum_{t=1}^{n} \left(\cos(4\pi t\nu_j) + 1\right) = \frac{\sigma^2}{2}.$$



<b>Asymptotic properties of the periodogram</b>



Also,


$$\mathrm{Cov}(X_c(\nu_j), X_s(\nu_j)) = \frac{\sigma^2}{n}\sum_{t=1}^{n} \cos(2\pi t\nu_j)\sin(2\pi t\nu_j)
= \frac{\sigma^2}{2n}\sum_{t=1}^{n} \sin(4\pi t\nu_j) = 0,$$

$$\mathrm{Cov}(X_c(\nu_j), X_c(\nu_k)) = 0, \qquad \mathrm{Cov}(X_s(\nu_j), X_s(\nu_k)) = 0 \quad \text{for } j \neq k.$$



<b>Asymptotic properties of the periodogram</b>




That is, if X_1, ..., X_n are i.i.d. N(0, σ²) (Gaussian white noise; f(ν) = σ²), then the X_c(ν_j) and X_s(ν_j) are all i.i.d. N(0, σ²/2). Thus,


$$\frac{2}{\sigma^2}\, I(\nu_j) = \frac{2}{\sigma^2}\left(X_c^2(\nu_j) + X_s^2(\nu_j)\right) \sim \chi^2_2.$$


So for the case of Gaussian white noise, the periodogram has a chi-squared distribution that depends on the variance σ² (which, in this case, is the spectral density f(ν) = σ²).
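As a sanity check of this distributional result, the simulation below (Python with NumPy; not part of the original lecture) draws many independent Gaussian white noise series and compares the empirical mean and variance of 2I(ν_j)/σ² with those of a χ²_2 variable (mean 2, variance 4).

```python
import numpy as np

rng = np.random.default_rng(1)
n, sigma2, reps = 256, 2.0, 5000
j = 32                                         # fixed Fourier frequency nu_j = j/n
scaled = np.empty(reps)
for r in range(reps):
    x = rng.normal(scale=np.sqrt(sigma2), size=n)
    I_j = np.abs(np.fft.fft(x)[j])**2 / n      # periodogram ordinate I(nu_j)
    scaled[r] = 2.0 * I_j / sigma2             # should behave like a chi-squared(2) draw
print(scaled.mean(), scaled.var())             # approximately 2 and 4
```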



<b>Asymptotic properties of the periodogram</b>



Under more general conditions (e.g., normal {X_t}, or linear process {X_t} with rapidly decaying ACF), the X_c(ν_j), X_s(ν_j) are all asymptotically independent and N(0, f(ν_j)/2).

Consider a frequency ν. For a given value of n, let ν̂(n) be the closest Fourier frequency (that is, ν̂(n) = j/n for a value of j that minimizes |ν − j/n|). As n increases, ν̂(n) → ν, and (under the same conditions that ensure the asymptotic normality and independence of the sine/cosine transforms), f(ν̂(n)) → f(ν). (picture)


In that case, we have


$$\frac{2}{f(\nu)}\, I(\hat\nu^{(n)}) = \frac{2}{f(\nu)}\left(X_c^2(\hat\nu^{(n)}) + X_s^2(\hat\nu^{(n)})\right) \xrightarrow{d} \chi^2_2.$$
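For concreteness, here is a tiny helper (Python; not part of the original lecture) that returns the closest Fourier frequency ν̂(n) = j/n to a target frequency ν.

```python
import numpy as np

def closest_fourier_frequency(nu, n):
    """Return (nu_hat, j) with nu_hat = j/n minimizing |nu - j/n|."""
    j = int(np.round(nu * n))
    return j / n, j

# As n increases, nu_hat(n) approaches nu:
print(closest_fourier_frequency(0.1234, 100))    # (0.12, 12)
print(closest_fourier_frequency(0.1234, 1000))   # (0.123, 123)
```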





<b>Asymptotic properties of the periodogram</b>



Thus,


$$\mathrm{E}\, I(\hat\nu^{(n)}) = \frac{f(\nu)}{2}\, \mathrm{E}\!\left[\frac{2}{f(\nu)}\left(X_c^2(\hat\nu^{(n)}) + X_s^2(\hat\nu^{(n)})\right)\right] \to \frac{f(\nu)}{2}\, \mathrm{E}\!\left(Z_1^2 + Z_2^2\right) = f(\nu),$$

where Z_1, Z_2 are i.i.d. N(0, 1).



<b>Asymptotic properties of the periodogram</b>



Since we know its asymptotic distribution (chi-squared), we can compute
approximate confidence intervals:


$$\Pr\!\left\{\frac{2}{f(\nu)}\, I(\hat\nu^{(n)}) > \chi^2_2(\alpha)\right\} \to \alpha,$$

where the cdf of a χ²_2 at χ²_2(α) is 1 − α. Thus,



$$\Pr\!\left\{\frac{2\, I(\hat\nu^{(n)})}{\chi^2_2(\alpha/2)} \le f(\nu) \le \frac{2\, I(\hat\nu^{(n)})}{\chi^2_2(1-\alpha/2)}\right\} \to 1 - \alpha.$$
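A minimal sketch of this interval (Python with NumPy and SciPy; not part of the original lecture). Here χ²_2(α) denotes the point with upper-tail probability α, so it corresponds to chi2.ppf(1 - alpha, df=2).

```python
import numpy as np
from scipy.stats import chi2

alpha = 0.05
rng = np.random.default_rng(2)
n, sigma2 = 512, 1.0
x = rng.normal(scale=np.sqrt(sigma2), size=n)   # white noise, so f(nu) = sigma2 here

j = 50                                          # Fourier frequency nu_j = j/n
I_j = np.abs(np.fft.fft(x)[j])**2 / n           # periodogram ordinate I(nu_j)

upper_q = chi2.ppf(1 - alpha / 2, df=2)         # chi2_2(alpha/2)
lower_q = chi2.ppf(alpha / 2, df=2)             # chi2_2(1 - alpha/2)
ci = (2 * I_j / upper_q, 2 * I_j / lower_q)
print(ci)   # a wide interval: a single ordinate has only 2 degrees of freedom
```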



<b>Asymptotic properties of the periodogram: Consistency</b>



Unfortunately, Var(I(ν̂(n))) → f(ν)² Var(Z_1² + Z_2²)/4, where Z_1, Z_2 are i.i.d. N(0, 1); that is, the variance approaches a constant.

Thus, I(ν̂(n)) is not a consistent estimator of f(ν). In particular, if f(ν) > 0, then for ε > 0, as n increases,


$$\Pr\!\left\{\left|I(\hat\nu^{(n)}) - f(\nu)\right| > \epsilon\right\} \not\to 0.$$
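To see the inconsistency numerically, the sketch below (Python with NumPy; not part of the original lecture) estimates Var(I(ν̂(n))) by Monte Carlo for Gaussian white noise at several sample sizes; the variance stays near f(ν)² Var(Z_1² + Z_2²)/4 = σ⁴ rather than shrinking as n grows.

```python
import numpy as np

rng = np.random.default_rng(3)
sigma2, reps = 1.0, 4000
for n in (64, 256, 1024):
    j = n // 8                                   # keep the frequency nu_j = 1/8 fixed
    I = np.empty(reps)
    for r in range(reps):
        x = rng.normal(scale=np.sqrt(sigma2), size=n)
        I[r] = np.abs(np.fft.fft(x)[j])**2 / n
    print(n, I.var())   # stays close to sigma2**2 = 1 for every n
```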




<b>Asymptotic properties of the periodogram: Consistency</b>



This means that the approximate confidence intervals we obtain are
typically wide.


The source of the difficulty is that, as n increases, we have additional data (the n values of x_t), but we use it to estimate additional independent random variables (the n independent values of X_c(ν_j), X_s(ν_j)).

How can we reduce the variance? The typical approach is to average independent observations. In this case, we can take an average of “nearby” values of the periodogram, and hope that the spectral density at these nearby frequencies is approximately the same.



<b>Introduction to Time Series Analysis. Lecture 20.</b>



1. Review: The periodogram

2. Nonparametric spectral estimation



<b>Nonparametric spectral estimation</b>



Define a band of frequencies




$$\left[\nu_k - \frac{L}{2n},\ \nu_k + \frac{L}{2n}\right]$$




of bandwidth L/n. Suppose that f(ν) is approximately constant in this
frequency band.


<i>Consider the following smoothed spectral estimator.</i> (Assume L is odd.)


$$\hat f(\nu_k) = \frac{1}{L}\sum_{l=-(L-1)/2}^{(L-1)/2} I\!\left(\nu_k - \frac{l}{n}\right)
= \frac{1}{L}\sum_{l=-(L-1)/2}^{(L-1)/2} \left(X_c^2\!\left(\nu_k - \frac{l}{n}\right) + X_s^2\!\left(\nu_k - \frac{l}{n}\right)\right).$$
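A minimal sketch of this smoothed estimator (Python with NumPy; not part of the original lecture). The function name and the wrap-around indexing at the ends of the frequency range are illustrative choices, not from the lecture.

```python
import numpy as np

def smoothed_spectral_estimate(x, L):
    """Average the periodogram over L neighbouring Fourier frequencies (L odd)."""
    assert L % 2 == 1
    n = len(x)
    I = np.abs(np.fft.fft(x))**2 / n                 # periodogram at nu_j = j/n
    half = (L - 1) // 2
    f_hat = np.empty(n)
    for k in range(n):
        idx = (k + np.arange(-half, half + 1)) % n   # indices of nu_k - l/n, wrapping around
        f_hat[k] = I[idx].mean()
    return np.arange(n) / n, f_hat

# Example: white noise with variance 2, so the true spectral density is f(nu) = 2.
rng = np.random.default_rng(4)
x = rng.normal(scale=np.sqrt(2.0), size=1024)
freqs, f_hat = smoothed_spectral_estimate(x, L=21)
print(f_hat[100])   # close to 2, and far less variable than a single I(nu_j)
```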



<b>Nonparametric spectral estimation</b>



For a suitable time series (e.g., Gaussian, or a linear process with sufficiently rapidly decreasing autocovariance), we know that, for large n, all of the X_c(ν_k − l/n) and X_s(ν_k − l/n) are approximately independent and normal, with mean zero and variance f(ν_k − l/n)/2. From the assumption that f(ν) is approximately constant across all of these frequencies, we have that, asymptotically,


$$\hat f(\nu_k) \sim \frac{f(\nu_k)}{2L}\, \chi^2_{2L}.$$



<b>Nonparametric spectral estimation</b>



Thus,


$$\mathrm{E}\,\hat f(\hat\nu^{(n)}) \approx \frac{f(\nu)}{2L}\, \mathrm{E}\!\left(\sum_{i=1}^{2L} Z_i^2\right) = f(\nu),$$

$$\mathrm{Var}\,\hat f(\hat\nu^{(n)}) \approx \frac{f^2(\nu)}{4L^2}\, \mathrm{Var}\!\left(\sum_{i=1}^{2L} Z_i^2\right) = \frac{f^2(\nu)}{2L}\, \mathrm{Var}(Z_1^2),$$

where the Z_i are i.i.d. N(0, 1).



<b>Nonparametric spectral estimation: confidence intervals</b>



From the asymptotic distribution, we can define approximate confidence
intervals as before:


$$\Pr\!\left\{\frac{2L\,\hat f(\hat\nu^{(n)})}{\chi^2_{2L}(\alpha/2)} \le f(\nu) \le \frac{2L\,\hat f(\hat\nu^{(n)})}{\chi^2_{2L}(1-\alpha/2)}\right\} \approx 1 - \alpha.$$
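A short sketch of this interval (Python with SciPy; not part of the original lecture). The value of the smoothed estimate is just a placeholder here; in practice it would come from a smoothed estimator like the one sketched earlier.

```python
from scipy.stats import chi2

alpha, L = 0.05, 21
f_hat = 1.9                                     # placeholder value of the smoothed estimate

upper_q = chi2.ppf(1 - alpha / 2, df=2 * L)     # chi2_{2L}(alpha/2)
lower_q = chi2.ppf(alpha / 2, df=2 * L)         # chi2_{2L}(1 - alpha/2)
ci = (2 * L * f_hat / upper_q, 2 * L * f_hat / lower_q)
print(ci)   # much narrower than the 2-degree-of-freedom interval for a raw ordinate
```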



<b>Nonparametric spectral estimation</b>



<i>Notice the bias-variance trade-off:</i>

For bandwidth B = L/n, we have Var f̂(ν_k) ≈ c/(Bn) for some constant c. So we want a bigger bandwidth B <i>to ensure low variance (bandwidth stability)</i>.

But the larger the bandwidth, the more questionable the assumption that f(ν) is approximately constant over the band.


<b>Nonparametric spectral estimation: confidence intervals</b>



Since the asymptotic mean and variance of f̂(ν̂(n)) are proportional to f(ν) and f²(ν), <i>it is natural to consider the logarithm of the estimator</i>. Then we can define approximate confidence intervals as before:


$$\Pr\!\left\{\frac{2L\,\hat f(\hat\nu^{(n)})}{\chi^2_{2L}(\alpha/2)} \le f(\nu) \le \frac{2L\,\hat f(\hat\nu^{(n)})}{\chi^2_{2L}(1-\alpha/2)}\right\} \approx 1 - \alpha,$$

$$\Pr\!\left\{\log \hat f(\hat\nu^{(n)}) + \log\frac{2L}{\chi^2_{2L}(\alpha/2)} \le \log f(\nu) \le \log \hat f(\hat\nu^{(n)}) + \log\frac{2L}{\chi^2_{2L}(1-\alpha/2)}\right\} \approx 1 - \alpha.$$
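A corresponding sketch of the log-scale interval (Python with NumPy and SciPy; not part of the original lecture). One convenient consequence, shown by the last line, is that the width of the interval for log f(ν) does not depend on the estimate itself.

```python
import numpy as np
from scipy.stats import chi2

alpha, L = 0.05, 21
f_hat = 1.9                                     # placeholder value of the smoothed estimate

upper_q = chi2.ppf(1 - alpha / 2, df=2 * L)     # chi2_{2L}(alpha/2)
lower_q = chi2.ppf(alpha / 2, df=2 * L)         # chi2_{2L}(1 - alpha/2)
log_ci = (np.log(f_hat) + np.log(2 * L / upper_q),
          np.log(f_hat) + np.log(2 * L / lower_q))
print(log_ci)
print(np.log(upper_q) - np.log(lower_q))        # interval width, free of f_hat
```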

