The Mathematical and Statistical Foundations of Econometrics

Dependent Laws of Large Numbers and Central Limit Theorems

Theorem 7.1 (Wold decomposition): Let $X_t \in \mathbb{R}$ be a zero-mean covariance stationary process. Then we can write

$$X_t = \sum_{j=0}^{\infty} \alpha_j U_{t-j} + W_t,$$

where $\alpha_0 = 1$, $\sum_{j=0}^{\infty} \alpha_j^2 < \infty$, the $U_t$'s are zero-mean covariance stationary and uncorrelated random variables, and $W_t$ is a deterministic process; that is, there exist coefficients $\beta_j$ such that $P[W_t = \sum_{j=1}^{\infty} \beta_j W_{t-j}] = 1$.

Moreover, $U_t = X_t - \sum_{j=1}^{\infty} \beta_j X_{t-j}$ and $E[U_{t+m} W_t] = 0$ for all integers $m$ and $t$.

Intuitive proof: The exact proof employs Hilbert space theory and will therefore be given in the appendix to this chapter. However, the intuition behind the Wold decomposition is not too difficult.
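Before turning to the argument, a minimal numerical sketch may help fix ideas. For a stationary AR(1) process $X_t = \rho X_{t-1} + \varepsilon_t$ with $|\rho| < 1$, the Wold decomposition is explicit: $\beta_1 = \rho$, $\beta_j = 0$ for $j \geq 2$, $U_t = \varepsilon_t$, $\alpha_j = \rho^j$, and $W_t = 0$. The following Python snippet is an illustration added here, not part of the original text (the choice $\rho = 0.6$ and the sample size are arbitrary); it checks empirically that the innovations are uncorrelated and that the truncated Wold sum $\sum_{j=0}^{m} \alpha_j U_{t-j}$ reconstructs $X_t$.

```python
import numpy as np

# Simulate an AR(1) process X_t = rho * X_{t-1} + eps_t.
rng = np.random.default_rng(0)
rho, n = 0.6, 200_000
eps = rng.standard_normal(n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = rho * x[t - 1] + eps[t]

# Innovations U_t = X_t - beta_1 * X_{t-1}; here U_t = eps_t exactly.
u = x[1:] - rho * x[:-1]

# Sample analogue of (7.4): innovations are uncorrelated at lag 1.
lag1 = np.mean(u[1:] * u[:-1])
assert abs(lag1) < 0.02

# Truncated Wold sum sum_{j=0}^{m} rho**j * U_{t-j} reconstructs X_t
# up to a remainder of order rho**(m+1), negligible for m = 50.
m = 50
recon = sum(rho**j * u[m - j : n - 1 - j] for j in range(m + 1))
assert np.max(np.abs(recon - x[m + 1:])) < 1e-6
```

Here $W_t = 0$, consistent with the remote past of an AR(1) with $|\rho| < 1$ being uninformative.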

It is possible to find a sequence $\beta_j$, $j = 1, 2, 3, \ldots,$ of real numbers such that $E[(X_t - \sum_{j=1}^{\infty} \beta_j X_{t-j})^2]$ is minimal. The random variable

$$\hat{X}_t = \sum_{j=1}^{\infty} \beta_j X_{t-j} \qquad (7.1)$$

is then called the linear projection of $X_t$ on $X_{t-j}$, $j \geq 1$. If we let

$$U_t = X_t - \sum_{j=1}^{\infty} \beta_j X_{t-j}, \qquad (7.2)$$

it follows from the first-order condition

$$\partial E\Big[\Big(X_t - \sum_{j=1}^{\infty} \beta_j X_{t-j}\Big)^2\Big] \Big/ \partial \beta_m = 0 \qquad (7.3)$$

that $E[U_t X_{t-m}] = 0$ and $E[U_t] = 0$ for $m = 1, 2, 3, \ldots$. Note that (7.2) and (7.3) imply

$$E[U_t U_{t-m}] = 0 \quad \text{for } m = 1, 2, 3, \ldots. \qquad (7.4)$$

Moreover, note that by (7.2) and (7.3),

$$E\big[X_t^2\big] = E\Big[\Big(U_t + \sum_{j=1}^{\infty} \beta_j X_{t-j}\Big)^2\Big] = E\big[U_t^2\big] + E\Big[\Big(\sum_{j=1}^{\infty} \beta_j X_{t-j}\Big)^2\Big],$$

and thus, by the covariance stationarity of $X_t$,

$$E\big[U_t^2\big] = \sigma_u^2 \leq E\big[X_t^2\big] \qquad (7.5)$$

and

$$E\big[\hat{X}_t^2\big] = E\Big[\Big(\sum_{j=1}^{\infty} \beta_j X_{t-j}\Big)^2\Big] = \sigma_{\hat{X}}^2 \leq E\big[X_t^2\big] \qquad (7.6)$$

for all $t$. Hence it follows from (7.4) and (7.5) that $U_t$ is a zero-mean covariance stationary time series process itself. Next, substitute $X_{t-1} = U_{t-1} + \sum_{j=1}^{\infty} \beta_j X_{t-1-j}$ in (7.1). Then (7.1) becomes

$$\hat{X}_t = \beta_1 \Big(U_{t-1} + \sum_{j=1}^{\infty} \beta_j X_{t-1-j}\Big) + \sum_{j=2}^{\infty} \beta_j X_{t-j} = \beta_1 U_{t-1} + (\beta_2 + \beta_1^2) X_{t-2} + \sum_{j=3}^{\infty} (\beta_j + \beta_1 \beta_{j-1}) X_{t-j}. \qquad (7.7)$$

Now replace $X_{t-2}$ in (7.7) by $U_{t-2} + \sum_{j=1}^{\infty} \beta_j X_{t-2-j}$. Then (7.7) becomes

$$\hat{X}_t = \beta_1 U_{t-1} + (\beta_2 + \beta_1^2)\Big(U_{t-2} + \sum_{j=1}^{\infty} \beta_j X_{t-2-j}\Big) + \sum_{j=3}^{\infty} (\beta_j + \beta_1 \beta_{j-1}) X_{t-j}$$

$$= \beta_1 U_{t-1} + (\beta_2 + \beta_1^2) U_{t-2} + \big[(\beta_2 + \beta_1^2)\beta_1 + (\beta_3 + \beta_1 \beta_2)\big] X_{t-3} + \sum_{j=4}^{\infty} \big[(\beta_2 + \beta_1^2)\beta_{j-2} + (\beta_j + \beta_1 \beta_{j-1})\big] X_{t-j}.$$

Repeating this substitution $m$ times yields an expression of the type

$$\hat{X}_t = \sum_{j=1}^{m} \alpha_j U_{t-j} + \sum_{j=m+1}^{\infty} \theta_{m,j} X_{t-j}, \qquad (7.8)$$

for instance. It follows now from (7.3), (7.4), (7.5), and (7.8) that

$$E\big[\hat{X}_t^2\big] = \sigma_u^2 \sum_{j=1}^{m} \alpha_j^2 + E\Big[\Big(\sum_{j=m+1}^{\infty} \theta_{m,j} X_{t-j}\Big)^2\Big].$$

Hence, letting $m \to \infty$, we have

$$E\big[\hat{X}_t^2\big] = \sigma_u^2 \sum_{j=1}^{\infty} \alpha_j^2 + \lim_{m \to \infty} E\Big[\Big(\sum_{j=m+1}^{\infty} \theta_{m,j} X_{t-j}\Big)^2\Big] = \sigma_{\hat{X}}^2 < \infty.$$

Therefore, we can write $X_t$ as

$$X_t = \sum_{j=0}^{\infty} \alpha_j U_{t-j} + W_t, \qquad (7.9)$$

where $\alpha_0 = 1$ and $\sum_{j=0}^{\infty} \alpha_j^2 < \infty$, with $W_t = \operatorname{plim}_{m \to \infty} \sum_{j=m+1}^{\infty} \theta_{m,j} X_{t-j}$ a remainder term that satisfies

$$E[U_{t+m} W_t] = 0 \quad \text{for all integers } m \text{ and } t. \qquad (7.10)$$

Finally, observe from (7.2) and (7.9) that

$$U_t - W_t + \sum_{j=1}^{\infty} \beta_j W_{t-j} = (X_t - W_t) - \sum_{j=1}^{\infty} \beta_j (X_{t-j} - W_{t-j}) = \sum_{j=0}^{\infty} \alpha_j U_{t-j} - \sum_{j=1}^{\infty} \beta_j \sum_{m=0}^{\infty} \alpha_m U_{t-j-m} = U_t + \sum_{j=1}^{\infty} \delta_j U_{t-j},$$

for instance. It follows now straightforwardly from (7.4), (7.5), and (7.10) that $\delta_j = 0$ for all $j \geq 1$; hence, $W_t = \sum_{j=1}^{\infty} \beta_j W_{t-j}$ with probability 1. Q.E.D.

Theorem 7.1 carries over to vector-valued covariance stationary processes:

Theorem 7.2 (Multivariate Wold decomposition): Let $X_t \in \mathbb{R}^k$ be a zero-mean covariance stationary process. Then we can write $X_t = \sum_{j=0}^{\infty} A_j U_{t-j} + W_t$, where $A_0 = I_k$, $\sum_{j=0}^{\infty} A_j A_j^{\mathrm{T}}$ is finite, the $U_t$'s are zero-mean covariance stationary and uncorrelated random vectors (i.e., $E[U_t U_{t-m}^{\mathrm{T}}] = O$ for $m \geq 1$), and $W_t$ is a deterministic process (i.e., there exist matrices $B_j$ such that $P[W_t = \sum_{j=1}^{\infty} B_j W_{t-j}] = 1$).

Moreover, $U_t = X_t - \sum_{j=1}^{\infty} B_j X_{t-j}$, and $E[U_{t+m} W_t^{\mathrm{T}}] = O$ for all integers $m$ and $t$.

Although the process $W_t$ is deterministic in the sense that it is perfectly predictable from its past values, it still may be random. If so, let $\mathscr{F}_t^W = \sigma(W_t, W_{t-1}, W_{t-2}, \ldots)$ be the $\sigma$-algebra generated by $W_{t-m}$ for $m \geq 0$.
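That a perfectly predictable process can nonetheless be random is made concrete by the classical harmonic example, sketched below in Python (an illustration added here, not from the text; the frequency $\lambda = 0.7$ and the amplitudes are arbitrary). With random amplitudes $\xi_1, \xi_2$ and a fixed frequency $\lambda$, the process $W_t = \xi_1 \cos(\lambda t) + \xi_2 \sin(\lambda t)$ is zero-mean covariance stationary when $\xi_1, \xi_2$ have mean zero, equal variance, and are uncorrelated, yet it satisfies the exact recursion $W_t = 2\cos(\lambda) W_{t-1} - W_{t-2}$, so it is determined without error by its own past.

```python
import numpy as np

# Harmonic process: random amplitudes, fixed frequency lam in (0, pi).
rng = np.random.default_rng(1)
lam = 0.7
xi1, xi2 = rng.standard_normal(2)
t = np.arange(100)
w = xi1 * np.cos(lam * t) + xi2 * np.sin(lam * t)

# "Deterministic" in the Wold sense: the one-step prediction
# W_t = 2*cos(lam)*W_{t-1} - W_{t-2} is exact, i.e., beta_1 = 2*cos(lam),
# beta_2 = -1, and beta_j = 0 for j >= 3.
pred = 2 * np.cos(lam) * w[1:-1] - w[:-2]
err = np.max(np.abs(w[2:] - pred))
assert err < 1e-12
```

The prediction error is zero up to floating-point rounding, even though $W_t$ differs across realizations of $(\xi_1, \xi_2)$.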

Then all $W_t$'s are measurable $\mathscr{F}_{t-m}^W$ for arbitrary natural numbers $m$; hence, all $W_t$'s are measurable $\mathscr{F}_{-\infty}^W = \bigcap_{t=0}^{\infty} \mathscr{F}_{-t}^W$. However, it follows from (7.2) and (7.9) that each $W_t$ can be constructed from $X_{t-j}$ for $j \geq 0$; hence, $\mathscr{F}_t^X = \sigma(X_t, X_{t-1}, X_{t-2}, \ldots) \supset \mathscr{F}_t^W$, and consequently, all $W_t$'s are measurable $\mathscr{F}_{-\infty}^X = \bigcap_{t=0}^{\infty} \mathscr{F}_{-t}^X$. This implies that $W_t = E[W_t \mid \mathscr{F}_{-\infty}^X]$. See Chapter 3. The $\sigma$-algebra $\mathscr{F}_{-\infty}^X$ represents the information contained in the remote past of $X_t$.

Therefore, $\mathscr{F}_{-\infty}^X$ is called the remote $\sigma$-algebra, and the events therein are called the remote events. If $\mathscr{F}_{-\infty}^X$ is the trivial $\sigma$-algebra $\{\Omega, \emptyset\}$, and thus the remote past of $X_t$ is uninformative, then $E[W_t \mid \mathscr{F}_{-\infty}^X] = E[W_t]$; hence, $W_t = 0$. However, the same result holds if all the remote events have either probability 0 or 1, as is easy to verify from the definition of conditional expectations with respect to a $\sigma$-algebra. This condition follows automatically from Kolmogorov's zero-one law if the $X_t$'s are independent (see Theorem 7.5 below), but for dependent processes this is not guaranteed. Nevertheless, for economic time