PSTAT 174/274: Homework # 1.
This homework is based on Lectures 1–2. Please study the material of week 1 before starting to work on these problems. Good luck!
1. Understanding deterministic and stochastic trends. You are given the following statements about time
series:
I. Stochastic trends are characterized by explainable changes in direction.
II. Deterministic trends are better suited to extrapolation than stochastic trends.
III. Deterministic trends are typically attributed to high serial correlation with random error.
Determine which statements are false. Explain.
A. I only B. II only C. III only D. I, II, and III
E. The answer is not given by (A), (B), (C), or (D).
2. Random walk and stationarity. In this question we introduce a random walk with non-zero mean.
A random walk is expressed as X1 = Z1, Xt = Xt−1 + Zt, t = 2, 3, . . . , where Zt ∼ WN(µZ, σZ²), that is,
E(Zt) = µZ, Var(Zt) = σZ², and Cov(Zt, Zs) = 0 for t ≠ s. Determine which statements are true with
respect to a random walk model; show calculations and provide complete explanations.
I. If µZ ≠ 0, then the random walk is nonstationary in the mean.
(Hint: Nonstationary in the mean means that the mean changes with time.)
II. If σZ² = 0, then the random walk is nonstationary in the variance.
(Hint: Nonstationary in the variance means that the variance changes with time.)
III. If σZ² > 0, then the random walk is nonstationary in the variance.
3. Calculation of sample acf. You are given the following stock prices of company CAS:
Day Stock Price
1 538
2 548
3 528
4 608
5 598
6 589
7 548
8 514
9 501
10 498
Calculate the sample autocorrelation at lag 3.
Hints:
(i) We are given a sample of size n = 10 to estimate the autocorrelation at lag 3: ρ(3) = Cor(X1, X4) = γ(3)/γ(0);
– for the definition of autocorrelation at lag 3 see Week 1 slide 52 or (2.1.3) on p. 6 of Lecture Notes.
(ii) General formulas for calculating the sample mean and covariance are given on slide 38 of week 1 and in
§1.2 on p. 4 of Lecture notes for week 1. To estimate ρ(3) = Cor(X1, X4) we have
ρ̂(3) = γ̂(3)/γ̂(0), where γ̂(3) = (1/n) Σ_{t=1}^{n−3} (Xt − X̄)(Xt+3 − X̄), γ̂(0) = (1/n) Σ_{t=1}^{n} (Xt − X̄)², and X̄ is the sample mean.
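Once the hand calculation is done, one possible cross-check in R (the vector below is the stock-price column from the table above):
x <- c(538, 548, 528, 608, 598, 589, 548, 514, 501, 498)
acf(x, lag.max = 3, plot = FALSE)   # sample ACF at lags 0, 1, 2, 3
xbar <- mean(x)                     # or compute rho_hat(3) directly from the formula:
sum((x[1:7] - xbar) * (x[4:10] - xbar)) / sum((x - xbar)^2)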
4. Polyroot command in R. Recall from algebra that a function f(z) = anz^n + an−1z^(n−1) + . . . + a1z + a0
is called a polynomial function of order n. Roots of a polynomial function f are solutions of the equation
f(z) = 0. Roots of a quadratic equation ax² + bx + c = 0 are given by the formula x1,2 = (−b ± √(b² − 4ac))/(2a).
Let f(z) = 1 − 2z and g(z) = 1 − 0.45z + 0.05z². Find their roots, show calculations. Check your answers
using the R command polyroot:
> polyroot(c(1, -2))
> polyroot(c(1, -0.45, 0.05))
(Do not forget to include your output!)
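As an optional sanity check next to the polyroot calls above, the quadratic-formula roots of g can be computed directly; here a = 0.05, b = −0.45, c = 1 are the coefficients of g written as az² + bz + c (the name cc below just avoids shadowing R's c()):
a <- 0.05; b <- -0.45; cc <- 1
(-b + c(-1, 1) * sqrt(b^2 - 4 * a * cc)) / (2 * a)   # roots of g from the quadratic formula
polyroot(c(1, -0.45, 0.05))                          # same roots, returned as complex numbers
polyroot(c(1, -2))                                   # root of f(z) = 1 - 2z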
5. Model identification. You are given the following information about an MA(1) model with coefficient
|θ1| < 1: ρ1 = −0.4, ρk = 0 for k = 2, 3, . . . . Determine the value of θ1.
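After solving for θ1 by hand, one way to sanity-check the value numerically is R's ARMAacf (which uses the parameterization Xt = Zt + θ1Zt−1); the θ below is only a placeholder, not the solution:
theta <- 0.3                        # placeholder: replace with your solved value of theta_1
ARMAacf(ma = theta, lag.max = 3)    # for the correct theta_1, the lag-1 value should equal -0.4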
6. Gaussian White Noise and its square. Let {Zt} be a Gaussian white noise, that is, a sequence of i.i.d.
normal r.v.s each with mean zero and variance 1. Let Yt = Zt².
(a) Using R generate 300 observations of the Gaussian white noise Z. Plot the series and its acf.
(b) Using R, plot 300 observations of the series Yt = Zt². Plot its acf.
(c) Analyze graphs from (a) and (b).
– Can you see a difference between the plots of the time series Z and Y? From the graphs, would
you conclude that both series are stationary (or not)?
– Is there a noticeable difference in the plots of the acf functions ρZ and ρY? Would you describe Y as a
non-Gaussian white noise sequence based on your plots?
Provide a full analysis of your conclusions.
(d) Calculate the second-order moments of Y: µY(t) = E(Yt), σY²(t) = Var(Yt), and
ρY(t, t + h) = Cor(Yt, Yt+h). Do your calculations support your observations in (c)?
Hints: (i) Slides 65 and 68 of week 1 have R commands to generate MA(1) time series. White noise is an
MA(1) process with coefficient θ1 = 0. Here is more direct code to generate WN {Zt} ∼ N(0, 1):
Z <- rnorm(300)
plot.ts(Z, xlab = "", ylab = "")
acf(Z, main = "ACF")
(ii) Useful for part (d): For X ∼ N(0, σ²), E(X⁴) = 3(σ²)².
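For part (b), the same pattern as in hint (i) carries over to the squared series (assuming the vector Z generated by the code in hint (i)):
Y <- Z^2
plot.ts(Y, xlab = "", ylab = "")
acf(Y, main = "ACF of Y")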
The following two problems are for students enrolled in PSTAT 274 ONLY
G1. Let {Zt} be Gaussian white noise, i.e. {Zt} is a sequence of i.i.d. normal r.v.s each with mean zero
and variance 1. Define
Xt = Zt, if t is even;
Xt = ((Zt−1)² − 1)/√2, if t is odd.
Show that {Xt} is WN(0, 1) (that is, variables Xt and Xt+k, k ≥ 1, are uncorrelated with mean zero and
variance 1) but that Xt and Xt−1 are not i.i.d.
G2. If {Xt} and {Yt} are uncorrelated stationary sequences, i.e., if Xr and Ys are uncorrelated for every r
and s, show that {Xt+Yt} is stationary with autocovariance function equal to the sum of the autocovariance
functions of {Xt} and {Yt}.
