The regression-based approach to inference of a conditional independence graph $G$ in a Gaussian graphical model is motivated by the optimization problem in (7.37). Here we derive the solution to that optimization problem in the case of three variables.

Let $(X,Y,Z)^T$ be a Gaussian random vector with zero mean and covariance matrix $\Sigma$, where $\Sigma$ is defined as in Exercise 7.2. Consider the task of optimally predicting $Z$ as a linear combination of $X$ and $Y$, using mean-squared error as a cost function. That is, consider the optimization problem
$$
\min_{\beta_{ZX},\,\beta_{ZY}} \; \mathbb{E}\bigl[(Z - \beta_{ZX} X - \beta_{ZY} Y)^2\bigr]. \tag{7.62}
$$
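As a numerical sanity check (not part of the exercise), the population problem (7.62) can be approximated by ordinary least squares on a large sample drawn from $\Sigma$. The specific correlation values below are hypothetical, chosen only for illustration; the unit-variance form of $\Sigma$ is assumed from the expression that appears in part (a).

```python
import numpy as np

# Hypothetical correlations, for illustration only.
rho_xy, rho_zx, rho_zy = 0.3, 0.5, 0.4

# Covariance with unit variances, ordered (X, Y, Z) -- the form implied by part (a).
Sigma = np.array([[1.0,    rho_xy, rho_zx],
                  [rho_xy, 1.0,    rho_zy],
                  [rho_zx, rho_zy, 1.0]])

rng = np.random.default_rng(0)
samples = rng.multivariate_normal(np.zeros(3), Sigma, size=200_000)
X, Y, Z = samples[:, 0], samples[:, 1], samples[:, 2]

# Least-squares fit of Z on (X, Y): the empirical analogue of (7.62).
A = np.column_stack([X, Y])
beta_hat, *_ = np.linalg.lstsq(A, Z, rcond=None)
print(beta_hat)
```

With this many samples, `beta_hat` should agree closely with the population solution derived in part (b).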

a. Show that under our model assumptions the expectation in (7.62) takes the form
$$
1 + \beta_{ZX}^2 + \beta_{ZY}^2 - 2\beta_{ZX}\rho_{ZX} - 2\beta_{ZY}\rho_{ZY} + 2\beta_{ZX}\beta_{ZY}\rho_{XY}.
$$
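The expansion in part (a) is an instance of the quadratic-form identity $\mathbb{E}[(w^T V)^2] = w^T \Sigma w$ for $V = (X,Y,Z)^T$ and $w = (-\beta_{ZX}, -\beta_{ZY}, 1)^T$. A brief numerical check of that identity (with hypothetical correlation values, assumed unit variances):

```python
import numpy as np

# Hypothetical correlation values, for illustration; any valid correlations work.
rho_xy, rho_zx, rho_zy = 0.3, 0.5, 0.4
Sigma = np.array([[1.0,    rho_xy, rho_zx],
                  [rho_xy, 1.0,    rho_zy],
                  [rho_zx, rho_zy, 1.0]])   # ordered (X, Y, Z), unit variances

def mse(b_zx, b_zy):
    """E[(Z - b_zx*X - b_zy*Y)^2] computed as w^T Sigma w with w = (-b_zx, -b_zy, 1)."""
    w = np.array([-b_zx, -b_zy, 1.0])
    return w @ Sigma @ w

def expansion(b_zx, b_zy):
    """The closed form claimed in part (a)."""
    return (1.0 + b_zx**2 + b_zy**2
            - 2*b_zx*rho_zx - 2*b_zy*rho_zy + 2*b_zx*b_zy*rho_xy)

rng = np.random.default_rng(1)
for _ in range(5):
    a, b = rng.normal(size=2)
    assert np.isclose(mse(a, b), expansion(a, b))
```

The agreement holds for arbitrary coefficients because both sides are the same quadratic in $(\beta_{ZX}, \beta_{ZY})$.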

b. By differentiating the expression in part (a) with respect to $\beta_{ZX}$ and $\beta_{ZY}$ and setting the resulting expressions equal to zero, show that the vector $(\beta_{ZX}, \beta_{ZY})^T$ must satisfy the linear system of equations
$$
\begin{pmatrix} 1 & \rho_{XY} \\ \rho_{XY} & 1 \end{pmatrix}
\begin{pmatrix} \beta_{ZX} \\ \beta_{ZY} \end{pmatrix}
=
\begin{pmatrix} \rho_{ZX} \\ \rho_{ZY} \end{pmatrix}.
$$
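The stationarity conditions of part (b) can be verified numerically: solve the $2 \times 2$ system and confirm that the gradient of the quadratic from part (a) vanishes at the solution. Correlation values are again hypothetical placeholders.

```python
import numpy as np

rho_xy, rho_zx, rho_zy = 0.3, 0.5, 0.4   # hypothetical correlations

# Normal equations from part (b): [[1, rho_xy], [rho_xy, 1]] @ beta = (rho_zx, rho_zy)^T
M = np.array([[1.0, rho_xy], [rho_xy, 1.0]])
beta = np.linalg.solve(M, np.array([rho_zx, rho_zy]))

def grad(b):
    """Gradient of the part (a) quadratic with respect to (beta_ZX, beta_ZY)."""
    b_zx, b_zy = b
    return np.array([2*b_zx - 2*rho_zx + 2*b_zy*rho_xy,
                     2*b_zy - 2*rho_zy + 2*b_zx*rho_xy])

print(beta, grad(beta))
```

Since the objective is a strictly convex quadratic whenever $|\rho_{XY}| < 1$, this stationary point is the unique minimizer.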
