User:Egm6341.s11.team4.hylon/hw5

=Problem 5.2 Run Linear State Space Model without Noise, with Gaussian Random Noise, and with Cauchy Random Noise=

Problem Statement
Refer to lecture notes [[media:nm1.s11.mtg25.djvu|25-2]] and [[media:nm1.s11.mtg25.djvu|26-1]] for detailed description

Given
Linear State Space Model without Random Noise (LSSM):
$$\displaystyle
{{x}_{k+1}}=F{{x}_{k}}
$$

Linear State Space Model with Random Noise (LSSMRN):
$$\displaystyle
{{x}_{k+1}}=F{{x}_{k}}+G{{w}_{k+1}}.
$$

For this specific case, we choose
$$\displaystyle
{{F}_{n\times n}}={{I}_{n\times n}}+\Delta {{A}_{n\times n}},\quad n=2
$$

with $$I=\left( \begin{matrix} 1 & 0 \\ 0 & 1 \\ \end{matrix} \right)$$, $$A=\left( \begin{matrix} -0.2 & 1 \\ -1 & -0.2 \\ \end{matrix} \right)$$, $$\Delta =0.02$$, and $${{x}_{0}}=\left( \begin{matrix} 3 \\ -2 \\ \end{matrix} \right)$$.

Objective
1). Run the LSSM and plot $${{x}_{j}},j=0,1,2,\cdots$$ in the space $$({{x}^{1}},{{x}^{2}})$$.

2). Find the equilibrium point as

$$\displaystyle
\underset{k\to \infty }{\mathop{\lim }}\,{{x}_{k+1}}=\underset{k\to \infty }{\mathop{\lim }}\,{{F}^{k+1}}{{x}_{0}}=:\hat{x}
$$

Plot $$\hat{x}$$ as a big red dot and $${{x}_{0}}$$ as a big blue dot in the same plot as the small dots $${{x}_{j}},j=1,2,\cdots$$.

3). Let $$G=\left( \begin{matrix} 1 \\ 1 \\ \end{matrix} \right)\alpha $$, so the noise vector $${{w}_{k+1}}=({{w}_{k+1}})$$ reduces to a scalar. Use the Matlab command randn to generate $$({{w}_{j}},j=0,1,2,\cdots )$$. Plot $${{x}_{j}},j=0,1,2,\cdots$$ for $$\alpha =0.5,1,2$$.

4). Same as 3)., but with Cauchy random noise. Hint: find a Matlab command to generate the angles $${{\theta }_{j}},j=0,1,2,\cdots$$ as in single-slit diffraction.
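The LSSMRN iteration of objective 3) can be sketched as follows. This is a minimal Python stand-in for the intended Matlab script (random.gauss replaces randn; the seed and the 1000-step count are arbitrary choices for illustration):

```python
import random

random.seed(0)                              # arbitrary seed for reproducibility
delta = 0.02
A = [[-0.2, 1.0], [-1.0, -0.2]]
F = [[1.0 + delta*A[0][0], delta*A[0][1]],  # F = I + Delta*A
     [delta*A[1][0], 1.0 + delta*A[1][1]]]
for alpha in (0.5, 1.0, 2.0):
    G = [alpha, alpha]                      # G = [1; 1] * alpha
    x = [3.0, -2.0]                         # x_0
    traj = [tuple(x)]
    for _ in range(1000):
        w = random.gauss(0.0, 1.0)          # w_{k+1}, standing in for randn
        x = [F[0][0]*x[0] + F[0][1]*x[1] + G[0]*w,
             F[1][0]*x[0] + F[1][1]*x[1] + G[1]*w]
        traj.append(tuple(x))
    print(alpha, len(traj))                 # traj holds the points to plot
```

Each `traj` is the list of states $$(x^{1},x^{2})$$ to scatter-plot for that value of $$\alpha$$.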

2). Find equilibrium point and run LSSM

Since the eigenvalues of $$F$$ are $$0.996\pm 0.02i$$, with modulus $$\sqrt{{{0.996}^{2}}+{{0.02}^{2}}}\approx 0.9962<1$$, the powers of $$F$$ decay to zero and the trajectory spirals into the origin:

$$\displaystyle
\hat{x}=\underset{k\to \infty }{\mathop{\lim }}\,{{F}^{k+1}}{{x}_{0}}=\underset{k\to \infty }{\mathop{\lim }}\,{{\left( \begin{matrix} 0.996 & 0.02 \\ -0.02 & 0.996 \\ \end{matrix} \right)}^{k+1}}\left( \begin{matrix} 3 \\ -2 \\ \end{matrix} \right)=\left( \begin{matrix} 0 \\ 0 \\ \end{matrix} \right)
$$
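This limit can be checked numerically with a short iteration (a Python sketch rather than the course's Matlab; the 5000-step cutoff is an arbitrary choice):

```python
delta = 0.02
A = [[-0.2, 1.0], [-1.0, -0.2]]
F = [[1.0 + delta*A[0][0], delta*A[0][1]],   # F = [[0.996, 0.02],
     [delta*A[1][0], 1.0 + delta*A[1][1]]]   #      [-0.02, 0.996]]
x = [3.0, -2.0]                              # x_0
for _ in range(5000):                        # x_{k+1} = F x_k
    x = [F[0][0]*x[0] + F[0][1]*x[1],
         F[1][0]*x[0] + F[1][1]*x[1]]
print(x)  # both components shrink below 1e-6 in magnitude: x_k -> (0, 0)
```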

4). Run LSSM with Cauchy Random Noise
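Following the single-slit hint, a standard Cauchy sample is obtained as $$w=\tan \theta$$ with $$\theta$$ uniform on $$(-\pi /2,\pi /2)$$ (Matlab equivalent: tan(pi*(rand-0.5))). A minimal Python sketch; the seed, the choice $$\alpha =0.5$$, and the step count are arbitrary:

```python
import math
import random

random.seed(0)                        # arbitrary seed for reproducibility
def cauchy():
    theta = math.pi * (random.random() - 0.5)   # uniform on (-pi/2, pi/2)
    return math.tan(theta)                      # standard Cauchy sample

F = [[0.996, 0.02], [-0.02, 0.996]]   # F = I + Delta*A
alpha = 0.5                           # repeat for alpha = 1 and 2
G = [alpha, alpha]                    # G = [1; 1] * alpha
x = [3.0, -2.0]                       # x_0
traj = [tuple(x)]
for _ in range(1000):
    w = cauchy()                      # heavy-tailed: occasional huge jumps
    x = [F[0][0]*x[0] + F[0][1]*x[1] + G[0]*w,
         F[1][0]*x[0] + F[1][1]*x[1] + G[1]*w]
    traj.append(tuple(x))
print(len(traj))  # 1001 states to plot in the (x^1, x^2) plane
```

Unlike the Gaussian case, the Cauchy noise has no finite variance, so the plotted trajectory shows rare but very large excursions.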


=Problem 5.8 Proof of Identities=

Given
Refer to Lecture notes [[media:nm1.s11.mtg30.djvu|30-2]] and [[media:nm1.s11.mtg30.djvu|30-3]] for detailed description.

Objective
Show that


$$\displaystyle
E_{n}^{T}=\frac{h}{2}\sum\limits_{k=0}^{n-1}{[\int\limits_{-1}^{+1}{{{g}_{k}}(t)dt}-\{{{g}_{k}}(-1)+{{g}_{k}}(+1)\}]}
$$


$$\displaystyle
\int\limits_{-1}^{+1}{(-t)g_{k}^{(1)}(t)dt}=\int\limits_{-1}^{+1}{{{g}_{k}}(t)dt}-[{{g}_{k}}(-1)+{{g}_{k}}(+1)]
$$


$$\displaystyle
g_{k}^{(i)}(t)={{(\frac{h}{2})}^{i}}{{f}^{(i)}}(x(t))
$$

1). Function transformation

$$\displaystyle
\begin{align} & E_{n}^{T}=I-{{I}_{n}}=\int\limits_{a}^{b}{f(x)dx}-{{T}_{0}}(n) \\ & =\sum\limits_{k=0}^{n-1}{[\int\limits_{{{x}_{k}}}^{{{x}_{k+1}}}{f(x)dx}-\frac{h}{2}\{f({{x}_{k}})+f({{x}_{k+1}})\}]} \\ \end{align}
$$

where $${{x}_{k}}:=a+kh,\ h=(b-a)/n$$. Now we transform each $$\int\limits_{{{x}_{k}}}^{{{x}_{k+1}}}{f(x)dx}$$ into an integral of $$f(x(t))$$ over the reference interval $$[-1,+1]$$.
$$\displaystyle
\begin{align} & x(t)=\frac{{{x}_{k+1}}-{{x}_{k}}}{2}t+\frac{{{x}_{k+1}}+{{x}_{k}}}{2} \\ & =\frac{h}{2}t+\frac{{{x}_{k+1}}+{{x}_{k}}}{2} \\ \end{align}
$$

From this transformation it is obvious that
$$\displaystyle
\left\{ \begin{matrix} x(-1)={{x}_{k}} \\ x(0)=\frac{{{x}_{k+1}}+{{x}_{k}}}{2} \\ x(+1)={{x}_{k+1}} \\ dx=\frac{h}{2}dt \\ \end{matrix} \right.
$$

Hence we have:
$$\displaystyle
\begin{align} & E_{n}^{T}=\sum\limits_{k=0}^{n-1}{[\int\limits_{-1}^{+1}{f(x(t))\frac{h}{2}dt}-\frac{h}{2}\{f(x(-1))+f(x(+1))\}]} \\ & =\frac{h}{2}\sum\limits_{k=0}^{n-1}{[\int\limits_{-1}^{+1}{f(x(t))dt}-\{f(x(-1))+f(x(+1))\}]} \\ \end{align}
$$

If we define $${{g}_{k}}(t)=f(x(t))$$ for $$x\in [{{x}_{k}},{{x}_{k+1}}]$$, then we have
$$\displaystyle
E_{n}^{T}=\frac{h}{2}\sum\limits_{k=0}^{n-1}{[\int\limits_{-1}^{+1}{{{g}_{k}}(t)dt}-\{{{g}_{k}}(-1)+{{g}_{k}}(+1)\}]}
$$
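As a sanity check of this identity, the Python sketch below evaluates both sides for the test case $$f(x)={{e}^{x}}$$ on $$[0,1]$$ with $$n=4$$ (arbitrary choices), using the exact antiderivative for the subinterval integrals and noting that $$\int_{-1}^{+1}{{{g}_{k}}(t)dt}=\frac{2}{h}\int_{{{x}_{k}}}^{{{x}_{k+1}}}{f(x)dx}$$:

```python
import math

a, b, n = 0.0, 1.0, 4          # arbitrary test interval and panel count
h = (b - a) / n
f = math.exp                   # test integrand; its antiderivative is also exp
xs = [a + k*h for k in range(n + 1)]

# Left side: E_n^T = I - I_n (exact integral minus trapezoidal rule)
I_exact = math.exp(b) - math.exp(a)
I_trap  = sum(h/2 * (f(xs[k]) + f(xs[k+1])) for k in range(n))
lhs = I_exact - I_trap

# Right side: (h/2) * sum_k [ int_{-1}^{+1} g_k(t) dt - (g_k(-1) + g_k(+1)) ]
rhs = sum((h/2) * ((2/h)*(math.exp(xs[k+1]) - math.exp(xs[k]))
                   - (f(xs[k]) + f(xs[k+1])))
          for k in range(n))
print(abs(lhs - rhs))  # agrees to machine roundoff
```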

2). Integration by parts
Integrating by parts, we have
$$\displaystyle
\begin{align} & \int\limits_{-1}^{+1}{(-t)g_{k}^{(1)}(t)dt}=[(-t){{g}_{k}}(t)]_{t=-1}^{t=+1}-\int\limits_{-1}^{+1}{{{g}_{k}}(t)d(-t)} \\ & =\int\limits_{-1}^{+1}{{{g}_{k}}(t)dt}-[{{g}_{k}}(-1)+{{g}_{k}}(+1)] \\ \end{align}
$$
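The identity can be verified numerically for a concrete choice of $${{g}_{k}}$$. In the Python sketch below, the cubic test function and the panel count of the composite trapezoidal quadrature are arbitrary choices; both sides should come out equal (to $$-8/3$$ for this test function) up to quadrature error:

```python
# Hypothetical test function g and its derivative g'
g  = lambda t: t**3 + 2*t**2 + 1
dg = lambda t: 3*t**2 + 4*t

def integrate(fn, a, b, n=20000):
    """Composite trapezoidal approximation of int_a^b fn(t) dt."""
    h = (b - a) / n
    return h * (0.5*(fn(a) + fn(b)) + sum(fn(a + k*h) for k in range(1, n)))

lhs = integrate(lambda t: -t * dg(t), -1.0, 1.0)     # int (-t) g'(t) dt
rhs = integrate(g, -1.0, 1.0) - (g(-1.0) + g(1.0))   # int g dt - [g(-1)+g(+1)]
print(abs(lhs - rhs))  # both sides equal -8/3 up to quadrature error
```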

3). Successive differentiation
Since we have already defined $${{g}_{k}}(t)=f(x(t))$$ for $$x\in [{{x}_{k}},{{x}_{k+1}}]$$, and the transformation gives $$dx=\frac{h}{2}dt$$, successively differentiating $${{g}_{k}}(t)$$ with respect to $$t$$ (each application of the chain rule contributes one constant factor $$\frac{dx}{dt}=\frac{h}{2}$$, since $$x(t)$$ is linear) yields
$$\displaystyle
g_{k}^{(i)}(t)={{f}^{(i)}}(x(t))\cdot {{\left( \frac{dx}{dt} \right)}^{i}}={{(\frac{h}{2})}^{i}}{{f}^{(i)}}(x(t))
$$
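A quick finite-difference check of this scaling for $$i=2$$, sketched in Python with a hypothetical panel width $$h$$, midpoint $$c$$, and test function $$f=\sin$$:

```python
import math

h, c = 0.1, 0.35                 # hypothetical panel width and midpoint
f = math.sin
g = lambda t: f(0.5*h*t + c)     # g_k(t) = f(x(t)) with x(t) = (h/2) t + c

t, eps = 0.2, 1e-3
g2 = (g(t + eps) - 2*g(t) + g(t - eps)) / eps**2     # central-difference g''(t)
expected = (0.5*h)**2 * (-math.sin(0.5*h*t + c))     # (h/2)^2 f''(x(t))
print(abs(g2 - expected))  # small: the second derivative scales by (h/2)^2
```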