User:Egm6322.s09.Three.ge/MyReport2

=PDE's linear with respect to 2nd derivative, but still not linear in general=

L7

$$div[\bar{\kappa}\cdot grad\, u]+f(x,y,u,u_{x},u_{y})=0$$

$$\bar{\kappa}=\bar{\kappa}(x,y)$$

e.g.

$$div[\bar{\kappa}(x,y)\cdot grad\, u]+ ax^{2}+ by + \sqrt{u}+ (u_{x})^{4}+2(u_{y})^{2}=0$$

HW: Show that this PDE is linear with respect to the 2nd-order derivatives, but nonlinear as a whole.

Decompose the operator:

$$D_{1}(\cdot):=div[\bar{\kappa}(x,y)\cdot grad\,(\cdot)]$$ is linear.

$$D_{2}(\cdot):=D_{1}(\cdot)+(\cdot)^{\frac{1}{2}}+[(\cdot)_{x}]^{4}+2[(\cdot)_{y}]^{2}$$ is nonlinear.

$$D_{3}(\cdot):=D_{2}(\cdot)+ax^{2}+by$$ is nonlinear.
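As a quick numerical aside (a sketch, not part of the notes): the term $$\sqrt{u}$$ appearing in $$D_{2}$$ already fails additivity, which is enough to make the whole operator nonlinear.

```python
import math

# Check additivity of N(u) := sqrt(u), one of the nonlinear terms in D_2.
# If N were linear we would need N(u + v) == N(u) + N(v) for all u, v.
def N(u):
    return math.sqrt(u)

u, v = 1.0, 4.0
print(N(u + v))     # sqrt(5) ~ 2.236
print(N(u) + N(v))  # 1 + 2 = 3.0  -> additivity fails
```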

=PDE: Particular to General=

L8

2nd-order PDEs, linear in all orders, i.e. linear with respect to the 2nd-, 1st-, and zeroth-order terms.

General form:

$$au_{xx}+2bu_{xy}+cu_{yy}+du_{x}+eu_{y}+fu+g=0...(1)$$

The coefficients can be functions of x and y, or constants, in general. If they are functions of u, the PDE is quasilinear.

(Picture of the domain space again.) HW: Show that (1) is 2nd order and linear in all derivative orders.

Notation: { } denotes a column matrix.

$$ \begin{pmatrix} \partial _{x} & \partial_{y} \end{pmatrix} \begin{pmatrix} a & b\\ b & c \end{pmatrix}\begin{Bmatrix} \partial_{x}u\\ \partial_{y}u \end{Bmatrix}+\begin{pmatrix} d & e \end{pmatrix}\begin{Bmatrix} \partial_{x}u\\ \partial_{y}u \end{Bmatrix}+fu+g=0...(2)$$

Let $$au_{xx}+2bu_{xy}+cu_{yy}=\alpha$$ denote the 2nd-order (principal) part.

$$\partial_{x}=\frac{\partial}{\partial x}(\cdot)$$ $$\partial_{y}=\frac{\partial}{\partial y}(\cdot) $$

$$\begin{bmatrix} \partial_{x} & \partial_{y} \end{bmatrix}\begin{Bmatrix} au_{x}+bu_{y}\\ bu_{x}+cu_{y} \end{Bmatrix}=(au_{x})_{x}+(bu_{y})_{x}+(bu_{x})_{y}+(cu_{y})_{y}$$

Expanding by the product rule, and using $$u_{xy}=u_{yx}$$ (which holds for sufficiently smooth u):

$$[a_{x}u_{x}]+[b_{x}u_{y}]+[b_{y}u_{x}]+[...] $$

HW: Fill in the rest. Find the condition under which $$u_{xy}=u_{yx}$$ (continuity/smoothness of the mixed partials); include a rigorous proof.
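The symmetry of the mixed partials can be spot-checked symbolically; a sketch using SymPy (the test function is arbitrary, any C² function works):

```python
import sympy as sp

x, y = sp.symbols('x y')
u = sp.exp(x * y) * sp.sin(x + y)  # any smooth (C^2) test function

u_xy = sp.diff(u, x, y)  # differentiate w.r.t. x, then y
u_yx = sp.diff(u, y, x)  # differentiate w.r.t. y, then x

print(sp.simplify(u_xy - u_yx))  # 0: the mixed partials agree
```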

Eqn (2) reduces to Eqn (1) only if a, b, c are constants; d, e, f, g may still be functions of x and y. Eqn (2) is therefore more general.

HW: Rewrite eqn (1) p.8-1 so that it is equivalent to eqn (2) p.8-1.

$$(au_{x})_x+(bu_{x})_y+(bu_{y})_x+(cu_{y})_y+du_{x}+eu_{y}+fu+g=0...(3)$$

L9

Showing that equation (2) and equation (1) have the same form. Expanding (3) by the product rule:

$$[au_{xx}+a_{x}u_{x}]+[2bu_{xy}+b_{y}u_{x}+b_{x}u_{y}]+[cu_{yy}+c_{y}u_{y}]+du_{x}+eu_{y}+fu+g=0$$

=>

$$au_{xx}+2bu_{xy}+cu_{yy}+[d+a_{x}+b_{y}]u_{x}+[e+b_{x}+c_{y}]u_{y}+fu+g=0...(4)$$

Defining:

$$\overline{d}=[d+a_{x}+b_{y}]$$

$$\overline{e}=[e+b_{x}+c_{y}]$$

the format of equation 1 is recovered.
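The coefficient identities for $$\overline{d}$$ and $$\overline{e}$$ can be verified symbolically; a sketch using SymPy, with all coefficients taken as arbitrary functions of x and y:

```python
import sympy as sp

x, y = sp.symbols('x y')
u = sp.Function('u')(x, y)
a, b, c, d, e, f, g = [sp.Function(s)(x, y) for s in 'abcdefg']

# Divergence form, eqn (2)/(3):
lhs = (sp.diff(a*sp.diff(u, x), x) + sp.diff(b*sp.diff(u, x), y)
       + sp.diff(b*sp.diff(u, y), x) + sp.diff(c*sp.diff(u, y), y)
       + d*sp.diff(u, x) + e*sp.diff(u, y) + f*u + g)

# Form (1) with the modified coefficients d_bar, e_bar:
d_bar = d + sp.diff(a, x) + sp.diff(b, y)
e_bar = e + sp.diff(b, x) + sp.diff(c, y)
rhs = (a*sp.diff(u, x, 2) + 2*b*sp.diff(u, x, y) + c*sp.diff(u, y, 2)
       + d_bar*sp.diff(u, x) + e_bar*sp.diff(u, y) + f*u + g)

print(sp.simplify(lhs - rhs))  # 0: the two forms coincide
```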

HW: Using equation (4) as a hint, go from equation (1) [p. 8-1] to equation (2)

=Transformation of coordinates: Linear, nonlinear=

[picture of $$\Theta$$ rotation here]

Linear transformation: Rotation of coordinate axes.

$$\begin{Bmatrix} x\\ y \end{Bmatrix}=\underline{V} \begin{Bmatrix} \overline{x}\\ \overline{y} \end{Bmatrix}$$

HW: Find $$\underline{V}$$ for a rotation by $$\Theta$$.
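As a hedged check (the standard planar rotation matrix is assumed here as a candidate, since deriving $$\underline{V}$$ is the HW): a rotation should be a linear, orthogonal transformation, i.e. $$V^{T}V=I$$ with $$\det V=+1$$.

```python
import numpy as np

theta = 0.3  # arbitrary rotation angle
# Candidate rotation matrix (standard form; treat as an assumption to verify,
# not as the HW derivation itself).
V = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

print(np.allclose(V.T @ V, np.eye(2)))    # True: V is orthogonal
print(np.isclose(np.linalg.det(V), 1.0))  # True: proper rotation, det = +1
```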

L10

Report 1 problem: diffusion operator

$$\mathcal D (u):=div(\kappa \cdot grad\ u)$$

$$=\frac{\partial }{\partial x_{i}}(\kappa_{ij} \frac{\partial u}{\partial x_{j}})$$

Expanding:

$$\sum_{i=1}^{2} \sum_{j=1}^{2}\frac{\partial }{\partial x_{i}}(\kappa_{ij} \frac{\partial u}{\partial x_{j}})$$

$$\frac{\partial }{\partial x_{1}}(\kappa_{11} \frac{\partial u}{\partial x_{1}})+\frac{\partial }{\partial x_{1}}(\kappa_{12} \frac{\partial u}{\partial x_{2}})+\frac{\partial }{\partial x_{2}}(\kappa_{21} \frac{\partial u}{\partial x_{1}})+\frac{\partial }{\partial x_{2}}(\kappa_{22} \frac{\partial u}{\partial x_{2}})$$
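The double sum can be expanded mechanically; a sketch in SymPy confirming that it produces exactly the four terms above (the tensor components $$\kappa_{ij}$$ are left as arbitrary functions of position):

```python
import sympy as sp

x1, x2 = sp.symbols('x1 x2')
X = (x1, x2)
u = sp.Function('u')(x1, x2)
# General position-dependent 2x2 tensor kappa_ij:
k = [[sp.Function(f'k{i}{j}')(x1, x2) for j in (1, 2)] for i in (1, 2)]

# D(u) = d/dx_i ( kappa_ij du/dx_j ), summed over i, j = 1, 2
D = sum(sp.diff(k[i][j] * sp.diff(u, X[j]), X[i])
        for i in range(2) for j in range(2))

# The four terms written out by hand:
manual = (sp.diff(k[0][0]*sp.diff(u, x1), x1) + sp.diff(k[0][1]*sp.diff(u, x2), x1)
          + sp.diff(k[1][0]*sp.diff(u, x1), x2) + sp.diff(k[1][1]*sp.diff(u, x2), x2))

print(sp.simplify(D - manual))  # 0: the sum matches the expansion
```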

An operator $$\mathcal L$$ (here $$\mathcal D$$) is linear if, with $$\alpha$$ and $$\beta$$ constants:

$$\mathcal L(\alpha u+\beta v)=\alpha \mathcal L(u)+ \beta \mathcal L(v)$$

$$\mathcal D(\alpha u+\beta v)=\alpha \mathcal D(u)+ \beta \mathcal D(v)$$

$$\forall \ u,v: \ \Omega\rightarrow  \Re, \ \forall \ \alpha, \beta \ \in \Re$$

$$div[\underline{\kappa} \cdot grad(\alpha u+ \beta v)] $$

Gradient is linear.

$$grad(\alpha u+ \beta v)=\alpha grad \ u+\beta grad \ v$$

Matrix multiplication is a linear operation.

$$\underline{A}(\alpha \cdot \underline{x}+\beta \cdot \underline{y})= \alpha \underline{A} \cdot \underline{x}+ \beta \underline{A} \cdot \underline{y}$$

$$\underline{\kappa}\cdot grad(\alpha u+\beta v)= \alpha \underline{\kappa} \cdot grad\ u + \beta \underline{\kappa} \cdot grad\ v$$

Divergence is also linear: for vector fields $$a, b: \Omega \rightarrow \Re^{3}$$, $$div(\alpha a+\beta b)=\alpha\ div\ a+\beta\ div\ b$$. Combining the three facts above shows that $$\mathcal D$$ is linear.
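Putting the pieces together, the linearity of $$\mathcal D$$ itself can be checked symbolically (a sketch; $$\underline{\kappa}$$ is an arbitrary position-dependent 2x2 tensor):

```python
import sympy as sp

x, y, alpha, beta = sp.symbols('x y alpha beta')
u = sp.Function('u')(x, y)
v = sp.Function('v')(x, y)
# Arbitrary position-dependent 2x2 tensor kappa:
k = sp.Matrix([[sp.Function(f'k{i}{j}')(x, y) for j in (1, 2)] for i in (1, 2)])

def D(w):
    """Diffusion operator D(w) = div(kappa . grad w) in 2D."""
    flux = k * sp.Matrix([sp.diff(w, x), sp.diff(w, y)])
    return sp.diff(flux[0], x) + sp.diff(flux[1], y)

lhs = D(alpha * u + beta * v)
rhs = alpha * D(u) + beta * D(v)
print(sp.simplify(lhs - rhs))  # 0, so D is linear
```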

=Definition of a linear operator=

Additivity and homogeneity.
 * 1) Additive: $$\begin{matrix} \forall \ u,v: \ \Omega\rightarrow  \Re\\ \mathcal L(u+v)=\mathcal L(u) + \mathcal L(v) \end{matrix}$$
 * 2) Homogeneous: $$\begin{matrix} \forall \alpha \in \Re, \ \forall u: \Omega \rightarrow \Re\\ \mathcal L (\alpha u)=\alpha \mathcal L(u) \end{matrix}$$

Equivalence between $$\mathcal L (\alpha u+ \beta v)=\alpha \mathcal L(u)+\beta \mathcal L(v)$$ and the two properties above:

To obtain additivity, set $$\alpha=\beta=1$$. To obtain homogeneity, set $$\beta=0$$.

From the homogeneity of $$\mathcal L(\cdot)$$, select either $$\alpha=0$$ or $$u=0$$:

Image of zero under $$\mathcal L$$ is zero if $$\mathcal L$$  is a linear operator.

$$\mathcal L(0)=0$$