User:Egm6321.f12.team3/report2

= R*2.1 Verification that homogeneous solutions satisfy O.D.E. = I solved it on my own

Problem
Verify that: $$ L_2(y^1_H)=L_2(y^2_H)=0 $$

Given
$$L_2$$ is the Legendre differential equation / operator:

$$ \displaystyle L_2(\cdot)=(1-x^2)\frac{d^2(\cdot)}{dx^2}-2x\frac{d(\cdot)}{dx}+n(n+1)(\cdot) $$

For the case where n=1: $$ \displaystyle L_2(\cdot)=(1-x^2)\frac{d^2(\cdot)}{dx^2}-2x\frac{d(\cdot)}{dx}+2(\cdot)$$

Two linearly-independent homogeneous solutions:

$$ y^1_H(x) = x $$

$$ \displaystyle y^2_H(x) = \frac{x}{2}\ln(\frac{1+x}{1-x})-1 $$

Solution
$$ \displaystyle \frac{d^2(x)}{dx^2} = 0 $$

$$ \displaystyle \frac{d(x)}{dx} = 1 $$

$$ \displaystyle L_2(y^1_H)=L_2(x)= -2x + 2x = 0 $$

$$ L_2(y^1_H) = 0 $$

Apply the Product Rule to take the derivative of $$\displaystyle \frac{x}{2}\ln(\frac{1+x}{1-x})$$

$$ \displaystyle \frac{d}{dx}(f(x)g(x)) = f'(x)g(x)+f(x)g'(x) $$

The argument of the natural logarithm is also a function, so we must apply the chain rule. The argument is a ratio of two functions, so the quotient rule is used to differentiate it.

$$ \displaystyle f(x) = \frac{g(x)}{h(x)} \rightarrow f'(x) = \frac{g'(x)h(x)-g(x)h'(x)}{[h(x)]^2} $$

Now to find the first and second derivatives of $$ y^2_H $$ with respect to x:

$$ \displaystyle \frac{d}{dx}\left[\frac{x}{2}\ln\left(\frac{1+x}{1-x}\right)-1\right] = \frac{1}{2}\ln\left(\frac{1+x}{1-x}\right) + \frac{x}{1-x^2} $$

$$ \displaystyle \frac{d^2}{dx^2}\left[\frac{x}{2}\ln\left(\frac{1+x}{1-x}\right)-1\right]= \frac{d}{dx}\left[\frac{1}{2}\ln\left(\frac{1+x}{1-x}\right) + \frac{x}{1-x^2}\right] $$

Apply the Quotient Rule to take the derivative of the second term:

$$ \displaystyle \frac{d}{dx}\left(\frac{x}{1-x^2}\right) = \frac{(1-x^2)+2x^2}{(1-x^2)^2} = \frac{1+x^2}{(1-x^2)^2} $$

Now put it all together:

$$ \displaystyle L_2(y^2_H) = (1-x^2)\left[\frac{1}{1-x^2}+\frac{1+x^2}{(1-x^2)^2}\right] - 2x\left[\frac{1}{2}\ln\left(\frac{1+x}{1-x}\right)+\frac{x}{1-x^2}\right] + 2\left[\frac{x}{2}\ln\left(\frac{1+x}{1-x}\right)-1\right] $$

The natural-logarithm terms cancel, and the terms with a common denominator sum to 1: $$ \displaystyle \frac{1+x^2}{1-x^2} + \frac{-2x^2}{1-x^2} = \frac{1+x^2-2x^2}{1-x^2} = \frac{1-x^2}{1-x^2} = 1 $$

$$ \displaystyle L_2(y^2_H) = 1 + 1 -2 = 0 $$

$$ \displaystyle L_2(y^2_H) = 0 $$

Hence it is proved that $$ \displaystyle L_2(y^1_H) = L_2(y^2_H) = 0 $$
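As a supplementary numerical cross-check (not part of the original derivation), both homogeneous solutions can be substituted into $$ L_2 $$ at sample points in $$(-1,1)$$, using the derivatives computed above:

```python
import math

def L2(y, dy, d2y, x):
    # Legendre operator for n = 1: (1 - x^2) y'' - 2x y' + 2y
    return (1 - x**2) * d2y(x) - 2 * x * dy(x) + 2 * y(x)

# y1_H(x) = x
y1, dy1, d2y1 = (lambda x: x), (lambda x: 1.0), (lambda x: 0.0)

# y2_H(x) = (x/2) ln((1+x)/(1-x)) - 1, with the derivatives found above
y2 = lambda x: x / 2 * math.log((1 + x) / (1 - x)) - 1
dy2 = lambda x: 0.5 * math.log((1 + x) / (1 - x)) + x / (1 - x**2)
d2y2 = lambda x: 1 / (1 - x**2) + (1 + x**2) / (1 - x**2)**2

for x in (-0.9, -0.5, 0.1, 0.5, 0.9):
    assert abs(L2(y1, dy1, d2y1, x)) < 1e-12
    assert abs(L2(y2, dy2, d2y2, x)) < 1e-12
```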

=R*2.2 Verification of Solution of a Differential Equation= I solved it on my own

Problem
Verify that $$ \displaystyle p(x)=k_1e^{-x}+x-1$$ is indeed the solution for $$ \displaystyle \dot p +p=x $$

Given
$$ \displaystyle \dot p = \frac {d}{dx}(p(x))$$

Solution
$$ \displaystyle \frac{d}{dx}p(x)= \dot p =\frac{d}{dx}(k_1e^{-x}+x-1) $$

$$ \displaystyle=\frac{d}{dx}(k_1e^{-x})+\frac{d}{dx}(x)-\frac{d}{dx}(1) = -k_1e^{-x}+1 $$

Now, substituting $$ \dot p $$ and $$ p(x) $$ into the left-hand side of the differential equation,

$$\displaystyle \dot p +p=(-k_1e^{-x}+1)+(k_1e^{-x}+x-1) = x $$

Since $$\displaystyle \dot p + p = x $$ for $$\displaystyle p(x)=k_1e^{-x}+x-1 $$, $$ p(x) $$ is indeed a solution of the given differential equation.
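A quick numerical sanity check of this verification (a sketch, with an arbitrary value assumed for the integration constant $$ k_1 $$):

```python
import math

k1 = 2.5  # arbitrary constant of integration

def p(x):
    return k1 * math.exp(-x) + x - 1

def p_dot(x):
    # derivative computed above: -k1 e^{-x} + 1
    return -k1 * math.exp(-x) + 1

# residual of p_dot + p - x should vanish for every x
for x in (-1.0, 0.0, 0.5, 2.0):
    assert abs(p_dot(x) + p(x) - x) < 1e-12
```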

=R*2.3 Showing that the particular class of N1-ODEs is affine in y'= I solved it on my own

Problem
Show that:
* (Eq.3.2) is affine in y', that (Eq.3.2) is in general an N1-ODE, but that (Eq.3.2) is not the most general N1-ODE as represented by (Eq.3.1).
* Give an example of a more general N1-ODE.

Solution
To prove that (Eq.3.2) is affine in y', we need to show that it can be written as the sum of (Eq.3.4) and (Eq.3.5).

Since $$ \displaystyle \alpha, \beta $$ can be selected arbitrarily, (Eq.3.7) follows; therefore (Eq.3.2) is affine in y'. (Eq.3.1) is a more general N1-ODE.
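Since the supporting equations are not reproduced above, here is an assumed concrete illustration of the distinction, taking the particular class (Eq.3.2) to have the standard form $$ f(x,y)\,y' + g(x,y) = 0 $$:

```latex
% Particular class (affine in y'): linear in y' plus a term free of y'
f(x,y)\,y' + g(x,y) = 0
% A more general N1-ODE, F(x, y, y') = 0, that is quadratic in y'
% and hence NOT affine in y':
(y')^2 + x\,y' + y = 0
```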

= R*2.4 Linear Independence of Two Homogeneous Solutions of Legendre Differential Equation= I solved it on my own

Given
Two Homogeneous Solutions of Legendre Differential Equation (order n=1),

Problem
Show that $$ \displaystyle y^1_H(\cdot) \ne \alpha\, y^2_H(\cdot) \ \forall \alpha \in \mathbb R $$; in other words, prove that for every $$ \alpha \in \mathbb R $$ there exists $$ \displaystyle \hat x $$ such that $$ \displaystyle y^1_H(\hat x) \ne \alpha\, y^2_H(\hat x). $$

Also, plot the two homogeneous solutions.

Proof
Let us first consider special cases and then move towards the general case. We assign to the variable x the values 0, 1, and -1, then any real number greater than 1 (i.e., in $$ \displaystyle (1,\infty)$$) and any real number less than -1 (i.e., in $$ \displaystyle (-\infty,-1)$$), and evaluate both functions at these points.

Case 1:

$$ \displaystyle x=0 $$

$$ \displaystyle y^1_H(0)=0 $$

$$ \displaystyle y^2_H(0)=\frac{0}{2}\ln\left(\frac{1+0}{1-0}\right)-1=-1 $$

Clearly, the two values are not equal, and no value of $$ \displaystyle \alpha $$ can satisfy $$ \displaystyle y^2_H(0) = \alpha\, y^1_H(0) $$, since $$ \displaystyle \alpha $$ is multiplied by zero.

Case 2:

$$ \displaystyle x=1 $$

$$ \displaystyle y^1_H(1)=1 $$

$$ \displaystyle y^2_H(1)=\frac{1}{2}\ln\left(\frac{2}{0}\right)-1 \rightarrow +\infty $$

Again, the two values are not equal, and no value of $$ \displaystyle \alpha $$ can equate them, since $$ \displaystyle \alpha $$ would be multiplied by infinity, which is absurd.

Case 3:

$$ \displaystyle x=-1 $$

$$ \displaystyle y^1_H(-1)=-1 $$

$$ \displaystyle y^2_H(-1)=\frac{-1}{2}\ln\left(\frac{0}{2}\right)-1 \rightarrow +\infty $$

Although the logarithm of 0 is undefined, its one-sided limit is negative infinity, which is used here symbolically.

Hence, the two values are not equal, and no value of $$ \displaystyle \alpha $$ can equate them, since $$ \displaystyle \alpha $$ would be multiplied by an infinite quantity, which is again absurd.

Case 4:

x is any real number a greater than 1, where $$ \displaystyle a \in (1,\infty)$$.

$$ \displaystyle y^1_H(a)=a, \qquad y^2_H(a)=\frac{a}{2}\ln\left(\frac{1+a}{1-a}\right)-1 $$

$$ y^1_H(a) $$ is a real value, but in $$ y^2_H(a) $$ the argument of the logarithm is negative (since $$ 1-a<0 $$), so the logarithm is a complex number (a number with real and imaginary parts). Hence the two values are not equal, and no REAL value of $$ \displaystyle \alpha $$ can equate them, since $$ \displaystyle \alpha $$ would be multiplied by a complex number.

Case 5:

x is any real number -b less than -1 (the negative sign is retained just to drive home the point more conveniently, so (-b) as a whole should be treated as a negative number), where $$ \displaystyle -b \in (-\infty,-1)$$.

$$ \displaystyle y^1_H(-b)=-b, \qquad y^2_H(-b)=\frac{-b}{2}\ln\left(\frac{1-b}{1+b}\right)-1 $$

$$ y^1_H(-b) $$ is a real value, but in $$ y^2_H(-b) $$ the argument of the logarithm is negative (since $$ 1-b<0 $$ and $$ 1+b>0 $$), so the logarithm is a complex number. Hence the two values are not equal, and no REAL value of $$ \displaystyle \alpha $$ can equate them, since $$ \displaystyle \alpha $$ would be multiplied by a complex number.

The last two cases (the general cases) clearly show that there exists no value of $$ \displaystyle \alpha \in \mathbb R $$ that can make the two homogeneous solutions dependent, i.e., scalar multiples of one another.

Hence, $$ \displaystyle y^1_H(\cdot) \ne \alpha y^2_H(\cdot), \forall \alpha \in \mathbb R. $$
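Linear independence can also be cross-checked through the Wronskian (a supplementary check, not part of the original case analysis): a direct computation gives $$ \displaystyle W(x) = y^1_H\,(y^2_H)' - y^2_H\,(y^1_H)' = \frac{1}{1-x^2} $$, which never vanishes on $$(-1,1)$$:

```python
import math

def y1(x):
    return x

def y2(x):
    return x / 2 * math.log((1 + x) / (1 - x)) - 1

def dy2(x):
    return 0.5 * math.log((1 + x) / (1 - x)) + x / (1 - x**2)

def wronskian(x):
    # W = y1 * y2' - y2 * y1', with y1' = 1
    return y1(x) * dy2(x) - y2(x)

for x in (-0.8, -0.3, 0.0, 0.3, 0.8):
    w = wronskian(x)
    assert abs(w - 1 / (1 - x**2)) < 1e-12
    assert w != 0
```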

Graph
= R*2.5 Verification of N1-ODE conditions on a Derivative of an Equation= I solved it on my own

Problem
Consider the function (2) p.9-2. Find (3) p.9-2 and show that it is an N1-ODE.

Given
These equations are referred to below.

Solution
In order to determine (3) p.9-2, the derivative with respect to x must be found.

The first two derivatives are worked individually using the product rule, as seen below:

$$ \displaystyle \frac{d}{dx}(f(x)g(x)) = f(x)g'(x)+g(x)f'(x) $$

Term 1

Term 2

By combining (Eq.5.2) and (Eq.5.4), (3) p.9-2 can be determined, and further algebra simplifies this function. The result, (Eq.5.8), is the same as (2) p.7-6, which proves that (3) p.9-2 can be written in the particular-class form of an N1-ODE, therefore satisfying the First Exactness Condition.

In addition, the Second Exactness Condition is also satisfied. In conclusion, (2) p.9-2 is an exact N1-ODE because it satisfies both Exactness Conditions.

= R*2.6 Proof for Clairaut's theorem and Differentiability = We solved it on our own

Problem
Review calculus, and find the minimum degree of differentiability of the function $$\phi(x,y)$$ (9.4 Eq. 2) such that it satisfies $$ \frac{\partial^2 \phi(x,y)}{\partial x \partial y}=\frac{\partial^2 \phi(x,y)}{\partial y \partial x} $$

Also, state and prove Clairaut's theorem.

Solution
In mathematical analysis, Clairaut's theorem (also known as Schwarz's theorem), named after Alexis Clairaut and Hermann Schwarz, states that if $$f:\mathbb R^n \rightarrow \mathbb R$$ has continuous second partial derivatives at a given point $$(a_1,...,a_n) \in \mathbb R^n $$, then for $$ 1 \le i, j \le n, $$

$$\frac{\partial^2 f}{\partial x_i \partial x_j}(a_1,...,a_n)=\frac{\partial^2 f}{\partial x_j \partial x_i}(a_1,...,a_n)$$

The minimum degree of differentiability of the function such that (Eq.6.1)is satisfied is 2.

As Clairaut's theorem states, if a function has continuous second partial derivatives, then it satisfies (Eq.6.1).

For instance, the function $$ z=f(x,y)=x^2+y^2 $$ satisfies (Eq.6.1).

Example
The minimum degree of differentiability of the function such that (Eq.6.1) is satisfied is TWO.

Let us consider a function $$F(x,y)$$:

$$ \displaystyle F=x^3y^3\sin\left(\frac{1}{xy}\right) \ \text{for} \ xy \ne 0, \qquad F=0 \ \text{for} \ xy=0 $$

The mixed second partial derivatives of F are:

$$ \displaystyle F_{xy}=(9x^2y^2-1)\sin(\frac{1}{xy})-5xy\cos(\frac{1}{xy}) $$

$$ \displaystyle F_{yx}=(9x^2y^2-1)\sin(\frac{1}{xy})-5xy\cos(\frac{1}{xy}) $$

If we have x,y such that $$xy=0$$

$$ \displaystyle F_{xy}=F_{yx}=0 $$

When $$ xy\ne0 $$, we have,

$$ \displaystyle F_{xy}=(9x^2y^2-1)\sin(\frac{1}{xy})-5xy\cos(\frac{1}{xy})= F_{yx} $$

Thus $$ \displaystyle F_{xy}= F_{yx} $$ for all values.

Differentiating $$ F_{xy} $$ and $$ F_{yx} $$, we have

$$ \displaystyle F_{xyx}=F_{yxx}=18xy^2\sin(\frac{1}{xy})-14y\cos(\frac{1}{xy})+\frac{\cos(\frac{1}{xy})}{yx^2}-\frac{5\sin(\frac{1}{xy})}{x} $$

$$ \displaystyle F_{xyy}=F_{yxy}=18yx^2\sin(\frac{1}{xy})-14x\cos(\frac{1}{xy})+\frac{\cos(\frac{1}{xy})}{xy^2}-\frac{5\sin(\frac{1}{xy})}{y} $$

Checking the mixed third partial derivatives of the function $$F(x,y)$$ as $$ xy \to 0 $$, we may notice that the terms

$$ \displaystyle \frac{\cos(\frac{1}{xy})}{yx^2}, \qquad \frac{\cos(\frac{1}{xy})}{xy^2} $$

are unbounded, so the third partial derivatives do not exist at any point satisfying $$ xy=0 $$; i.e., the function $$ F(x,y) $$ is not three times differentiable at such points. But we have obtained that:

$$ \displaystyle F_{xy}= F_{yx} $$

Thus, at any point satisfying $$ xy=0 $$, the function F is only twice differentiable, yet its mixed second partial derivatives satisfy (Eq.6.1).

Thus, reiterating, only a second degree of differentiability is needed to establish (Eq.6.1); the minimum degree of differentiability of a function satisfying (Eq.6.1) is two.

The example above also shows that continuity of the second partial derivatives at a given point, which is a weaker condition than the existence of third partial derivatives, is not a necessary condition for $$ F_{xy}= F_{yx} $$; that is, given $$ F_{xy}= F_{yx} $$ at a point, we cannot conclude that the second partial derivatives of $$ F(x,y)$$ are continuous there. Checking the values of the second partial derivatives of F in a neighborhood of a point P satisfying $$ xy=0 $$, it is clear that the value of

$$ \displaystyle F_{xy},F_{yx}=(9x^2y^2-1)\sin(\frac{1}{xy})-5xy\cos(\frac{1}{xy}) $$

oscillates infinitely often while approaching P. Therefore P is a discontinuity of the second kind of the functions $$ F_{xy} $$ and $$ F_{yx} $$, i.e., '''the mixed second partial derivatives of F are not continuous at points P satisfying $$ xy=0 $$, yet $$ F_{xy} $$ and $$ F_{yx} $$ at P satisfy (Eq.6.1).''' We thus conclude that continuity of the second partial derivatives at a given point is not a necessary condition for $$ F_{xy}= F_{yx} $$.
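The equality of the mixed second partials away from $$ xy=0 $$ can be checked numerically with nested central differences (a sketch; the sample point and step size are arbitrary choices):

```python
import math

def F(x, y):
    # F(x, y) = x^3 y^3 sin(1/(x y)) for x*y != 0, extended by F = 0 when x*y == 0
    return 0.0 if x * y == 0 else x**3 * y**3 * math.sin(1.0 / (x * y))

def d_dx(g, x, y, h=1e-4):
    # central difference in x
    return (g(x + h, y) - g(x - h, y)) / (2 * h)

def d_dy(g, x, y, h=1e-4):
    # central difference in y
    return (g(x, y + h) - g(x, y - h)) / (2 * h)

# mixed second partials in both orders at a point with x*y != 0
x0, y0 = 0.7, 1.3
Fxy = d_dy(lambda x, y: d_dx(F, x, y), x0, y0)
Fyx = d_dx(lambda x, y: d_dy(F, x, y), x0, y0)

# closed form from the text: (9 x^2 y^2 - 1) sin(1/(xy)) - 5 xy cos(1/(xy))
u = x0 * y0
exact = (9 * u**2 - 1) * math.sin(1.0 / u) - 5 * u * math.cos(1.0 / u)

assert abs(Fxy - Fyx) < 1e-4
assert abs(Fxy - exact) < 1e-3
```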

Clairaut's theorem Proof
We can prove this theorem using the mean value theorem.

Mean Value Theorem: Let $$\displaystyle f:\mathbb {R}^2\rightarrow \mathbb {R}$$ be differentiable. Let $$\displaystyle X_{0} = (x_{0}, y_{0})$$ and $$\displaystyle X = (x_{0} + h, y_{0} + k)$$.

Then there exists $$\displaystyle C$$ which lies on the line joining $$\displaystyle X_{0}$$ and $$\displaystyle X$$ such that

$$ \displaystyle f(X) - f(X_{0}) = \nabla f(C)\cdot(X - X_{0}) $$

i.e., there exists $$ \displaystyle c\in (0,1) $$ such that

$$ \displaystyle f(x_{0}+h,\,y_{0}+k) - f(x_{0},\,y_{0}) = h\,f_{x}(x_{0}+ch,\,y_{0}+ck) + k\,f_{y}(x_{0}+ch,\,y_{0}+ck) $$

where $$\displaystyle C = (x_{0} + ch, y_{0} + ck)$$

Moving on to the proof of Clairaut's theorem,

Let $$\displaystyle f:\mathbb R^2 \rightarrow \mathbb R$$ have continuous second partial derivatives in a neighborhood of the point $$(a,b)$$,

and define the double difference

$$ \displaystyle F_{1} = f(a+h,\,b+k) - f(a+h,\,b) - f(a,\,b+k) + f(a,\,b) $$

According to the Mean Value Theorem, we know that

Because

From $$(Eq.6.4)$$ and $$(Eq.6.5)$$ we get,

The RHS of $$(Eq.6.6)$$, can be expressed as,

According to MVT again, we have

Now $$(Eq. 6.6)$$ becomes,

Since $$\frac{\partial^{2}f}{\partial x\partial y}(a,b) $$ and $$\frac{\partial^{2}f}{\partial y\partial x}(a,b) $$ are both continuous in a neighborhood of the point $$(a,b)$$, the limits of $$\frac{F_{1}}{hk} $$ and $$\frac{F_{1}}{kh} $$ both exist as $$h,k\rightarrow 0$$.

Thus, when $$\displaystyle h\rightarrow 0$$, $$\displaystyle k\rightarrow 0$$, the limit

This process can be repeated for a different order of differentiation to yield,

Since the point $$(a,b)$$ was arbitrary, we have

$$\displaystyle \frac{\partial^{2}f}{\partial x \partial y}(x,y)=\frac{\partial^{2}f}{\partial y \partial x}(x,y) $$

= R*2.7 Verify solution of exact nonlinear first order O.D.E. = I solved it on my own

Problem
Verify that Equation (1) p.10-5 is the solution for the nonlinear first order ordinary differential equation Equation (2) p.8-6.

Given
$$ y(x) = \sin^{-1}(k-15x^5) $$ where $$k$$ is a constant of integration.

$$ \displaystyle \frac{d\phi(x,y(x))}{dx} = 75x^4+\cos y(x)\,y'(x) = 0$$

Solution
Multiply all the terms in the equation by the differential $$dx$$:

$$ \displaystyle 75x^4\,dx + \cos y\,dy = 0 $$

Now integrate the equation:

$$ \displaystyle 15x^5 + \sin y = k $$

Now solve for $$y(x)$$:

$$ \displaystyle \sin y = k - 15x^5 $$

$$ y(x) = \sin^{-1}(k-15x^5) $$

Thus the solution has been verified.
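As a numerical cross-check (a sketch, with an arbitrary value assumed for the constant of integration $$k$$), the residual of the differential equation can be evaluated along the claimed solution:

```python
import math

k = 0.3  # assumed value for the constant of integration

def y(x):
    return math.asin(k - 15 * x**5)

def dy(x, h=1e-6):
    # central-difference approximation of y'(x)
    return (y(x + h) - y(x - h)) / (2 * h)

# residual of d(phi)/dx = 75 x^4 + cos(y) y' should vanish
for x in (0.05, 0.15, 0.25):
    assert abs(75 * x**4 + math.cos(y(x)) * dy(x)) < 1e-5
```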

=R*2.8 Solution to the Integrating Factor= I solved it on my own

Given
From meeting 11 - equation 4 in P11.3:

Problem
Why is solving $$(Eq. 8.1)$$ for $$\displaystyle h(x,y) $$ usually not easy?

Solution
Solving equation $$(Eq. 8.1)$$ for the integrating factor $$\displaystyle h(x,y) $$ is usually not easy because $$(Eq. 8.1)$$ is itself a partial differential equation with variable coefficients involving $$\displaystyle N_x, M_y $$.

Moreover, the presence of partial derivatives of h with respect to both x and y, $$\displaystyle h_x, h_y$$ in the same equation increases the complexity of the problem.

This makes it very difficult to solve $$(Eq. 8.1)$$ for $$\displaystyle h(x,y)$$.

=R*2.9 Particular solution to Euler Integrating Factor= I solved it on my own

Given
Equation 1 from page 11.3 of meeting 11 :

Problem
Find h in $$(Eq. 9.1)$$ where h is the Euler Integration Factor.

Solution
When $$h_x(x,y) = 0$$, the function $$h$$ in Equation 4 from page 11.1 in lecture 11 reduces to a function of y only.

Integrating $$(Eq. 9.1)$$ with respect to y,

$$\displaystyle \Rightarrow \ln h(y)=\int ^{y}m(s)\,ds+k$$

$$\displaystyle \Rightarrow h(y)=\exp\left[\int ^{y}m(s)\,ds+k\right]$$

which is the required solution for $$h$$
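Because $$h$$ depends on y only, the relation integrated above is the separable ODE $$ h'(y) = m(y)\,h(y) $$. A quick numeric sanity check with an assumed illustrative choice $$ m(y)=\cos y $$ (so $$ h(y)=e^{\sin y} $$, taking $$k=0$$):

```python
import math

def m(y):
    return math.cos(y)  # assumed m(y), chosen only for illustration

def h(y):
    # h(y) = exp( integral of m(s) ds ) = exp(sin y), with k = 0
    return math.exp(math.sin(y))

def dh(y, eps=1e-6):
    # central-difference approximation of h'(y)
    return (h(y + eps) - h(y - eps)) / (2 * eps)

# verify h' = m(y) h at sample points
for y0 in (-1.0, 0.0, 0.7, 2.0):
    assert abs(dh(y0) - m(y0) * h(y0)) < 1e-8
```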

=R*2.10 Verification of Solution of a Specific non-homogenous L1-ODE-VC= I solved it on my own

Problem
Show that:

Solution
From Eq(2) p.11-2 and Eq(2) p.11-4, integrate (Eq.10.4). Then, returning to (Eq.10.2), with (Eq.10.5) it becomes (Eq.10.6). After integration, (Eq.10.6) becomes (Eq.10.7), with which we obtain:

Thus $$ \displaystyle y(x)=\frac{x^3}{4} + \frac{k}{x}$$, hence it is proved.
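The result can also be checked numerically. Since the original ODE is not reproduced above, the form $$ \displaystyle y' + \frac{1}{x}\,y = x^2 $$ is assumed here, as it is the non-homogeneous L1-ODE-VC whose general solution is $$ \displaystyle \frac{x^3}{4} + \frac{k}{x} $$:

```python
# Check that y(x) = x^3/4 + k/x satisfies the assumed L1-ODE-VC
# y' + (1/x) y = x^2 (the equation consistent with this general solution;
# the original is not shown in the text, so this form is an assumption).

def y(x, k):
    return x**3 / 4 + k / x

def dy(x, k):
    return 3 * x**2 / 4 - k / x**2

# residual of y' + y/x - x^2 for several constants k and points x > 0
for k in (-2.0, 0.0, 1.5):
    for x in (0.5, 1.0, 3.0):
        assert abs(dy(x, k) + y(x, k) / x - x**2) < 1e-12
```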

= R*2.11 Solution to three General L1-ODEs with Varying Coefficients= I solved it on my own

Solution
Replacing $$a_1(x) $$, $$a_0(x) $$ and $$b_1(x) $$ with their given expressions or values in the first problem, (Eq.11.1) can be rewritten; thus we have (Eq.11.3) and (Eq.11.4). Substituting (Eq.11.3) into (Eq.11.4) gives:

To check the condition (2) p.11-2, multiply both sides of (Eq.11.6) by $$ \frac{1}{a_1(x)}$$, so that (Eq.11.6) becomes (Eq.11.7). With the integrating factor, (Eq.11.7) becomes (Eq.11.8) after integration, from which the solution follows.

Replacing $$a_1(x) $$, $$a_0(x) $$ and $$b_1(x) $$ with their given expressions or values in this problem, (Eq.11.1) can be rewritten; thus we have (Eq.11.13), (Eq.11.14) and (Eq.11.15). Substituting (Eq.11.15) into (Eq.11.14) gives (Eq.11.16), and substituting (Eq.11.16) into (Eq.11.13) gives the solution.

=Contributing Members=

= References =