PlanetPhysics/Taylor Series

Any power series represents a function on its domain of convergence.\, One may set the converse task:\, given a function $$f(x)$$, under which conditions can it be represented as a power series, and how can the coefficients of the series be found?\, This leads to Taylor polynomials, the Taylor formula and Taylor series.

Definition. \, The Taylor polynomial of degree $$n$$ of the function $$f(x)$$ at the point\, $$x = a$$\, is the polynomial $$T_n(x,a)$$ of degree at most $$n$$ which has at this point the value $$f(a)$$ and whose derivatives $$T_n^{(j)}(x,a)$$ up to the order $$n$$ have the values $$f^{(j)}(a)$$.

It is easily found that the Taylor polynomial in question is unique, namely $$\begin{matrix} T_n(x,a) \;=\; f(a)+\frac{f'(a)}{1!}(x\!-\!a)+\frac{f''(a)}{2!}(x\!-\!a)^2+ \ldots+\frac{f^{(n)}(a)}{n!}(x\!-\!a)^n. \qquad(1) \end{matrix}$$
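As a quick numerical sketch (not part of the original article), the polynomial (with coefficients $$f^{(j)}(a)/j!$$) can be evaluated directly once the derivative values at $$a$$ are known; here we take $$f(x)=e^x$$ at $$a=0$$, where every derivative equals 1:

```python
import math

def taylor_polynomial(derivs, a):
    """Return T_n(x, a) built from derivs = [f(a), f'(a), ..., f^(n)(a)]."""
    def T(x):
        # Sum of f^(j)(a)/j! * (x - a)^j for j = 0..n
        return sum(d * (x - a) ** j / math.factorial(j)
                   for j, d in enumerate(derivs))
    return T

# Example: f(x) = e^x at a = 0, where every derivative equals 1.
T4 = taylor_polynomial([1.0] * 5, a=0.0)
print(T4(0.5), math.exp(0.5))  # the degree-4 polynomial is already close
```

Even the degree-4 polynomial approximates $$e^{0.5}$$ to about three decimal places.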

When a given function $$f(x)$$ is replaced by its Taylor polynomial $$T_n(x,a)$$, it is important to examine how accurately the polynomial approximates the function; in other words, one has to examine the difference $$f(x)\!-\!T_n(x,a) \;:=\; R_n(x).$$ Then one is led to the\\

Taylor formula. \, If $$f(x)$$ has continuous derivatives up to the order $$n\!+\!1$$ in a neighbourhood of the point\, $$x = a$$, then it can be represented in the form $$\begin{matrix} f(x) \;=\; f(a)\!+\!\frac{f'(a)}{1!}(x\!-\!a)\!+\!\frac{f''(a)}{2!}(x\!-\!a)^2\!+ \ldots+\!\frac{f^{(n)}(a)}{n!}(x\!-\!a)^n\!+\!R_n(x) \qquad(2) \end{matrix}$$ with $$R_n(x) \;=\; \frac{f^{(n+1)}(\xi)}{(n\!+\!1)!}(x\!-\!a)^{n+1},$$ where $$\xi$$ lies between $$a$$ and $$x$$.\\
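The remainder term can be checked numerically: since $$\xi$$ is unknown, one bounds $$|f^{(n+1)}(\xi)|$$ by its maximum $$M$$ between $$a$$ and $$x$$. A minimal sketch for $$f(x)=e^x$$ on $$[0,1]$$, where $$M = e$$ (the function and interval are our illustrative choices, not from the article):

```python
import math

def lagrange_bound(M, a, x, n):
    """Upper bound |R_n(x)| <= M * |x - a|^(n+1) / (n+1)!,
    where M bounds |f^(n+1)| between a and x."""
    return M * abs(x - a) ** (n + 1) / math.factorial(n + 1)

# f(x) = e^x, a = 0, x = 1: |f^(n+1)(xi)| <= e on [0, 1].
a, x, n = 0.0, 1.0, 4
T4 = sum(x ** j / math.factorial(j) for j in range(n + 1))
actual = math.exp(x) - T4            # the true remainder R_4(1)
bound = lagrange_bound(math.e, a, x, n)
print(actual, bound)                 # actual error stays below the bound
```

The computed error is roughly half the Lagrange bound, consistent with $$\xi$$ lying strictly between $$a$$ and $$x$$.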

If the function $$f(x)$$ has derivatives of all orders in a neighbourhood of the point\, $$x = a$$, then one can let $$n$$ tend to infinity in the Taylor formula (2).\, One obtains the so-called Taylor series $$\begin{matrix} \sum_{n=0}^{\infty}\frac{f^{(n)}(a)}{n!}(x\!-\!a)^n \;=\; f(a)+\frac{f'(a)}{1!}(x\!-\!a)+\frac{f''(a)}{2!}(x\!-\!a)^2+\ldots \qquad(3) \end{matrix}$$

\htmladdnormallink{Theorem}{http://planetphysics.us/encyclopedia/Formula.html}.\, A necessary and sufficient condition for the Taylor series (3) to converge and for its sum to represent the function $$f(x)$$ at certain values of $$x$$ is that $$R_n(x)$$ tends to 0 as $$n$$ tends to infinity.\, For these values of $$x$$ one may write $$\begin{matrix} f(x) \;=\; f(a)+\frac{f'(a)}{1!}(x\!-\!a)+\frac{f''(a)}{2!}(x\!-\!a)^2+\ldots \end{matrix}$$

The best-known Taylor series is perhaps $$e^x \;=\; 1+\frac{x}{1!}+\frac{x^2}{2!}+\ldots$$ which is valid for all real (and complex) values of $$x$$.\\
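That the partial sums of this series converge to $$e^x$$ for any fixed $$x$$ can be observed directly (a small check of ours, using $$x = 2$$):

```python
import math

def exp_series(x, n_terms):
    """Partial sum 1 + x/1! + x^2/2! + ... with n_terms terms."""
    return sum(x ** k / math.factorial(k) for k in range(n_terms))

# The error shrinks rapidly because k! eventually dominates x^k.
for n in (5, 10, 20):
    print(n, abs(exp_series(2.0, n) - math.exp(2.0)))
```

With 20 terms the partial sum already agrees with $$e^2$$ to more than nine decimal places.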

There are analogous generalisations of the Taylor theorem and series for functions of several real variables; the existence of the partial derivatives is then required.\, For example, for the function $$f(x,y,z)$$ the Taylor series looks as follows: $$ f(X,Y,Z) \;=\; f(a,b,c)+\sum_{n=1}^{\infty} \left[\frac{1}{n!}\!\left( (X\!-\!a)\frac{\partial}{\partial x}+ (Y\!-\!b)\frac{\partial}{\partial y}+ (Z\!-\!c)\frac{\partial}{\partial z}\right)^n\!f\right]_{(x,y,z)=(a,b,c)} $$
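The operator powers in this formula can be carried out by hand for a simple test function of our choosing (not from the article): for $$f(x,y,z)=e^{x+2y+3z}$$ each application of the bracketed operator multiplies $$f$$ by $$(X\!-\!a)+2(Y\!-\!b)+3(Z\!-\!c)$$, so the series can be summed term by term and compared with the exact value:

```python
import math

def multivariate_taylor(a, b, c, X, Y, Z, n_terms):
    """Partial sum of the multivariable Taylor series for
    f(x, y, z) = exp(x + 2y + 3z) about (a, b, c).
    Here ((X-a)d/dx + (Y-b)d/dy + (Z-c)d/dz)^n f
    evaluates to (h + 2k + 3l)^n * f(a, b, c)."""
    h, k, l = X - a, Y - b, Z - c
    f_abc = math.exp(a + 2 * b + 3 * c)
    return sum(f_abc * (h + 2 * k + 3 * l) ** n / math.factorial(n)
               for n in range(n_terms))

approx = multivariate_taylor(0, 0, 0, 0.1, 0.05, 0.02, 15)
exact = math.exp(0.1 + 2 * 0.05 + 3 * 0.02)
print(approx, exact)  # the series reproduces f(X, Y, Z)
```

Fifteen terms suffice here because the displacement from the expansion point is small.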