
Section 1.3 Taylor Series

Taylor series are the central tool in Chapter 4, where I show how to solve differential equations by assuming the solution is a Taylor series. This preliminary section briefly reviews the major definitions and results concerning Taylor series.

Definition 1.3.1.

A function is analytic if it can be expressed as a Taylor series.
\begin{equation*} f(x) = \sum_{n=0}^\infty c_n (x-\alpha)^n \end{equation*}
A Taylor series is centred at a point \(\alpha\text{;}\) if \(\alpha = 0\text{,}\) I call it a Maclaurin series. A series defines a function on some domain \((\alpha-R,\alpha+R)\) for some number \(R \geq 0\text{,}\) which is called the radius of convergence. If \(R=\infty\text{,}\) the series is defined on all real numbers. If \(R=0\text{,}\) the series is only defined at \(x=\alpha\) and is basically useless.
I use the ratio test or the root test to calculate the radius of convergence of a series. After some manipulation of those tests, and assuming the coefficients are non-zero, the radius of convergence is given by the formula
\begin{equation*} R = \lim_{n \rightarrow \infty} \left| \frac{c_n}{c_{n+1}} \right| = \lim_{n \rightarrow \infty} \frac{1}{\sqrt[n]{|c_n|}}\text{.} \end{equation*}
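For example, for the series with coefficients \(c_n = \frac{1}{n!}\text{,}\) the ratio form of the formula gives an infinite radius of convergence.
\begin{equation*} R = \lim_{n \rightarrow \infty} \left| \frac{c_n}{c_{n+1}} \right| = \lim_{n \rightarrow \infty} \frac{(n+1)!}{n!} = \lim_{n \rightarrow \infty} (n+1) = \infty \end{equation*}
So the series \(\sum_{n=0}^\infty \frac{x^n}{n!}\) converges for all real numbers. By contrast, the geometric series \(\sum_{n=0}^\infty x^n\) has all \(c_n = 1\text{,}\) so \(R = 1\) and it only defines a function on \((-1,1)\text{.}\)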
Inside the radius of convergence, the behaviour of a Taylor series is very reasonable. I can add and subtract series term by term when the indices match. I can multiply series like polynomials, though the calculation gets arduous. I can even divide using long division (though it is an infinite process). With multiplication and division (and with many other uses of series), I often only calculate the first few terms of the series.
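For example, multiplying the two series from the previous example and collecting like powers of \(x\) gives the first few terms of the product.
\begin{equation*} \left( 1 + x + \frac{x^2}{2} + \ldots \right) \left( 1 + x + x^2 + \ldots \right) = 1 + 2x + \frac{5x^2}{2} + \ldots \end{equation*}
Each coefficient of the product only involves finitely many terms of the two factors, which is why calculating the first few terms is feasible.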
There are two important manipulation techniques for series. The first is adjustment of indices.
\begin{equation*} \sum_{n=k}^\infty c_n x^n = \sum_{n=k+1}^\infty c_{n-1} x^{n-1} = \sum_{n=k-1}^\infty c_{n+1} x^{n+1} \end{equation*}
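For example, shifting indices lets me rewrite a differentiated series so that the power of \(x\) is \(n\) instead of \(n-1\text{,}\) which is exactly the kind of rearrangement needed when matching coefficients of like powers.
\begin{equation*} \sum_{n=1}^\infty n c_n x^{n-1} = \sum_{n=0}^\infty (n+1) c_{n+1} x^n \end{equation*}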
The second is removal of initial terms.
\begin{equation*} \sum_{n=0}^\infty c_n x^n = c_0 + c_1 x + c_2 x^2 + \sum_{n=3}^\infty c_n x^n \end{equation*}
Inside the radius of convergence, the calculus of Taylor series is well behaved. I can integrate and differentiate term-wise.
\begin{align*} f(x) \amp = \sum_{n=0}^\infty c_n x^n\\ f^\prime(x) \amp = \sum_{n=1}^\infty c_n nx^{n-1}\\ \int f(x) dx \amp = \sum_{n=0}^\infty \frac{c_n x^{n+1}}{n+1} + C \end{align*}
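For example, the geometric series has radius of convergence \(1\text{,}\) and differentiating it term-wise inside \((-1,1)\) produces a new series identity on the same interval.
\begin{align*} \frac{1}{1-x} \amp = \sum_{n=0}^\infty x^n\\ \frac{1}{(1-x)^2} \amp = \sum_{n=1}^\infty n x^{n-1} \end{align*}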
In particular, I know that integrals and derivatives of an analytic function are always defined. This shows that analytic functions are necessarily \(C^\infty\) on the domain given by the radius of convergence. The converse, however, is not true: a \(C^\infty\) function always has a Taylor series (I can build it from the derivatives, as below), but that series may fail to converge back to the original function.
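The standard counterexample is the following function: it is \(C^\infty\) everywhere, every derivative at \(0\) is \(0\text{,}\) so its Maclaurin series is identically zero, yet the function itself is not zero near the origin.
\begin{equation*} f(x) = \begin{cases} e^{-1/x^2} \amp x \neq 0 \\ 0 \amp x = 0 \end{cases} \end{equation*}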
Evaluating the derivatives of a series at the centre point \(\alpha\) expresses the derivatives in terms of the coefficients.
\begin{align*} f(\alpha) \amp = c_0\\ f^{\prime} (\alpha) \amp = c_1\\ f^{\prime \prime} (\alpha) \amp = 2c_2\\ f^{(3)} (\alpha) \amp = 2\cdot 3c_3 = 3!c_3\\ f^{(4)} (\alpha) \amp = 4!c_4\\ f^{(n)} (\alpha) \amp = n!c_n\\ c_n \amp = \frac{f^{(n)} (\alpha)}{n!} \end{align*}
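For example, every derivative of \(e^x\) is \(e^x\text{,}\) so every derivative at \(\alpha = 0\) equals \(1\text{,}\) and the coefficient formula recovers the familiar Maclaurin series.
\begin{equation*} c_n = \frac{f^{(n)}(0)}{n!} = \frac{1}{n!} \qquad \implies \qquad e^x = \sum_{n=0}^\infty \frac{x^n}{n!} \end{equation*}
This is the same series whose infinite radius of convergence I calculated above.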
This is a way to calculate coefficients if I know the derivatives of a function. Let me end with one very important result.