Surreal number/Root 2

Surreal number/Root 2 begins by discussing how to construct a pair of infinite sequences that converge to $$\sqrt 2$$ (one from above and one from below). One derivation of these sequences is presented, but we also hint at how Newton's method might be used to construct the same pair. Later, we discuss how to generalize this method to other roots $$(x^{1/n}).$$ These pairs of sequences can be used to express surreal forms for irrational numbers that are the n-th root of an integer.

Dyadic fractions $$(0,\pm 1, \pm \tfrac 1 4, \ldots, \pm \tfrac 9 8, \ldots)$$ can be counted using a pair of integers by writing them as rational fractions of the form $$b/2^\tau,$$ where $$b\ge 0$$ counts the birthdays and $$|\tau|\le b$$ counts the births associated with each birthday. Let the overbrace labels denote the birthday on which each dyadic is reached, using the convention that we count births from left to right on the number line:

$$\overbrace{0}^{b=0},\quad \overbrace{-1,\;1}^{b=1},\quad \overbrace{-2,\;-\tfrac 1 2,\;\tfrac 1 2,\;2}^{b=2},\quad\ldots$$
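The birth-order convention above can be sketched with a short Python routine (the function name `births_by_day` is ours, introduced only for illustration). Day 0 yields $$0$$; each later day yields one new number past each end of the line plus the midpoint of every adjacent pair already born:

```python
from fractions import Fraction

def births_by_day(days):
    """Yield the dyadic fractions born on each 'birthday', left to right.

    Day 0 yields [0]; each later day yields the new numbers: one past each
    end of the number line, plus the midpoint of every adjacent pair
    already born.
    """
    line = [Fraction(0)]          # all numbers born so far, kept sorted
    yield list(line)              # birthday 0
    for _ in range(days):
        new = [line[0] - 1]                       # new left endpoint
        for a, b in zip(line, line[1:]):
            new.append((a + b) / 2)               # midpoints of neighbors
        new.append(line[-1] + 1)                  # new right endpoint
        line = sorted(line + new)
        yield sorted(new)

for day, born in enumerate(births_by_day(2)):
    print(day, [str(x) for x in born])
```

This reproduces the display above: birthday 0 gives 0, birthday 1 gives -1 and 1, birthday 2 gives -2, -1/2, 1/2, 2.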

$$\mathcal{*QUESTIONS:}$$


 * 1) One way to count the dyadic fractions is

We begin this discussion with the task of finding an infinite sequence of rational numbers that converges to $$\sqrt 2$$:

$$\{a_1, a_2, \ldots,a_n, a_{n+1}, \ldots\}$$ where $$\lim_{n\to\infty}a_n=\sqrt 2 = \frac{2}{\sqrt 2}$$

Irrational roots
e.g., $$z=\alpha x + \beta y$$ with $$\alpha+\beta=1$$ is a weighted average of $$x$$ and $$y$$.

To construct a recursion relation, we start with a trivial identity, valid whenever $$\alpha+\beta=1:$$

$$\sqrt 2 =\frac{2}{\sqrt 2} \overbrace{(\alpha+\beta)}^{1}\,=\, \alpha\,\sqrt 2 \,+\, \beta\,\frac{2}{\sqrt 2}.\quad\quad \boxplus $$

Note that the third form of this expression is a weighted average of $$\sqrt 2$$ and $$2/\sqrt 2.\,$$ It will turn out that $$\alpha=\beta=1/2.\,$$ By retaining $$(\alpha,\beta)$$ as unknown variables, we give ourselves the opportunity to see whether this method also works on a number like $$\sqrt[3]{17}.\,$$ Since the intent is to converge to $$\sqrt 2$$, we now define an "error sequence", $$\Delta_n,$$ and seek values of $$(\alpha,\beta)\,$$ that cause the error sequence to converge to zero, i.e., we want to force $$\Delta_n\to 0 $$ as $$n\to\infty. $$ This "error sequence" is defined by

$$a_n=\sqrt 2 \left(1+\Delta_n\right),$$

$$\mathcal{*QUESTION:}\;$$ Why not use $$\,a_n=\sqrt 2+\Delta_n\,$$ to define the error sequence? Hint: the answer involves the need to establish when $$\Delta_n$$ is likely to be small, especially when attempting to generalize this calculation to finding the square root of a large number.

Replacing $$\sqrt 2$$ by $$a_n$$ in $$\boxplus$$ (above) creates the following weighted average of two numbers that are almost equal:

$$a_{n+1}=\alpha a_n + \beta\frac{2}{a_n}$$

Expressing this in terms of our error sequence, we have:

$$\sqrt 2(1+\Delta_{n+1})= \sqrt 2\alpha(1+\Delta_n)+\frac{2\beta }{\sqrt 2(1+\Delta_n)}\quad\quad\odot$$
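A quick numerical check (a Python sketch, not part of the derivation): whenever $$a_n\ne\sqrt 2,$$ the two terms $$a_n$$ and $$2/a_n$$ lie on opposite sides of $$\sqrt 2,$$ because their product is exactly $$2=(\sqrt 2)^2$$:

```python
import math

# Because a * (2/a) = 2 = (sqrt 2)^2, the two terms straddle sqrt(2):
# if one factor exceeds sqrt(2), the other must fall short of it.
for a in (2.0, 1.5, 1.41, 1.2):
    partner = 2 / a
    # signed distances from sqrt(2) have opposite signs
    assert (a - math.sqrt(2)) * (partner - math.sqrt(2)) < 0
    print(f"a={a}  2/a={partner:.6f}")
```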

In contrast with $$\boxplus,\,$$ the two terms on the RHS of $$\odot$$ are not equal. If one term is slightly larger than $$\sqrt 2,\,$$ then the other is slightly smaller, as can be seen from the Taylor expansion,

$$\frac{1}{1+\Delta_n}=1-\Delta_n+\Delta_n^2-\Delta_n^3+\mathcal O{(\Delta_n^4)}\,,$$

which informs us that, for sufficiently small values of $$\Delta_n,\,$$ the next error, $$\Delta_{n+1},\,$$ will be smaller than $$\Delta_n.\,$$ Substituting the Taylor expansion into $$\odot$$ yields:

$$ a_{n+1} = \sqrt 2(1+\Delta_{n+1})= \overbrace{ \underbrace{\sqrt 2\alpha}_A + \underbrace{\sqrt 2\alpha\Delta_n}_C}^{\sqrt 2\alpha(1+\Delta_n) } + \overbrace{ \underbrace{\frac{2\beta}{\sqrt 2}}_B \underbrace{ - \frac{2\beta}{\sqrt 2}\Delta_n + \frac{2\beta}{\sqrt 2}\Delta_n^2+\mathcal O(\Delta_n^3) }_{D=D_1+D_2+D_3}}^{\frac{2\beta }{\sqrt 2(1+\Delta_n)} } $$

In the above equation, $$ A+B=\sqrt 2,\, $$ so that it cancels the zeroth-order term, $$ \sqrt 2,\, $$ on the LHS. Term C is linear, or "first order", in the parameter $$\Delta_n.$$ When we add $$C+D,$$ we want C to cancel the first-order term $$D_1,$$ i.e.,

$$\sqrt 2\alpha\Delta_n - \frac{2\beta}{\sqrt 2}\Delta_n =0\;\implies\; \underline{\alpha=\beta=\tfrac 12}$$
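With $$\alpha=\beta=\tfrac 1 2,$$ the surviving term $$D_2$$ governs the error: a short algebraic rearrangement of the recursion gives the exact relation $$\Delta_{n+1}=\Delta_n^2\big/\big(2(1+\Delta_n)\big)\approx\tfrac 1 2\Delta_n^2,$$ i.e., quadratic convergence. A Python sketch of this error behavior:

```python
import math

SQRT2 = math.sqrt(2)
a = 2.0                        # a_1 = 2
delta = a / SQRT2 - 1          # Delta_1 = a_1/sqrt(2) - 1
for n in range(1, 5):
    a = a / 2 + 1 / a          # the recursion with alpha = beta = 1/2
    new_delta = a / SQRT2 - 1
    # the ratio Delta_{n+1} / Delta_n^2 tends to 1/2
    print(n, new_delta, new_delta / delta ** 2)
    delta = new_delta
```

Each iteration roughly doubles the number of correct digits, which is why only a handful of steps are needed.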

Setting $$\alpha=\beta=\tfrac 1 2,$$ we obtain two different recursion relations; the first starts at $$ a_1=2 $$:

$$ \begin{matrix} a_1=2    \qquad & a_{n+1}=\frac{a_n}{2} + \frac{1}{a_n} \\[0.2ex] b_1=1     \qquad & b_{n+1}=\frac{4}{b_n+2/b_n} \end{matrix} $$
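These recursions can be checked with exact rational arithmetic. A minimal Python sketch (the function name `surreal_pair` is ours) shows $$a_n$$ decreasing toward $$\sqrt 2$$ from above and $$b_n$$ increasing toward it from below, so each step tightens a rational interval $$(b_n, a_n)$$ around $$\sqrt 2$$:

```python
from fractions import Fraction

def surreal_pair(steps):
    """Iterate both recursions with exact rational arithmetic.

    a_n decreases toward sqrt(2) from above, b_n increases from below.
    """
    a, b = Fraction(2), Fraction(1)   # a_1 = 2, b_1 = 1
    for _ in range(steps):
        a = a / 2 + 1 / a             # upper sequence
        b = 4 / (b + 2 / b)           # lower sequence
        yield a, b

for n, (a, b) in enumerate(surreal_pair(3), start=2):
    print(n, float(b), float(a))
```

Note that the two sequences satisfy $$b_n = 2/a_n$$ exactly, which is also the key to the first question below.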

Appendix (*QUESTIONS)
$$\mathcal{*QUESTIONS:}$$
 * 1) Can you derive the recursion relation for $$b_n$$? Hint: Try the substitution $$b_n=\frac{2}{a_n}.$$
 * 2) Find $$a_2$$ if $$a_1 = 22/7.$$
 * 3) Can this method be used to calculate the recursion relation for $$\sqrt[3]{5}\,?$$
 * 4) Can you prove that $$a_n$$ is monotonically decreasing?
 * 5) One can also create a sequence of approximations for calculating roots of a number using Newton's method. Verify that Newton's method can be used to obtain the sequence shown above.
 * 6) Newton's method can be explained graphically: draw the tangent line to $$f(x)$$ at a point near a root where $$f(x)=0,$$ and then find the intersection of that tangent line with the x axis. The slope of a tangent line to a graph of $$f(x)$$ versus $$x$$ is well known to be the first derivative, $$f'(x).$$ The methods used above to obtain the sequence $$a_n$$ involve a Taylor expansion, which also involves $$f'(x).$$ Investigate whether the formula for Newton's method can be derived using the Taylor expansion method used above.
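As a hint toward the last two questions: for $$f(x)=x^2-2,$$ one Newton step is $$x-\frac{f(x)}{f'(x)}=x-\frac{x^2-2}{2x}=\frac x 2+\frac 1 x,$$ which is exactly the recursion for $$a_n$$ derived above. A Python sketch confirming the two maps agree:

```python
def newton_step(x):
    """One Newton step for f(x) = x**2 - 2: x - f(x)/f'(x)."""
    return x - (x * x - 2) / (2 * x)

def average_step(x):
    """The weighted-average recursion derived above: x/2 + 1/x."""
    return x / 2 + 1 / x

x = 2.0
for _ in range(4):
    # the two iterations agree to floating-point precision
    assert abs(newton_step(x) - average_step(x)) < 1e-12
    x = newton_step(x)
print(x)   # close to sqrt(2)
```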

Spreadsheets from
 * https://docs.google.com/spreadsheets/d/1XJTHPOwOUx8nTvx9hc6h5oWaj2ZHVrJhcOWjjs0JnzM/edit#gid=0