Convergence of Newton Method for monotonic polynomials
Consider a polynomial $p : \mathbb{R} \to \mathbb{R}$ with $p'(x) > 0$ for all $x \in \mathbb{R}$. The function $p$ has exactly one real zero. Will the Newton method
$$x_{n+1} = x_n - \frac{p(x_n)}{p'(x_n)}$$
converge for all $x_0 \in \mathbb{R}$?
Intuitively I think there might still be a counterexample, but I couldn't find one; is it perhaps possible that the method indeed converges for every $x_0$?
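The iteration is easy to experiment with for different starting values; a minimal Python sketch (the polynomial $x^3 + x$ below is just an illustrative monotone example, not the subject of the question):

```python
def newton(p, dp, x0, steps=50):
    """Iterate x <- x - p(x)/p'(x) and return the whole trajectory."""
    xs = [x0]
    for _ in range(steps):
        x = xs[-1]
        xs.append(x - p(x) / dp(x))
    return xs

# Example: p(x) = x^3 + x has p'(x) = 3x^2 + 1 > 0 and its only real zero at 0.
p  = lambda x: x**3 + x
dp = lambda x: 3 * x**2 + 1
traj = newton(p, dp, x0=0.5)
```

For this particular $p$ the iterates shrink cubically near $0$, so the trajectory settles on the root almost immediately.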
polynomials numerical-methods roots newton-raphson
WLOG let the zero be at $0$. If there is a $p$ such that it does not converge, then there must exist $x$ such that $p(x)/p'(x) \geq 2x$. WLOG we can choose $x = 1$, and we get: if there is a $p$ such that it does not converge, then $p(1)/p'(1) \geq 2$. But I couldn't yet show that under the given assumptions $p(1)/p'(1) \geq 2$ can never be satisfied.
– araomis
Jul 1 at 8:07
@flawr The conditions you mention do not prevent the formation of periodic orbits (and the consequent divergence). If you want to guarantee convergence, you must pose additional conditions, like convexity/concavity, and even this does not guarantee convergence for all $x_0$.
– PierreCarre
Jul 1 at 9:53
asked Jul 1 at 7:14 by flawr
1 Answer
No, not necessarily. In particular, consider the polynomial
$$p(x) = \frac{7}{2}x - \frac{5}{2}x^3 + x^5.$$
Note that
$$p'(x) = \frac{7}{2} - \frac{15}{2}x^2 + 5x^4,$$
a positive quadratic in $x^2$ with a negative discriminant $-\frac{55}{4}$, and hence strictly positive everywhere. This means $p$ is strictly increasing, as required.
Take an initial iterate $x_0 = 1$. Then,
\begin{align*}
x_1 &= 1 - \frac{p(1)}{p'(1)} = 1 - \frac{2}{1} = -1, \\
x_2 &= -1 - \frac{p(-1)}{p'(-1)} = -1 - \frac{-2}{1} = 1,
\end{align*}
and so the iterates repeat.
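The 2-cycle can be checked directly; a short Python sketch (the coefficients $7/2$ and $5/2$ and the iterates $\pm 1$ are exactly representable, so the alternation is exact in floating point):

```python
# The counterexample polynomial p(x) = (7/2)x - (5/2)x^3 + x^5 and its derivative.
p  = lambda x: 3.5 * x - 2.5 * x**3 + x**5
dp = lambda x: 3.5 - 7.5 * x**2 + 5 * x**4

x = 1.0
orbit = []
for _ in range(6):
    x = x - p(x) / dp(x)   # Newton step
    orbit.append(x)
# The iterates alternate: -1, 1, -1, 1, ...
```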
Method
It's not difficult to form a cycle with Newton's method. I wanted a polynomial that passes through $(-1, -2)$ and $(1, 2)$, both with derivative $1$. Following the tangent at $(-1, -2)$ to the $x$-axis yields $x = 1$, so if we take $x_n = -1$, then $x_{n+1} = 1$. Following the tangent at $(1, 2)$ yields an $x$-intercept of $x = -1$, so $x_{n+2} = -1$, and so the iterates repeat.
I tried for a degree $5$ polynomial. Let our polynomial be
$$p(x) = a_0 + a_1 x + a_2 x^2 + a_3 x^3 + a_4 x^4 + a_5 x^5.$$
Our restrictions turn into
\begin{align*}
p(-1) = a_0 - a_1 + a_2 - a_3 + a_4 - a_5 &= -2 \\
p'(-1) = a_1 - 2a_2 + 3a_3 - 4a_4 + 5a_5 &= 1 \\
p(1) = a_0 + a_1 + a_2 + a_3 + a_4 + a_5 &= 2 \\
p'(1) = a_1 + 2a_2 + 3a_3 + 4a_4 + 5a_5 &= 1.
\end{align*}
Solving this, we get a general solution in terms of $s$ and $t$:
$$p(x) = \frac{5}{2}x - \frac{1}{2}x^3 + t(1 - 2x^2 + x^4) + s(x - 2x^3 + x^5).$$
In particular, I needed to choose $s$ and $t$ so that the derivative
\begin{align*}
p'(x) &= \frac{5}{2} - \frac{3}{2}x^2 + t(4x^3 - 4x) + s(1 - 6x^2 + 5x^4) > 0.
\end{align*}
Clearly, we require $s > 0$. I also decided to choose $t = 0$; it may not have been necessary, but it made things simpler. Now $p'(x)$ is a quadratic in $x^2$:
$$p'(x) = \left(\frac{5}{2} + s\right) - \left(\frac{3}{2} + 6s\right)x^2 + 5sx^4.$$
I wanted the discriminant to be negative, which is to say
$$\left(\frac{3}{2} + 6s\right)^2 - 20s\left(\frac{5}{2} + s\right) < 0.$$
Choosing $s = 1$ did the trick, and gave us the previously presented polynomial.
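The choice $s = 1$, $t = 0$ is easy to sanity-check numerically; a small Python sketch verifying both the discriminant condition and the positivity of $p'$ (the grid $[-3, 3]$ is an arbitrary illustration; outside it the $5sx^4$ term clearly dominates):

```python
s = 1.0
# Discriminant of p'(x) viewed as a quadratic in u = x^2.
disc = (1.5 + 6 * s)**2 - 20 * s * (2.5 + s)

# p'(x) for the chosen s (and t = 0).
dp = lambda x: (2.5 + s) - (1.5 + 6 * s) * x**2 + 5 * s * x**4
vals = [dp(k / 100) for k in range(-300, 301)]  # sample p' on [-3, 3]
```

With $s = 1$ the discriminant is $(15/2)^2 - 70 = -55/4$, matching the value quoted at the start of the answer.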
Curiously enough, the periodic orbits that appear in Newton's method are often unstable and break due to round-off errors. How wonderful is it to have a method that converges due to errors?
– PierreCarre
Jul 1 at 9:48
@TheoBendit Thanks a lot for your answer as well as the excellent explanation of the construction!
– flawr
Jul 1 at 12:41
@PierreCarre That is interesting; so far, when studying the theory, I have only ever found counterexamples based on periodicity. Do you know whether deliberately adding "errors" has been studied for actual use?
– flawr
Jul 1 at 12:44
@flawr I came across this curiosity in the framework of discrete dynamical systems, where Newton's method is often used as an example of an iteration map. To my knowledge, this has not been used outside this context.
– PierreCarre
Jul 1 at 12:59
@flawr In this example you can check that the orbit is in fact unstable: starting with $x_0 = 1 + \varepsilon$ you will get convergence to zero.
– PierreCarre
Jul 1 at 13:07
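As PierreCarre's comment suggests, the cycle is repelling: for the Newton map $N(x) = x - p(x)/p'(x)$ we have $N'(x) = p(x)p''(x)/p'(x)^2$, and at the cycle points $N'(\pm 1) = 10$, so the two-step map has multiplier $100$. A quick Python check of this claim (the perturbation size $10^{-9}$ is an arbitrary choice):

```python
p  = lambda x: 3.5 * x - 2.5 * x**3 + x**5
dp = lambda x: 3.5 - 7.5 * x**2 + 5 * x**4

x = 1.0 + 1e-9          # tiny perturbation of the cycle point x_0 = 1
for _ in range(200):
    x = x - p(x) / dp(x)
# The perturbation grows by roughly a factor 10 per step, the orbit leaves
# the 2-cycle, and the iterates settle on the true root at 0.
```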
answered Jul 1 at 7:57 by Theo Bendit