Talk:Newton polynomial

Can we get an algorithm here? —Preceding unsigned comment added by 24.199.203.189 (talk) 20:57, 17 February 2008 (UTC)

I did a complete rewrite of the article to make it clearer and to use the same style and notation as in polynomial interpolation and Lagrange polynomial. MathMartin 13:16, 17 Aug 2004 (UTC)

Adding programming example

User http://en.wikipedia.org/wiki/Special:Contributions/P000010110 deleted some Octave code on this page a while ago. The reason given was that Wikipedia is not a programming tutorial. However, for computer scientists learning about this topic, this kind of contribution greatly increases understanding of the topic. We are used to programming every day, and some programming code helps us understand the topic much faster than mathematical symbols alone. — Preceding unsigned comment added by 91.176.97.127 (talk) 12:15, 4 August 2012 (UTC)

Numerical Precision in Example

It's probably worth pointing out that the example provided is only worked out to very low precision. If someone takes that example and works it out on a computer with double precision (or probably even single precision), they will quickly see that the constant and quadratic terms in the interpolating polynomial are 0. Intuitively, this should make sense because the tangent function is odd and the data points are symmetric about x = 0. —Preceding unsigned comment added by 151.200.114.163 (talk) 21:15, 24 October 2007 (UTC)

Right. I added a comment that the computation uses only six digits. -- Jitse Niesen (talk) 12:08, 1 April 2008 (UTC)

Looking at a plot of the function, it appears that all the coefficients are wrong. It is obvious that the coefficient for x^1 should be positive and somewhere close to 1. —Preceding unsigned comment added by 24.58.11.37 (talk) 01:37, 1 April 2008 (UTC)

If the interpolation points are close together, then the coefficient for x^1 is indeed close to 1. However, here the spacing between the points is fairly wide, so there is a large error. I computed the interpolating polynomial myself to check, and the end result is correct (to the limited precision of the computation, of course). -- Jitse Niesen (talk) 12:08, 1 April 2008 (UTC)
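
To make both points above concrete, here is a small Octave sketch; the symmetric nodes ±1.5, ±0.75, 0 are assumed purely for illustration, so treat the numbers as indicative rather than as the article's exact values.

 x = [-1.5 -0.75 0 0.75 1.5];     % symmetric nodes, assumed for illustration
 y = tan(x);                      % tan is odd, so the interpolant is odd too
 p = polyfit(x, y, numel(x) - 1)  % interpolating polynomial, highest power first
 % The x^4, x^2 and constant coefficients p(1), p(3), p(5) come out at
 % rounding-error level, i.e. effectively zero. The x^1 coefficient p(4)
 % comes out near -1.5 rather than near 1, because the wide node spacing
 % forces the low-degree terms to compensate for the steep growth of tan
 % near the endpoints.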

Comments on article[edit]

Contrary to frequent claims, Lagrange interpolation (in a "barycentric" form) _does_ allow calculation of the next degree of polynomial interpolation without re-doing previous calculations. This requires recording the value of each term in the calculation.
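
To spell out how that works with the usual barycentric weights (the notation here is the standard one and is assumed, not taken from the article): writing w_j = 1/\prod_{k \ne j}(x_j - x_k), adding a new node x_{n+1} only requires

 w_j \leftarrow \frac{w_j}{x_j - x_{n+1}} \quad (j = 0, \ldots, n), \qquad w_{n+1} = \frac{1}{\prod_{k=0}^{n} (x_{n+1} - x_k)},

so raising the degree by one costs O(n) operations and none of the previous work is discarded.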

Though it's often said that Lagrange interpolation isn't used as a practical method, I've read that some _do_ use it for practical interpolation, and some advocate it for that purpose. Speaking for myself, I've used Lagrange for problems in which the interpolation is always at the same X-value, with only the y-values changing, and in which it's known in advance, from experience, how many terms are needed for sufficient accuracy.

Of course, when it isn't known how many terms will be needed for sufficient accuracy, divided-difference methods are the ones of choice.

A frequent use of polynomial interpolation is to determine whether or not linear interpolation will be accurate enough, by evaluating the quadratic term of a divided-difference method (to find out if it's big enough to matter in the problem). Of course, only a divided-difference method can be used for that determination.
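
A minimal Octave sketch of that check (function, nodes, and tolerance are purely illustrative): take the two nodes that linear interpolation would use plus one extra node, form the second divided difference, and bound the quadratic term.

 f = @(t) tan(t);                         % illustrative function
 x = [0.70 0.75 0.80];                    % the two bracketing nodes plus one extra
 tol = 1e-6;                              % required accuracy, chosen arbitrarily
 d1 = (f(x(2)) - f(x(1))) / (x(2) - x(1));                        % f[x1,x2]
 d2 = ((f(x(3)) - f(x(2))) / (x(3) - x(2)) - d1) / (x(3) - x(1)); % f[x1,x2,x3]
 % For a point a between x1 and x2 the quadratic correction is
 % d2*(a - x(1))*(a - x(2)), bounded in magnitude by |d2|*(x(2)-x(1))^2/4.
 linear_ok = abs(d2) * (x(2) - x(1))^2 / 4 < tol;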

Among the divided-difference formulas, Newton's usually isn't as practical as those of Stirling, Bessel and Gauss, because the more points it adds at one end, the farther the center of the points moves. Since, by assumption, you don't know in advance how many points you'll need, you also don't know where their center will eventually be. If the center turns out to be far from the evaluated point, then accuracy will be lost, because polynomial interpolation's accuracy is best when the interpolated point is near the middle of the data points being used. Therefore, Newton's formula is of interest mainly as the finite-differences version of Taylor's polynomial, from which the other divided-difference methods can be derived. —Preceding unsigned comment added by JoinedOct1309 (talkcontribs) 01:39, 6 November 2009 (UTC)

Discussion

On 1.1/2, "Newton forward/backward divided difference formula", what is the justification for using binomial coefficients, when "s" is, in general, not an integer? Pasmao (talk) 05:12, 30 April 2022 (UTC)
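
Presumably the formula relies on the generalized (falling-factorial) binomial coefficient, which is defined for every real s:

 \binom{s}{k} = \frac{s(s-1)(s-2)\cdots(s-k+1)}{k!}

This is a polynomial of degree k in s, so no integrality of s is needed; for integer s it reduces to the ordinary binomial coefficient.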

Definition section too long[edit]

The definition section covers more than the definition and is certainly too long. The long stretch of text following the formulas needs to be structured! 82.130.71.230 (talk) 19:28, 1 February 2010 (UTC)

MATLAB algorithm

My algorithm is more compact:

function [fp,ko] = newton_interpolation(TA)
% TA is a 2-by-n table: row 1 holds the x values, row 2 the y values.
n = size(TA, 2);
FX = zeros(n,n);            % divided-difference table, one order per row
FX(1,:) = TA(2,:);          % zeroth-order differences are just the y values
H = TA(2,1);                % accumulates the Newton form symbolically
syms x;
k1 = n;                     % counts the filled entries of the divided-difference table
for i = 2:n
    for j = 1:n+1-i
        % differences of order i-1, built from the previous row
        FX(i,j) = (FX(i-1,j+1)-FX(i-1,j))/(TA(1,j+i-1)-TA(1,j));
        k1 = k1 + 1;
    end
    k = 1;
    for j = 1:i-1
        k = k * (x - TA(1, j));    % product (x - x_1)...(x - x_{i-1})
    end
    H = H + k * FX(i,1);           % add the next Newton term
end
y = simplify(H);
fp = sym2poly(y);           % monomial coefficients, highest degree first
ko = k1;                    % total number of table entries computed

--Ignotusp (talk) 21:40, 20 December 2010 (UTC)

Proof

A proof of the recursion formula for the coefficients might make this article complete. As with other Wikipedia articles, I was thinking the proof could be hidden. Here's an elegant proof I found: http://www.maths.lancs.ac.uk/~gilbert/m243a/node6.html — Preceding unsigned comment added by 128.111.185.31 (talk) 23:14, 12 September 2011 (UTC)
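
For reference, the recursion in question is

 f[x_0, x_1, \ldots, x_k] = \frac{f[x_1, \ldots, x_k] - f[x_0, \ldots, x_{k-1}]}{x_k - x_0}

One standard argument (possibly the same as the one at the link): if p_1 interpolates f at x_0, ..., x_{k-1} and p_2 interpolates f at x_1, ..., x_k, then

 p(x) = \frac{(x - x_0)\, p_2(x) - (x - x_k)\, p_1(x)}{x_k - x_0}

agrees with f at all of x_0, ..., x_k, so it is the interpolant on those nodes; comparing leading coefficients (the leading coefficient of the interpolant on x_0, ..., x_k being f[x_0, ..., x_k]) then gives the recursion.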

An Advantage of the Newton Form of the Interpolating Polynomial

The article contains a section on the "Strengths and weaknesses of various formulae." I want to suggest an additional strength of the Newton form of the interpolating polynomial.

Given the Lagrange form or the Newton form, one can combine the terms to find the coefficients of the polynomial. With the Lagrange form I found it too difficult to develop an algorithm for this in the general case. However, with the Newton form I found a compact and effective algorithm. See below.

 let n = the number of points - 1
 let array x be initialized with the x values of the given points
 let array c be initialized with the y values of the given points

// Construct the backward divided differences table. The following is from Algorithm 4.3 of Conte and de Boor's Elementary Numerical Analysis, second edition (page 204). "Calculation of the coefficients for the Newton form."

 for(j=1; j<=n; j++) {
   for(i=0; i<=n-j; i++) {
     if(x[i+j]==x[i]) {
       fprintf(stderr,"\nError *** duplicate x values encountered\n\n");
       return;
     }
     c[i]=(c[i+1]-c[i])/(x[i+j]-x[i]);
   }
 }

// Now f(a) can be determined for a given value a as follows.

 fa=c[0];
 for(i=1; i<=n; i++) {
   fa=fa*(a-x[i])+c[i];
 }

// The following is Jeff Stetekluh's method for combining the terms to form the coefficients of the polynomial.

 for(j=0; j<n; j++) {
   for(i=1; i<=n-j; i++) {
     c[i]=c[i]-x[i+j]*c[i-1];
   }
 }

// Array c now holds the coefficients of the polynomial. Its first entry c[0] is the coefficient of the term of highest degree and c[n] is the constant term. Therefore f(a) can be determined for a given value a as follows.

 fa=c[0];
 for(i=1; i<=n; i++) {
   fa=fa*a+c[i];
 }

Answers to template messages at top of article

At the top of the article, it says:

"This article needs additional citations for verification. (September 2014")

Answer: Yes, when I initially posted my material here, I didn't realize the need to cite my source. But recently I've added a citation of the source of the information that I posted (at the place where "citation needed" was written).

"This article is written like a personal reflection or opinion essay that states a Wikipedia editor's personal feelings about a topic."

Answer: My material here, about the differences among, relative advantages of, and particular uses of, various polynomial interpolation formulas, consists of objective facts supported by the source that I cited (Numerical methods for Scientists and Engineers, by R.W. Hamming).

It isn't a personal reflection, opinion essay, or expression of personal feelings. — Preceding unsigned comment added by 68.186.18.145 (talk) 18:32, 30 April 2018 (UTC)

Would anyone object if I removed the two template messages quoted above? (...the first one because the problem it refers to has been remedied, and the second one because the statements that it refers to are citation-supported objective facts, not expressions of personal opinions or feelings).

I'd like to remove those two template messages if there are no objections to that proposed removal during the coming 30 days.


May 1, 2018, 04:19 (UTC)

Actually, it would probably be better if I only removed the template message referring to alleged expression of personal opinion and feelings about the topic... because the part about the need for more citations could be referring to any part of the article. — Preceding unsigned comment added by 68.186.18.145 (talk) 04:21, 1 May 2018 (UTC)


Last month I invited objections to the deletion of the maintenance template that said: "This article is written like a personal reflection or opinion essay that states a Wikipedia editor's personal feelings about a topic."

In the month since I made that request/invitation, no one has objected to the deletion. So I deleted that maintenance template today.

As I said here last month:

"My material here, about the differences among, relative advantages of, and particular uses of, various polynomial interpolation formulas, consists of objective facts supported by the source that I cited (Numerical methods for Scientists and Engineers, by R.W. Hamming)." — Preceding unsigned comment added by 68.186.18.145 (talk) 02:44, 8 June 2018 (UTC)[reply]