Talk:Asymptotic expansion


Some points of organisation

For an asymptotic scale, I also know the definition of an (arbitrary) family of functions $S=\{f_m\}$ such that

\[ \forall\, m \neq m': \quad f_m = o(f_{m'}) \ \text{ or } \ f_{m'} = o(f_m). \]

Then this induces an evident ordering on the set of indices, and asymptotic expansions are defined in the same way, as finite sums over $m < m'$, such that the difference is negligible w.r.t. all $f_m$, $m < m'$.

Without being too general, we should allow at least for negative indices, e.g. for Laurent series, but maybe there is really a need for an indexing set other than $\mathbb{N}$, in order to be able to develop on the scale $x^a(\log x)^b$, for example. MFH: Talk 14:00, 24 May 2005 (UTC)
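As an illustration of why an index set other than $\mathbb{N}$ can be natural (a sketch added for concreteness, not part of the comment above): the two-parameter scale mentioned here is ordered lexicographically, since as $x \to \infty$

\[ x^{a}(\log x)^{b} = o\!\left(x^{a'}(\log x)^{b'}\right) \iff a < a' \ \text{ or } \ \bigl(a = a' \text{ and } b < b'\bigr), \]

so the natural index set is $\mathbb{R}^2$ (or $\mathbb{Z}^2$) with the lexicographic order rather than a subset of $\mathbb{N}$.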

Yes. There is quite a big subject - orders of infinity. Charles Matthews 11:38, 25 May 2005 (UTC)

the definition

An asymptotic expansion is NOT a series but a polynomial approximation in a neighborhood of a point, so that the function is equal to its polynomial approximation plus a remainder. For instance, Taylor's formula with remainder is an asymptotic expansion, but it often happens, in number theory for instance, that asymptotic expansions with only one or two terms are known.

Edmund Landau's little-oh notation is often used to give the size of the remainder. See Apostol, p. 370.
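For concreteness (my own restatement of the standard fact, not part of the comment above): Taylor's formula with the little-oh form of the remainder reads

\[ f(x) = f(a) + f'(a)(x-a) + \cdots + \frac{f^{(n)}(a)}{n!}(x-a)^{n} + o\bigl((x-a)^{n}\bigr) \qquad (x \to a), \]

i.e. an asymptotic expansion of $f$ at $a$ on the scale $(x-a)^{k}$, with the size of the remainder expressed by Landau's little-oh.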

See for instance http://fr.wikipedia.org/wiki/Développement_limité

Examples of asymptotic expansions include the Best Constant Approximation, which leads to continuity; the Best Affine Approximation, which leads to differentiability; the Best Quadratic Approximation, which leads to second differentiability; etc. In general, this leads to Lagrange's treatment of the differential calculus without recourse to limits.
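Spelling the first few of these out (a sketch added for illustration): as $x \to a$,

\[ f(x) = f(a) + o(1), \qquad f(x) = f(a) + f'(a)(x-a) + o(x-a), \qquad f(x) = f(a) + f'(a)(x-a) + \tfrac12 f''(a)(x-a)^{2} + o\bigl((x-a)^{2}\bigr), \]

where the existence of the first (best constant) approximation is equivalent to continuity at $a$, and the existence of the second (best affine) approximation is equivalent to differentiability at $a$.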

Conflating expansions and series is a VERY SERIOUS ERROR with much potential for confusion.

Schremmer (talk) 19:26, 21 November 2011 (UTC)

I think that

is unnecessarily limiting. I think this is enough:

If you agree, pls change it. --Zero 14:58, 22 August 2005 (UTC)


Just a point on clarity: should the definition refer to these as a `class' of functions instead of a `series'? What we currently have,

`is a formal series of functions which has the property that truncating the series after a finite number of terms'

seems not to make sense; what we actually mean is that asymptotic expansions form a class of functions obtained by truncating a series expansion after a finite number of terms. 81.106.130.224 (talk) 14:53, 3 May 2009 (UTC)
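For reference (my paraphrase of the standard formal definition, added for clarity): given an asymptotic scale $(\varphi_n)$, the statement $f(x) \sim \sum_{n} a_n \varphi_n(x)$ as $x \to L$ means that every finite truncation satisfies

\[ f(x) - \sum_{n=0}^{N} a_n \varphi_n(x) = o\bigl(\varphi_N(x)\bigr) \qquad (x \to L), \]

so the object written down is a formal series, but the defining property is a statement about each of its finite truncations.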

differentiation and integration

My question is: if we have $f(x) \sim g(x)$, is it then correct that

\[ f'(x) \sim g'(x) \]

or

\[ \int f(x)\,dx \sim \int g(x)\,dx\,? \]

In several books I have read that this is true, however I'm not completely sure. —The preceding unsigned comment was added by 85.85.100.144 (talk) 09:52, 11 January 2007 (UTC).

The first one is certainly not true: imagine that 'g' is the same as 'f' except that it has very fine wriggles that get smaller quickly but not flatter. The second one might be true most of the time, perhaps with some sanity conditions required. --Zerotalk 12:09, 11 January 2007 (UTC)
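A concrete instance of the "fine wriggles" phenomenon (my own choice of functions, offered as a sketch rather than taken from the thread): with

\[ f(x) = x^{2}, \qquad g(x) = x^{2} + x\sin(x^{3}) \qquad (x \to \infty), \]

we have $g(x)/f(x) = 1 + \sin(x^{3})/x \to 1$, so $f \sim g$, but $g'(x) = 2x + \sin(x^{3}) + 3x^{3}\cos(x^{3})$ oscillates with unbounded amplitude relative to $f'(x) = 2x$, so $f' \not\sim g'$.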

another example

Stirling's approximation is another important example - should it be added to the main page? Lavaka 00:23, 15 February 2007 (UTC)
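For reference (the standard statement, added here for convenience): Stirling's series is the asymptotic expansion

\[ n! \sim \sqrt{2\pi n}\,\Bigl(\frac{n}{e}\Bigr)^{n}\Bigl(1 + \frac{1}{12n} + \frac{1}{288n^{2}} - \cdots\Bigr) \qquad (n \to \infty), \]

a divergent series whose finite truncations nevertheless give excellent approximations to $n!$, in exactly the sense the article describes.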

derivation of asymptotic expansion of error function

Here is a derivation of the asymptotic expansion of the error function (PDF, Proposition 2.10). 136.142.141.195 (talk) 00:09, 9 April 2008 (UTC)
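The expansion in question is the standard large-$x$ result (stated here for convenience; the derivation itself is in the PDF cited above):

\[ \operatorname{erfc}(x) \sim \frac{e^{-x^{2}}}{x\sqrt{\pi}} \sum_{n=0}^{\infty} (-1)^{n}\,\frac{(2n-1)!!}{(2x^{2})^{n}} \qquad (x \to \infty), \]

with $(2n-1)!! = 1\cdot 3\cdots(2n-1)$ and the $n=0$ term equal to $1$.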

halting summation

Sometimes books say we truncate an asymptotic series "when it begins to diverge". What does this mean? If you know exactly what this means, could you add it to the article? But if you just have a rough idea, I would appreciate it if you could explain here on the talk page. 99.233.20.151 (talk) 04:18, 24 April 2008 (UTC)

It means "stop summing the series when you get to the smallest term, and do not continue summing after that". Remarkably, this often gives the very best approximation for whatever thing it is that you are trying to compute, and the remaining error is often smaller the smallest term (at which you stopped summing). Although this is "common knowledge" and regularly used in numerical applications, I have to admit I have never seen a proof or discussion that explains why this works. It would be nice to have a reference and a proof sketch for this. linas (talk) 03:42, 31 January 2010 (UTC)[reply]

Two problems with the "formal definition" section

  • Everything is specified to be continuous. Why?
  • "In contrast to a convergent series for , wherein the series converges for any fixed in the limit , one can think of the asymptotic series as converging for fixed in the limit (with possibly infinite)." No problem with the first part, but the second part does not follow from the definition. An additional assumption, such as as for each is needed. It is often true but it doesn't have to be.

McKay (talk) 03:28, 3 March 2023 (UTC)
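A concrete illustration of the second point (my own example, added as a sketch, not part of the comment above): take $L=\infty$ and the scale $\varphi_n(x)=x^{1-n}$, so that $\varphi_0(x)=x$ does not tend to $0$. The function $f(x)=x+1$ then has the perfectly valid asymptotic expansion

\[ f(x) \sim \varphi_0(x) + \varphi_1(x) + 0\cdot\varphi_2(x) + \cdots \qquad (x \to \infty), \]

yet for the fixed truncation $N=0$ the error $f(x)-\varphi_0(x)=1$ is $o(\varphi_0(x))$ but does not tend to $0$ as $x\to\infty$. With the extra assumption that $\varphi_n(x)\to 0$ as $x\to L$ for each $n$, the error after any finite truncation, being $o(\varphi_N(x))$, would indeed vanish in the limit.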