Talk:Minimum mean square error

Article name

I've chosen to name this article minimum mean-square error since it seems to be the most frequent form encountered (on the web). Minimum mean-squared error only gets approximately half the hits when searching the web using Google. Redirects from minimum mean-squared error, minimum mean square error and minimum mean squared error have been added to collect the most common forms of spelling. --Fredrik Orderud 00:42, 19 Apr 2005 (UTC)

I've moved the article from Minimum mean-square error to Minimum mean square error since removing the dash makes the article title consistent with Mean square error, one of the provided references, and my Probability and Statistics textbook. ~MDD4696 13:42, 3 May 2007 (UTC)[reply]

Cleanup needed

I think this article needs a rewrite:

  • MMSE is basically a Bayesian concept, since from a frequentist point of view there is no single minimum MSE estimator. This should be made clear right from the top.
  • Relative efficiency is defined but never used. Is this relevant here?
  • I was unable to understand the meaning or relevance of the discussion in the "Operational Considerations" section. Again, this article should deal only with the Bayesian viewpoint, with maybe a short reference and link to competing frequentist methods like UMVU estimators.
  • The numeric example is nice, but it doesn't explain the underlying concepts. Three important points which can be illustrated with such an example are (a rough sketch of the formulas follows this list):
    • The orthogonality principle
    • The fact that the MMSE estimator is linear in the Gaussian case
    • The general formula for a linear MMSE estimator
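
For reference, here is a rough sketch (in LaTeX, using my own placeholder notation rather than the article's) of the first and third points. With x the quantity to be estimated, y the observed data, μ the means, and C the (cross-)covariance matrices,

    \hat{x}_{\mathrm{LMMSE}} = \mu_X + C_{XY} C_Y^{-1} (y - \mu_Y),
    \qquad
    \mathrm{E}\big[(x - \hat{x}_{\mathrm{LMMSE}})(y - \mu_Y)^{T}\big] = 0 .

The second identity is the orthogonality principle: the error of the optimal linear estimator is uncorrelated with (orthogonal to) the data.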

I'm thinking of making some of these changes at some point in the near future. Any comments/suggestions before I go ahead and do it? --Zvika (talk) 09:05, 8 January 2008 (UTC)[reply]

I have completed the cleanup according to the above points, to the best of my ability and understanding. Any comments are welcome. --Zvika (talk) 19:08, 21 January 2008 (UTC)[reply]

I disagree. It is not just a Bayesian concept. I wrote a book on this a long time ago [1]. There are lots of examples: the simplest is where the coefficient of variation is known or can be estimated. Other examples include Stein estimation and ridge regression.

The statement "The fact that the MMSE estimator is linear in the Gaussian case" surely shows a frequentist perspective? It is, however, incorrect IMHO. The 'proof' in the article relates to 'unbiased' estimators. However, as it relates to Gaussian data, it is not really relevant to the general issue.

Johnbibby (talk) 21:53, 27 February 2008 (UTC)[reply]
Both Stein estimation and ridge regression are frequentist techniques. They are not MMSE, in that they do not achieve the lowest possible MSE. Perhaps our dispute is over wording: the James-Stein estimator is designed to reduce the MSE (compared with LS); it does not bring the MSE to a minimum, though. Indeed, in the frequentist setting you cannot minimize the MSE, because improving the MSE for some values of the unknown parameter will invariably deteriorate performance for other values. So there is no unique minimum MSE.
I do not see why you think that my quote about Gaussianity requires a frequentist perspective. The statement itself is correct and can be more accurately stated as follows: "In the jointly Gaussian case the MMSE is linear in the data" [Kay, Statistical Signal Processing, vol. 1, p. 350]. There is no proof (with or without quotes) in the article, so I'm not really sure what exactly you're referring to. --Zvika (talk) 10:10, 28 February 2008 (UTC)[reply]
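For anyone following the exchange, here is a short sketch of why the quoted statement holds (again in my own placeholder notation, not the article's): if x and y are jointly Gaussian, the conditional mean is

    \mathrm{E}[x \mid y] = \mu_X + C_{XY} C_Y^{-1} (y - \mu_Y),

which is affine (linear plus a constant) in y. Since the unconstrained MMSE estimator is exactly E[x | y], it coincides with the linear MMSE estimator in the jointly Gaussian case; no frequentist assumption enters.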

Probability Contention

The use of terms such as minimum error and unbiased is a hotly contended subject, and

  • It should not, therefore, be removed from this page, so that readers from both viewpoints don't start arguing to no end.
  • The purpose of Operational Considerations was to clarify how each school of thought actually does the integral (see the sketch after this list):
    • Frequentists use the prior distribution of the statistic.
    • Bayesians use the posterior distribution of the parameter.
  • The name has been changed to Bayesian vs. Frequentist Perspectives and the section moved to the end of the article.
  • The discussion mostly followed Jaynes. Frobnitzem (talk) 20:25, 5 February 2008 (UTC)[reply]
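
One way to write out the two bullets above as integrals (my notation, not taken from the article or from Jaynes): the frequentist averages the squared error over the sampling distribution of the data given the parameter, while the Bayesian averages it over the posterior distribution of the parameter given the data,

    \mathrm{MSE}_{\mathrm{freq}}(\hat{\theta}; \theta) = \int \big(\hat{\theta}(y) - \theta\big)^{2} f(y \mid \theta)\, dy ,
    \qquad
    \mathrm{MSE}_{\mathrm{Bayes}}(\hat{x} \mid y) = \int \big(x - \hat{x}(y)\big)^{2} p(x \mid y)\, dx .

The Bayesian integral has a well-defined minimizer, \hat{x}(y) = \mathrm{E}[x \mid y]; the frequentist integral remains a function of the unknown \theta.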

Sorry, but I disagree completely. MVU and MMSE are entirely different concepts, as User:Michael Hardy correctly explained on Talk:Minimum-variance unbiased estimator. Furthermore, I don't think that this assertion is "hotly contended"; if you think otherwise, please provide a more accurate reference. I am not sure what Jaynes you are referring to; if you mean the book Probability Theory: The Logic of Science, then I could not find a reference to MMSE in the index. --Zvika (talk) 20:01, 5 February 2008 (UTC)[reply]

http://omega.albany.edu:8008/ETJ-PS/cc16u.ps pp. 9-10. And yes, MSE is a predominant concept in both camps, so it is not impossible for a frequentist to use the term minimum MSE -- especially in reference to estimator efficiency. Also, the use of the terms is hotly contended, as implied above. Frobnitzem (talk) 20:25, 5 February 2008 (UTC)[reply]

The term MSE is used in both a frequentist and Bayesian context, but means different things. In the frequentist context, the MSE is a function of θ, so it is not possible to talk about a minimum MSE (since one can never minimize the MSE for all θ simultaneously). Thus, in this article, which discusses minimum MSE, there is no possible frequentist interpretation. This does not imply anything about the validity of the frequentist point of view (which is an entirely legitimate point of view); it just says that such a point of view belongs elsewhere (e.g., the article on mean squared error).
If you can provide a reliable source discussing the term "minimum MSE" in a frequentist context, then it should be placed here. Otherwise, I don't see why you think this term is "hotly contended", so I don't see why you have restored the old version.
Finally, the link you placed above does not mention the term MSE at all. --Zvika (talk) 09:03, 6 February 2008 (UTC)[reply]
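For concreteness, a toy numerical illustration of the point that the frequentist MSE is a function of θ (the example is mine, not from the article): when estimating the mean θ of a unit-variance Gaussian from n samples, the sample mean and the trivial estimator "always report 0" each have the smaller MSE for different true values of θ, so neither minimizes the MSE uniformly.

    # Toy example (not from the article): the frequentist MSE depends on the true theta,
    # so no single estimator minimizes it for every theta.
    import numpy as np

    rng = np.random.default_rng(0)
    n, trials = 10, 20000

    for theta in (0.0, 0.5, 2.0):
        y = rng.normal(theta, 1.0, size=(trials, n))
        mse_mean = np.mean((y.mean(axis=1) - theta) ** 2)  # sample-mean estimator, MSE ~ 1/n
        mse_zero = theta ** 2                               # "always 0" estimator, MSE = theta^2
        print(f"theta={theta}: MSE(sample mean)={mse_mean:.3f}, MSE(zero)={mse_zero:.3f}")

Near θ = 0 the trivial estimator has the smaller MSE; away from zero the sample mean does.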
If you want to delete my discussion on the differences between the MSE and MMSE, go right ahead. I need debate on this subject like I need a hole in the head. Frobnitzem (talk) 18:48, 6 February 2008 (UTC)[reply]
Well, if you don't have the patience to write clearly and to cite your sources, you shouldn't be surprised if your edits get reverted. --Zvika (talk) 20:14, 6 February 2008 (UTC)[reply]

External links modified (February 2018)

Hello fellow Wikipedians,

I have just modified one external link on Minimum mean square error. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FAQ for additional information. I made the following changes:

When you have finished reviewing my changes, you may follow the instructions on the template below to fix any issues with the URLs.

This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 18 January 2022).

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—InternetArchiveBot (Report bug) 02:23, 1 February 2018 (UTC)[reply]