Least squares is a mathematical optimization technique that finds a "best fit" to a set of data by minimizing the sum of the squares of the differences (called residuals) between the fitted function and the data.
It is commonly used in curve fitting, and many other optimization problems can also be expressed in least-squares form, for example by minimizing energy or maximizing entropy.
See linear regression and the Gauss–Markov theorem. The Gauss–Markov theorem states that, under its assumptions, the least-squares estimator is optimal in a specific sense: among all linear unbiased estimators, it has the smallest variance.
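The quantity being minimized can be sketched directly. The following Python snippet (the data points and candidate line are made up for illustration) computes the sum of squared residuals for a candidate fitted function:

```python
# Sum of squared residuals: for each data point (x, y), the residual is
# the difference y - f(x) between the data and the fitted function.
def sum_squared_residuals(f, points):
    return sum((y - f(x)) ** 2 for x, y in points)

data = [(0, 1.1), (1, 2.9), (2, 5.2)]   # illustrative data points (x_i, y_i)
line = lambda x: 2 * x + 1              # candidate fit f(x) = 2x + 1
sse = sum_squared_residuals(line, data)
```

Least squares chooses, among all candidate functions of the given form, the one that makes this quantity as small as possible.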
To use the method of least squares, we choose a function f(x) containing some number of unknown constants (for instance f(x) = mx + b, where m and b are not yet known), and find the values of m and b that minimize the sum of the squares of the residuals, that is, the sum of terms of the form (y_i − f(x_i))². We then have the equation of the curve, y = f(x), of the required form that best fits the data points (x_i, y_i).
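For the straight-line case f(x) = mx + b, the minimizing values of m and b have a well-known closed form obtained by setting the partial derivatives of the sum of squared residuals to zero (the "normal equations"). A minimal Python sketch, with illustrative data:

```python
# Closed-form least-squares fit of a line f(x) = m*x + b.
# Solves the normal equations, which minimize sum((y_i - (m*x_i + b))^2).
def fit_line(points):
    n = len(points)
    sx = sum(x for x, _ in points)          # sum of x_i
    sy = sum(y for _, y in points)          # sum of y_i
    sxx = sum(x * x for x, _ in points)     # sum of x_i^2
    sxy = sum(x * y for x, y in points)     # sum of x_i * y_i
    m = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - m * sx) / n
    return m, b

m, b = fit_line([(0, 1.0), (1, 3.1), (2, 4.9), (3, 7.0)])
```

For data lying near the line y = 2x + 1, the fit returns m and b close to 2 and 1; no other choice of m and b gives a smaller sum of squared residuals for these points.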