Least squares is a mathematical optimization technique that finds a "best fit" to a set of data by minimizing the sum of the squares of the differences (called residuals) between the fitted function and the data. It is commonly used in curve fitting. Many other optimization problems can also be expressed in a least-squares form, either by minimizing energy or maximizing entropy. See linear regression and the Gauss-Markov theorem, which says that least-squares estimators are, in a certain sense, optimal.

To use the method of least squares, we choose a function f(x) containing some number of unknown constants (for instance f(x) = mx + b, where m and b are not yet known), and find the values of those constants that minimize the sum of the squares of the residuals, that is, the sum of terms of the form (y_{i} − f(x_{i}))^{2}. The resulting curve y = f(x) is the curve of the required form that best fits the data points (x_{i}, y_{i}).

For linear functions f, see linear least squares. For nonlinear functions, see optimization, the Gauss-Newton algorithm, and the Levenberg-Marquardt algorithm.
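As an illustration of the linear case described above, the following sketch fits f(x) = mx + b to a small set of made-up data points using NumPy's least-squares solver. The data values are invented for demonstration only.

```python
import numpy as np

# Hypothetical sample data points (x_i, y_i); the y values lie
# roughly along the line y = 2x + 1 with small deviations.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

# To fit f(x) = m*x + b, build a design matrix whose columns are
# [x, 1]; then lstsq finds the (m, b) minimizing the sum of the
# squared residuals (y_i - f(x_i))^2.
A = np.column_stack([x, np.ones_like(x)])
(m, b), *_ = np.linalg.lstsq(A, y, rcond=None)

print(f"m = {m:.3f}, b = {b:.3f}")
```

For this data the fitted slope comes out near 2 and the intercept near 1, matching the line the points were drawn around.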
External links
 http://www.physics.csbsju.edu/stats/least_squares.html
 http://www.zunzun.com
 http://www.orbitals.com/self/least/least.htm
