
# Statistical method
*Figure: An example of statistics used in educational assessment, comparing various grading methods in a normal distribution: standard deviations, cumulative percentages, percentile equivalents, Z-scores, T-scores, and stanines.*

Key concepts and terms of statistics presuppose probability theory; among these terms are population, sample, sampling, sampling unit, and probability.

Once data have been collected, whether through a formal sampling procedure, by recording responses to treatments in an experimental setting (cf. experimental design), or by repeatedly observing a process over time (a time series), graphical and numerical summaries may be obtained using descriptive statistics.

Patterns in the data are modeled to draw inferences about the larger population, using inferential statistics to account for randomness and uncertainty in the observations. These inferences may take the form of answers to essentially yes/no questions (hypothesis testing), estimates of numerical characteristics (estimation), predictions of future observations, descriptions of association (correlation), or models of relationships (regression).

The framework described above is sometimes referred to as applied statistics. In contrast, mathematical statistics (or simply statistical theory) is the subdiscipline of applied mathematics that uses probability theory and mathematical analysis to place statistical practice on a firm theoretical basis.

The word statistics is also the plural of statistic (singular), which refers to the result of applying a statistical algorithm to a set of data.

## Statistical methods

### Experimental and observational studies

A common goal of a statistical research project is to investigate causality, and in particular to draw a conclusion about the effect of changes in the values of predictors (independent variables) on a response (dependent variable). There are two major types of causal statistical studies: experimental studies and observational studies. In both types, the effect of changes in an independent variable (or variables) on the behavior of the dependent variable is observed; the difference between the two types lies in how the study is actually conducted.

An experimental study involves taking measurements of the system under study, manipulating the system, and then taking additional measurements using the same procedure to determine whether the manipulation has modified the values of the measurements. In contrast, an observational study does not involve experimental manipulation; instead, data are gathered and correlations between predictors and the response are investigated.

An example of an experimental study is the famous series of Hawthorne studies, which attempted to test changes to the working environment at the Hawthorne plant of the Western Electric Company. The researchers were interested in whether increased illumination would increase the productivity of the assembly-line workers. They first measured productivity in the plant, then modified the illumination in an area of the plant to see whether the change would affect productivity. Due to flaws in the experimental procedure, in particular the lack of a control group, the researchers could not establish what they had planned to, but the study did give the world the Hawthorne effect.

An example of an observational study is one that explores the correlation between smoking and lung cancer. This type of study typically uses a survey to collect observations about the area of interest and then performs statistical analysis. In this case, the researchers would collect observations of both smokers and non-smokers and then compare the number of cases of lung cancer in each group.

The basic steps for an experiment are to:

1. plan the research including determining information sources, research subject selection, and ethical considerations for the proposed research and method,
2. design the experiment concentrating on the system model and the interaction of independent and dependent variables,
3. summarize a collection of observations, highlighting their commonalities and suppressing details (descriptive statistics),
4. reach consensus about what the observations tell us about the world we observe (statistical inference),
5. document and present the results of the study.
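Step 3 above can be illustrated with a minimal descriptive-statistics summary; the observations below are invented purely for illustration:

```python
from statistics import mean, median, stdev

# Hypothetical observations from a small experiment.
observations = [4.1, 4.4, 3.9, 4.7, 4.2, 4.0]

# Descriptive statistics suppress detail, reducing the raw
# observations to a few summary numbers.
summary = {
    "n": len(observations),
    "mean": round(mean(observations), 3),
    "median": round(median(observations), 3),
    "sd": round(stdev(observations), 3),
}
print(summary)
```

The summary (count, mean, median, sample standard deviation) conveys the sample's location and spread without listing every observation.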


### Levels of measurement

There are four types of measurement scales used in statistics: nominal, ordinal, interval, and ratio. They have different degrees of usefulness in statistical research. Ratio measurements, in which both a meaningful zero value and the distances between different measurements are defined, provide the greatest flexibility in the statistical methods that can be used for analysing the data. Interval measurements have meaningful distances between measurements but no meaningful zero value (as with IQ scores or temperature measurements in degrees Celsius). Ordinal measurements have imprecise differences between consecutive values but a meaningful order to those values. Nominal measurements have no meaningful rank order among values.
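As a rough sketch (with made-up data) of which summary statistics are meaningful at each level:

```python
from statistics import mean, median, mode

# Hypothetical data at each level of measurement.
eye_color = ["brown", "blue", "brown", "green"]   # nominal: only the mode is meaningful
satisfaction = [1, 3, 2, 3, 2]                    # ordinal: median and mode, but not mean
celsius = [20.0, 22.5, 19.0]                      # interval: mean is meaningful, ratios are not
weight_kg = [61.2, 75.0, 68.4]                    # ratio: all of the above, plus ratios

print(mode(eye_color))               # modal category
print(median(satisfaction))          # middle rank
print(mean(celsius))                 # meaningful average
print(weight_kg[1] / weight_kg[0])   # "1.23x heavier" is meaningful only for ratio data
```

Saying 22.5 °C is "warmer than" 20 °C is valid (interval order and distance), but saying it is "12.5% hotter" is not, since the Celsius zero is arbitrary; with weights, the ratio comparison is valid.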

### Statistical techniques

Some well-known statistical tests and procedures for research observations are:

- Student's t-test
- chi-square test
- analysis of variance (ANOVA)
- Mann–Whitney U test
- regression analysis
- correlation
- Pearson product-moment correlation coefficient
- Spearman's rank correlation coefficient
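As one concrete example from the techniques above, the Pearson product-moment correlation coefficient can be computed in a few lines of pure Python (the sample data are invented):

```python
from statistics import mean, pstdev

def pearson_r(x, y):
    """Pearson product-moment correlation of two equal-length samples."""
    if len(x) != len(y) or len(x) < 2:
        raise ValueError("need two samples of equal length >= 2")
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / len(x)
    return cov / (pstdev(x) * pstdev(y))

# A perfectly linear relationship gives r = 1; scatter pulls r toward 0.
print(round(pearson_r([1, 2, 3, 4], [2, 4, 6, 8]), 6))
print(round(pearson_r([1, 2, 3, 4], [8, 2, 6, 4]), 6))
```

In practice one would use a statistical package rather than hand-rolling this, but the formula (covariance divided by the product of the standard deviations) is exactly what those packages compute.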

## Probability

Statistics makes extensive use of the concept of probability. The probability of an event is often defined as a number between zero and one. In reality, however, virtually nothing has a probability of exactly 1 or 0. You could say that the sun will certainly rise in the morning, but what if an extremely unlikely event destroys it? What if there is a nuclear war and the sky is covered in ash and smoke?

We often round such probabilities up or down because the events are so likely, or so unlikely, that it is easier to treat them as having a probability of one or zero.

However, this rounding can lead to misunderstandings and dangerous behaviour, because people may be unable to distinguish between, say, a probability of 10⁻⁴ and a probability of 10⁻⁹, despite the very practical difference between them. If you expect to cross the road about 10⁵ or 10⁶ times in your life, then reducing your risk of being run over to 10⁻⁹ per crossing makes it unlikely that you will ever be run over, while a risk of 10⁻⁴ per crossing makes an accident very likely, despite the intuitive feeling that 0.01% is a very small risk.
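The arithmetic behind that comparison can be checked directly: assuming independent crossings, the chance of at least one accident in n crossings at per-crossing risk p is 1 − (1 − p)ⁿ.

```python
def lifetime_risk(p, n):
    """Probability of at least one accident in n independent crossings,
    each with per-crossing risk p."""
    return 1 - (1 - p) ** n

# One million crossings at a per-crossing risk of 1e-9: still unlikely (~0.1%).
print(lifetime_risk(1e-9, 10**6))
# One million crossings at 1e-4 ("only 0.01%"): an accident is all but certain.
print(lifetime_risk(1e-4, 10**6))
```

The five orders of magnitude between the per-crossing risks translate into the difference between a negligible lifetime risk and a near-certain one.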

Use of prior probabilities of 0 (or 1) causes problems in Bayesian statistics, since the posterior probability is then forced to be 0 (or 1) as well; in other words, the data are not taken into account at all. As Dennis Lindley puts it, if a coherent Bayesian attaches a prior probability of zero to the hypothesis that the Moon is made of green cheese, then even whole armies of astronauts returning with green cheese cannot convince him. Lindley advocates never using prior probabilities of 0 or 1. He calls this Cromwell's rule, after a letter Oliver Cromwell wrote to the synod of the Church of Scotland on 5 August 1650, in which he said: "I beseech you, in the bowels of Christ, consider it possible that you are mistaken."
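Cromwell's rule can be demonstrated with a one-line Bayes update; the likelihood numbers below are invented purely for illustration:

```python
def posterior(prior, lik_if_true, lik_if_false):
    """Posterior P(H | data) from Bayes' theorem for a binary hypothesis H."""
    evidence = prior * lik_if_true + (1 - prior) * lik_if_false
    return prior * lik_if_true / evidence

# Data 999x likelier under H than under not-H:
print(posterior(0.5, 0.999, 0.001))  # an open-minded prior is moved to ~0.999
print(posterior(0.0, 0.999, 0.001))  # a prior of exactly 0 stays exactly 0
```

However strong the evidence, the posterior is the prior multiplied by the likelihood and renormalized, so a prior of exactly zero can never be revised upward.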

## Important contributors to statistics

See also the list of statisticians.

- Carl Friedrich Gauss
- Blaise Pascal
- Sir Francis Galton
- William Sealy Gosset ("Student")
- Karl Pearson
- Sir Ronald Fisher
- Gertrude Mary Cox
- Charles Spearman
- Pafnuty Chebyshev
- Aleksandr Lyapunov
- Sir Isaac Newton
- Abraham de Moivre
- Florence Nightingale
- John Tukey
- George Dantzig
- Thomas Bayes

## Specialized disciplines

Some sciences use applied statistics so extensively that they have specialized terminology. These disciplines include:

- Biostatistics
- Business statistics
- Data mining
- Econometrics
- Engineering statistics
- Statistical physics
- Demography
- Psychological statistics
- Social statistics
- Chemometrics (statistics in analytical chemistry and chemical engineering)
- Baseball statistics and cricket statistics

Statistics is also a key tool in business and manufacturing. It is used to understand variability in measurement systems, to control processes (as in statistical process control, or SPC), to summarize data, and to make data-driven decisions. In these roles it is a key tool, and perhaps the only reliable tool.

## Software

Modern statistics relies on computers to perform some of the very large and complex calculations it requires.

Whole branches of statistics have been made possible by computing, for example neural networks.

The computer revolution has implications for the future of statistics, with a new emphasis on 'experimental' and 'empirical' statistics.

Statistical packages in common use include:

Open source / freeware:

- R (programming language)
- Statistics Online Computational Resource (UCLA)
- Mondrian (software for exploratory data analysis)
- GNU Octave
- GNU PSPP
- OpenOffice.org Calc
- Gnumeric
- ROOT

Proprietary:

- Microsoft Excel

## See also

- Analysis of variance (ANOVA)
- Extreme value theory
- Instrumental variables estimation
- Multivariate statistics
- Statistical significance testing
- Regression analysis
- Statistical packages
- Statistical associations and societies
- National statistical services
- List of statisticians
