Data Analysis

Data analysis is the act of transforming data with the aim of extracting useful information and facilitating conclusions. Depending on the type of data and the question being asked, this may include applying statistical methods, fitting curves, or selecting or discarding subsets of the data according to specific criteria. In contrast to data mining, data analysis is usually understood more narrowly: the aim is not to discover unforeseen patterns hidden in the data, but to verify or disprove an existing model, or to extract the parameters needed to adapt a theoretical model to (experimental) reality.
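As a concrete illustration of the curve-fitting step mentioned above, the following sketch fits a straight line to a handful of noisy observations by ordinary least squares. The data points are hypothetical, and the closed-form estimator is used only to keep the example self-contained.

```python
# Minimal illustration of curve fitting as a data-analysis step:
# fit a straight line y = a*x + b to noisy observations using the
# closed-form ordinary least-squares estimates.

def fit_line(xs, ys):
    """Return (slope, intercept) of the least-squares line."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Hypothetical measurements scattered around y = 2x + 1.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.1, 2.9, 5.2, 6.8, 9.1]
slope, intercept = fit_line(xs, ys)
```

The fitted slope and intercept are the "parameters necessary to adapt a theoretical model to reality" in the sense used above; in practice a library routine with uncertainty estimates would be preferred.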


Applications in various fields

Data analysis assumes different aspects, and possibly different names, in different fields.


Nuclear and particle physics

In nuclear and particle physics, the data usually originate from the experimental apparatus via a data acquisition system. They are then processed, in a step usually called data reduction, to apply calibrations and to extract physically significant information. Especially in large particle physics experiments, data reduction is most often an automatic, batch-mode operation carried out by purpose-written software. The resulting data n-tuples are then scrutinized by the physicists, using specialized software tools such as ROOT or PAW, to compare the results of the experiment with theory.
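A hypothetical sketch of the data-reduction step described above: raw detector readings (ADC counts) are converted to calibrated energies, and events failing a quality cut are discarded. The calibration constants and threshold are invented for the example.

```python
# Hypothetical batch "data reduction" pass: raw ADC counts are
# converted to calibrated energies, and low-energy events are cut.

CALIB_GAIN = 0.005      # MeV per ADC count (assumed calibration constant)
CALIB_OFFSET = 0.1      # MeV pedestal offset (assumed)
ENERGY_THRESHOLD = 1.0  # keep only events above 1 MeV

def reduce_events(raw_counts):
    """Calibrate raw ADC counts and apply a simple selection cut."""
    calibrated = [CALIB_GAIN * c + CALIB_OFFSET for c in raw_counts]
    return [e for e in calibrated if e > ENERGY_THRESHOLD]

raw = [150, 400, 900, 50, 2500]
events = reduce_events(raw)
```

Real reductions apply channel-by-channel calibrations and many selection criteria, but the shape of the operation (calibrate, then cut) is the same.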


Theoretical models are often difficult to compare directly with experimental results, so they are instead used as input to Monte Carlo simulation software such as Geant4, which predicts the detector's response to a given theoretical event. The simulated events produced in this way are then compared with the experimental data.
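The workflow can be sketched in miniature (this is not Geant4, just an illustration of the idea): a theoretical model predicts a true value, the detector response is modelled as Gaussian smearing, and the simulated distribution is what gets compared with the data.

```python
import random

# Toy Monte Carlo in the spirit described above: a theoretical model
# predicts a true energy; the detector response is modelled as Gaussian
# smearing with an assumed resolution. The simulated event sample can
# then be compared with experimental data.

random.seed(42)

TRUE_ENERGY = 10.0   # theoretical prediction (arbitrary units, assumed)
RESOLUTION = 0.5     # assumed Gaussian detector resolution

def simulate_events(n):
    """Generate n simulated detector responses to the theoretical event."""
    return [random.gauss(TRUE_ENERGY, RESOLUTION) for _ in range(n)]

sim = simulate_events(10_000)
sim_mean = sum(sim) / len(sim)
# With enough simulated events, the sample mean approaches the
# theoretical prediction, while the spread reflects the resolution.
```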


See also: Computational physics.


Software tools for Physics Data Analysis

  • AIDA – a set of defined interfaces and formats for representing common data analysis objects
  • CINT – a command-line C/C++ interpreter included in the object-oriented histogramming package ROOT; it can also be used as a standalone interpreter in other programs
  • HippoDraw – an object-oriented statistical data analysis package written in C++, with a Qt-based GUI and a Python-scriptable interface
  • Java Analysis Studio (JAS) – an object-oriented data analysis package developed for the analysis of particle physics data
  • MATLAB – a numerical computing environment and programming language
  • Physics Analysis Workstation (PAW) – an interactive, scriptable tool for analysis and graphical presentation of data in high energy physics
  • ROOT – an object-oriented software package developed at CERN, originally designed for particle physics data analysis but also used in fields such as astronomy and data mining
  • R – a programming language and software environment for statistical computing and graphics

Social sciences

Qualitative data analysis (QDA) is the analysis of non-numerical data, for example words, photographs, and observations. It is associated with qualitative research, one of the two major approaches to research methodology in the social sciences.
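Even qualitative material is often summarized computationally. One very simple aid, sketched below with an invented transcript and code list, is counting how often analyst-chosen "codes" (keywords) appear in interview text.

```python
from collections import Counter

# A simple computational aid to qualitative data analysis: counting
# occurrences of analyst-chosen "codes" (keywords) in transcript text.
# The codes and transcript below are hypothetical.

CODES = {"cost", "trust", "time"}

def code_frequencies(transcript):
    """Count how often each code word appears in the transcript."""
    words = transcript.lower().replace(",", " ").replace(".", " ").split()
    return dict(Counter(w for w in words if w in CODES))

text = ("Trust matters more than cost. Cost and time were mentioned, "
        "but trust came up again.")
freqs = code_frequencies(text)
```

Counting is only scaffolding for qualitative work; the interpretation of why those themes recur remains a human task.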


Information technology

A special case is data analysis in information technology audits.
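One screening technique sometimes used in audit data analysis is a first-digit test: the leading digits of transaction amounts are compared with the proportions predicted by Benford's law, log10(1 + 1/d). The sample amounts below are hypothetical; large deviations would only flag records for closer review, not prove anything.

```python
import math
from collections import Counter

# Benford's-law screen: compare the observed leading-digit proportions
# of transaction amounts with log10(1 + 1/d) for d = 1..9.
# The amounts below are hypothetical sample data.

def first_digit(x):
    """Return the first significant digit of a positive number."""
    s = str(abs(x)).lstrip("0.")
    return int(s[0])

def benford_deviation(amounts):
    """Max absolute gap between observed and Benford proportions."""
    counts = Counter(first_digit(a) for a in amounts)
    n = len(amounts)
    return max(
        abs(counts.get(d, 0) / n - math.log10(1 + 1 / d))
        for d in range(1, 10)
    )

amounts = [120.5, 13.0, 1800, 2400, 310, 95, 47, 1.2, 16, 22]
dev = benford_deviation(amounts)
```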


Business

See:

  • Analytics – the branch of logic dealing with analysis
  • Business intelligence (BI) – applications and technologies used to gather, provide access to, and analyze data and information about company operations
  • Data mining – sorting through large amounts of data to extract implicit, previously unknown, and potentially useful information

Software tools for Business Data Analysis

  • ACL – data analytics technology for computer-aided audit tools and techniques (CAATTs), one of the fastest-growing fields within the audit profession
  • Crystal Reports – a business intelligence application used to design and generate reports from a wide range of data sources
  • Hyperion – business performance management software from Hyperion Solutions Corporation, targeted at the business intelligence and business performance management market
  • SAS – an integrated system of software products from SAS Institute covering data entry, retrieval, management, and mining; report writing and graphics; statistical and mathematical analysis; and business planning, forecasting, and decision support
  • JMP – a statistical analysis program first developed by John Sall
  • RapidMiner (formerly YALE) – an environment for machine learning and data mining experiments
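A routine business data-analysis task underlying all of these tools is aggregation: rolling transactional records up into summaries per category. The plain-Python sketch below uses hypothetical records; the tools above perform the same operation at much larger scale.

```python
from collections import defaultdict

# Aggregate hypothetical transaction records into revenue per region,
# the kind of grouped summary a BI or reporting tool produces.

def revenue_by_region(transactions):
    """Sum transaction amounts grouped by region."""
    totals = defaultdict(float)
    for region, amount in transactions:
        totals[region] += amount
    return dict(totals)

transactions = [
    ("EMEA", 1200.0),
    ("APAC", 800.0),
    ("EMEA", 300.0),
    ("AMER", 950.0),
]
summary = revenue_by_region(transactions)
```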


See also

  • Censoring (statistics) – when the value of an observation is only partially known
  • Data acquisition – sampling the real world to generate data that can be manipulated by a computer
  • Data governance – the people, processes, and procedures required to create a consistent, enterprise-wide view of an organization's data
  • Data mining – extracting implicit, previously unknown, and potentially useful information from large data sets
  • Exploratory data analysis (EDA) – the part of statistical practice concerned with reviewing, communicating, and using data where little is known about its cause system
  • Predictive analytics – techniques from statistics and data mining that use current and historical data to make predictions about future events
  • Qualitative research – one of the two major approaches to research methodology in the social sciences
  • Scientific computing – constructing mathematical models and numerical solution techniques, and using computers to analyze and solve scientific and engineering problems
  • Test method – a definitive procedure that produces a test result

Further reading

  • Michael S. Lewis-Beck, Data Analysis: An Introduction, Sage Publications, 1995, ISBN 0803957726
  • Pyzdek, T., Quality Engineering Handbook, 2003, ISBN 0824746147
  • Godfrey, A. B., Juran's Quality Handbook, 1999, ISBN 007034003
  • Engineering Statistics Handbook, NIST/SEMATECH, [1]
  • Manual on Presentation of Data and Control Chart Analysis, ASTM MNL 7, 1990, ISBN 0-8031-1189-0

The Wikipedia article included on this page is licensed under the GFDL.