Core Dump

A core dump is the recorded state of the working memory of a computer program at a specific time, generally when the program has terminated abnormally (crashed).[1] In practice, other key pieces of program state are usually dumped at the same time, including the processor registers, which may include the program counter and stack pointer, memory management information, and other processor and operating system flags and information. The name comes from the once-standard memory technology, core memory. Core dumps are often used to diagnose or debug errors in computer programs.


On many operating systems, a fatal error in a program automatically triggers a core dump, and by extension the phrase "to dump core" has come to mean, in many cases, any fatal error, regardless of whether a record of the program memory is created.


The term is used in jargon to indicate any circumstance where large amounts of unedited data are deposited for further examination.

Background

Before the advent of disk operating systems and the ability to record large data files, core dumps were paper printouts of the contents of memory, typically arranged in columns of octal or hexadecimal numbers (the latter sometimes called a "hex dump"), together with interpretations of various encodings such as machine language instructions, text strings, or decimal or floating-point numbers. In more recent operating systems, a "core dump" is a file containing the memory image of a particular process, or the memory images of parts of the address space of that process, along with other information such as the values of processor registers. These files can be viewed in a readable text format, similar to the older paper printouts, using the proper tools such as objdump.


Causes of core dumps

A core dump is often a useful tool for a programmer seeking to isolate and identify an error in a computer program. In high-level programming languages, compilers usually generate programs with correct underlying instructions, and crashes more often arise from logical errors such as accesses to non-existent memory. In practice these are often buffer overflows, where a programmer allocates too little memory for incoming or computed data, or dereferences of null pointers, a common coding error that occurs when an unassigned memory reference is used.
Manual dumps may be triggered with gcore <pid> or by sending a signal whose default action is to dump core, such as kill -3 <pid> (SIGQUIT); for a Java virtual machine, which handles SIGQUIT itself, kill -3 produces a thread dump instead.


Uses of core dumps

Core dumps are a useful debugging aid in several situations. On early standalone or batch-processing systems, core dumps allowed a user to debug a program without monopolizing the (very expensive) computing facility; a printout was also more convenient than debugging with front-panel switches and lights. On shared computers, whether time-sharing, batch-processing, or server systems, core dumps allow off-line debugging of the operating system, so that the system can be back in operation immediately. Core dumps allow a user to save a crash for later or off-site analysis, or for comparison with other crashes. For embedded computers, it may be impractical to support debugging on the computer itself, so a dump can be taken for analysis on a different computer. Some operating systems (such as early versions of Unix) did not support attaching debuggers to running processes, so core dumps were necessary to run a debugger on a process's memory contents. Core dumps can also capture data freed during dynamic memory allocation, and may thus be used to retrieve information from a program that has exited or been closed. In the absence of an interactive debugger, an assiduous programmer can determine the error from direct examination of the core dump.


A core dump represents the complete contents of the dumped regions of the address space of the dumped process. Depending on the operating system, the dump may contain few or no data structures to aid interpretation of the memory regions. In these systems, successful interpretation requires that the program or user trying to interpret the dump understands the structure of the program's memory use.


A debugger can use a symbol table, if one exists, to help the programmer interpret dumps, identifying variables symbolically and displaying source code; if the symbol table is not available, less interpretation of the dump is possible, but it may still be enough to determine the cause of the problem. There are also special-purpose tools called dump analyzers. One popular tool, available on almost all operating systems, is the GNU Binutils objdump.


On modern Unix-like operating systems, core dump files can be read with the GNU Binutils Binary File Descriptor (BFD) library, and with the GNU Debugger (gdb) and objdump, which use this library. The library supplies the raw data for a given address in a memory region from a core dump; it knows nothing about variables or data structures in that memory region, so the application using the library must determine the addresses of variables and the layout of data structures itself, for example by using the symbol table of the program being debugged.


Core dumps can be used to save the context (state) of a process at a given point in time, so that execution can be resumed later. Highly available systems can be made by transferring core state between processors, sometimes via core-dump files themselves.


Format of core dump files

In older and simpler operating systems, a process's address space was contiguous, so a core dump file was simply a binary file with the sequence of bytes or words. In modern operating systems, a process address space may have gaps, and share pages with other processes or files, so more elaborate representations are used; they may also include other information about the state of the program at the time of the dump.


In Unix-like systems, core dumps generally use the standard executable image format: a.out in older versions of Unix; ELF in modern Linux, System V, Solaris, and BSD systems; Mach-O in Mac OS X; and so on.


Uses in culture

The term is sometimes used on Usenet for a posting that describes what has been happening in the poster's life, especially if it involves emotional stress; the implication is that the material has not been edited or analyzed. See also brain dump.


Core dumping is also used to describe a method of test taking in which the test taker writes memorized equations, dates, or other information on the back of a test when it is first handed out, so as not to forget it during the test. This behavior often accompanies cramming.


References

  1. ^ UNIX 'core' man page.

See also

  • Savestate

External links

  • "Setting the core dump name schema"




The Wikipedia article included on this page is licensed under the GFDL.