
Linear operator

In mathematics, a linear transformation (also called linear operator or linear map) is a function between two vector spaces that respects the operations of addition and scalar multiplication defined on those spaces; in other words, it "preserves linear combinations".


Definition and first consequences

Formally, if V and W are vector spaces over the same ground field K, we say that f : V → W is a linear transformation if for any two vectors x and y in V and any scalar a in K, we have

f(x + y) = f(x) + f(y)   (additivity)
f(ax) = a f(x)           (homogeneity).

This is equivalent to saying that f "preserves linear combinations", i.e., for any vectors x1, ..., xm and scalars a1, ..., am, we have

f(a1x1 + ... + amxm) = a1 f(x1) + ... + am f(xm).

Occasionally, V and W can be considered as vector spaces over different ground fields, and it is then important to specify which field was used for the definition of "linear". If V and W are considered as spaces over the field K as above, we talk about K-linear maps. For example, the conjugation of complex numbers is an R-linear map C → C, but it is not C-linear.
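The conjugation example can be checked numerically. The following sketch (an illustration added here, not from the article) verifies additivity and real homogeneity, and shows that homogeneity fails for the complex scalar i:

```python
# Complex conjugation is additive and commutes with *real* scalars,
# but not with complex ones, so it is R-linear but not C-linear.
z, w = complex(1, 2), complex(3, -1)

# additivity holds
assert (z + w).conjugate() == z.conjugate() + w.conjugate()

# homogeneity holds for a real scalar a ...
a = 2.5
assert (a * z).conjugate() == a * z.conjugate()

# ... but fails for the complex scalar i
c = complex(0, 1)
print((c * z).conjugate() == c * z.conjugate())  # False
```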


Examples

  • An example of a linear transformation from R3 to R4 is the function defined by
f((x1,x2,x3)) = (x1,2x3,x2,x2)
  • The integral yields a linear map from the space of all real-valued integrable functions on some interval to R
  • Differentiation is a linear transformation from the space of all differentiable functions to the space of all functions.
  • If V and W are finite-dimensional vector spaces over the field F, then functions that map linear transformations f : V → W to dimF(W)-by-dimF(V) matrices in the way described in the sequel are themselves linear transformations.
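The first example in the list above can be tested directly. This sketch (added for illustration) implements f((x1, x2, x3)) = (x1, 2x3, x2, x2) and checks the additivity and homogeneity conditions on sample vectors:

```python
import numpy as np

# The article's example map f : R^3 -> R^4
def f(x):
    x1, x2, x3 = x
    return np.array([x1, 2 * x3, x2, x2])

x = np.array([1.0, 2.0, 3.0])
y = np.array([-1.0, 0.5, 4.0])
a = 3.0

# additivity and homogeneity
assert np.allclose(f(x + y), f(x) + f(y))
assert np.allclose(f(a * x), a * f(x))
```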

Matrices

If V and W are finite dimensional and bases have been chosen, then every linear transformation from V to W can be represented as a matrix; this is useful because it allows concrete calculations. Conversely, matrices yield examples of linear transformations: if A is a real m-by-n matrix, then the rule f(x) = Ax describes a linear transformation Rn → Rm (see Euclidean space).
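As a concrete sketch (added here; the matrix below is the one the earlier example map f((x1, x2, x3)) = (x1, 2x3, x2, x2) has with respect to the standard bases), the rule f(x) = Ax reduces applying f to a matrix-vector product:

```python
import numpy as np

# Matrix of the example map R^3 -> R^4 in standard bases (assumed)
A = np.array([[1, 0, 0],
              [0, 0, 2],
              [0, 1, 0],
              [0, 1, 0]], dtype=float)

x = np.array([1.0, 2.0, 3.0])
print(A @ x)  # [1. 6. 2. 2.]  -- matches (x1, 2*x3, x2, x2)
```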


Let {v1, ..., vn} be a basis for V. Then every vector v in V is uniquely determined by the coefficients c1, ..., cn in

v = c1v1 + ... + cnvn.

If f : V → W is a linear transformation,

f(v) = f(c1v1 + ... + cnvn) = c1 f(v1) + ... + cn f(vn),

which implies that the function f is entirely determined by the values of f(v1), ..., f(vn).

Now let {w1, ..., wm} be a basis for W. Then we can represent the values of each f(vj) as

f(vj) = a1,j w1 + ... + am,j wm.

So the function f is entirely determined by the values of ai,j.


If we put these values into an m-by-n matrix M, then we can conveniently use it to compute the value of f for any vector in V: if we place the coefficients c1, ..., cn of v in an n-by-1 matrix C, then the product MC is the coordinate vector of f(v) with respect to the chosen basis of W.


Note that there can be multiple matrices representing a single linear transformation, because the entries of the matrix depend on the bases that are chosen. Similarly, given a matrix, we also need to know which bases it uses in order to determine what linear transformation it represents.
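The construction above can be sketched in code. With the standard bases (an assumption made here for illustration), the j-th column of M is simply the coordinate vector of f(vj), so M can be assembled by applying f to each basis vector:

```python
import numpy as np

# The article's example map f : R^3 -> R^4
def f(x):
    x1, x2, x3 = x
    return np.array([x1, 2 * x3, x2, x2])

# Standard basis of R^3; column j of M holds the coordinates of f(v_j)
basis = np.eye(3)
M = np.column_stack([f(e) for e in basis])

# M C reproduces f(v) in coordinates, for any v
v = np.array([2.0, -1.0, 0.5])
assert np.allclose(M @ v, f(v))
```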


Forming new linear transformations from given ones

The composition of linear transformations is linear: if f : V → W and g : W → Z are linear, then so is g ∘ f : V → Z.


If f1 : V → W and f2 : V → W are linear, then so is their sum f1 + f2 (which is defined by (f1 + f2)(x) = f1(x) + f2(x)).


If f : V → W is linear and a is an element of the ground field K, then the map af, defined by (af)(x) = a(f(x)), is also linear.


In the finite dimensional case and if bases have been chosen, then the composition of linear maps corresponds to the multiplication of matrices, the addition of linear maps corresponds to the addition of matrices, and the multiplication of linear maps with scalars corresponds to the multiplication of matrices with scalars.
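These correspondences can be verified numerically. The sketch below (assumed random matrices standing in for f and g in fixed bases) checks that composition, sum, and scalar multiple of maps match the product, sum, and scalar multiple of their matrices:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))   # matrix of f : R^4 -> R^3 (bases fixed)
B = rng.standard_normal((2, 3))   # matrix of g : R^3 -> R^2

x = rng.standard_normal(4)

# composition  <->  matrix product
assert np.allclose(B @ (A @ x), (B @ A) @ x)
# sum of maps  <->  matrix sum (illustrated with f + f)
assert np.allclose(A @ x + A @ x, (A + A) @ x)
# scalar multiple of a map  <->  scalar multiple of its matrix
assert np.allclose(2.0 * (A @ x), (2.0 * A) @ x)
```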


Endomorphisms and automorphisms

A linear transformation f : V → V is an endomorphism of V; the set of all such endomorphisms End(V) together with addition, composition and scalar multiplication as defined above forms an associative algebra with identity element over the field K (and in particular a ring). The identity element of this algebra is the identity map id : V → V.


A bijective endomorphism of V is called an automorphism of V. The composition of two automorphisms is again an automorphism, and the set of all automorphisms of V forms a group, the automorphism group of V which is denoted by Aut(V) or GL(V).


If V has finite dimension n, then End(V) is isomorphic to the associative algebra of all n-by-n matrices with entries in K. The automorphism group of V is isomorphic to the general linear group GL(n, K) of all n-by-n invertible matrices with entries in K.


Kernel and image

If f : V → W is linear, we define the kernel and the image of f by

ker(f) = { x in V : f(x) = 0 }
im(f) = { f(x) : x in V }.

ker(f) is a subspace of V and im(f) is a subspace of W. The following dimension formula is often useful (but note that it only applies if V is finite dimensional):

dim(ker(f)) + dim(im(f)) = dim(V).
The number dim(im(f)) is also called the rank of f and written as rk(f). If V and W are finite dimensional, bases have been chosen and f is represented by the matrix A, then the rank of f is equal to the rank of the matrix A. The dimension of the kernel is known as the nullity of f (or of the matrix A).
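The dimension formula can be checked on the earlier example map. In this sketch (the matrix is assumed to represent f((x1, x2, x3)) = (x1, 2x3, x2, x2) in standard bases), rank and nullity sum to dim(V) = 3:

```python
import numpy as np

# Matrix of the example map R^3 -> R^4 in standard bases (assumed)
A = np.array([[1, 0, 0],
              [0, 0, 2],
              [0, 1, 0],
              [0, 1, 0]], dtype=float)

rank = np.linalg.matrix_rank(A)   # dim(im(f))
nullity = A.shape[1] - rank       # dim(ker(f)) by the dimension formula
print(rank, nullity)              # 3 0  ->  rank + nullity = dim(V) = 3
```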


The equation f(x) = a is called homogeneous if a = 0, and inhomogeneous otherwise. If c is one solution (the so-called "particular solution"), then the set of all solutions consists of the vectors that can be written as c plus a solution of the corresponding homogeneous equation (i.e. c plus an element of the kernel). This applies, for example, to systems of linear equations, linear recurrence relations, and (systems of) linear differential equations; see e.g. the damped, driven harmonic oscillator.
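This solution structure can be illustrated with a small linear system. In the sketch below (a made-up example: A has rank 1, so its kernel is one-dimensional), every vector of the form "particular solution plus kernel element" solves Ax = b:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [2.0, 2.0]])    # rank 1; kernel spanned by (1, -1)
b = np.array([3.0, 6.0])

c = np.array([3.0, 0.0])      # one particular solution: A c = b
k = np.array([1.0, -1.0])     # kernel element: A k = 0

# c + t*k solves the inhomogeneous equation for every scalar t
for t in (0.0, 1.5, -2.0):
    assert np.allclose(A @ (c + t * k), b)
```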


See also

Topics in mathematics related to linear algebra


Vectors | Vector spaces | Linear span | Linear transformation | Linear independence | Linear combination | Basis | Column space | Row space | Dual space | Orthogonality | Eigenvector | Eigenvalue | Least squares regressions | Outer product | Cross product | Dot product | Transpose | Matrix decomposition


The Wikipedia article included on this page is licensed under the GFDL.