Linear algebra
Linear algebra is the branch of mathematics concerned with the study of vectors, vector spaces (also called linear spaces), linear maps (also called linear transformations), and systems of linear equations. Vector spaces are a central theme in modern mathematics; thus, linear algebra is widely used in both abstract algebra and functional analysis. Linear algebra also has a concrete representation in analytic geometry and it is generalized in operator theory. It has extensive applications in the natural sciences and the social sciences, since nonlinear models can often be approximated by linear ones.
Francis Galton initiated the use of correlation coefficients in 1888. Often more than one random variable is in play, and the variables may be cross-correlated. In the statistical analysis of multivariate random variables, the correlation matrix is a natural tool. Thus, the statistical study of such random vectors helped establish matrix usage.
History
The history of modern linear algebra dates back to the early 1840s. In 1843, William Rowan Hamilton introduced quaternions, which describe mechanics in three-dimensional space. In 1844, Hermann Grassmann published his book Die lineale Ausdehnungslehre (see References). Arthur Cayley introduced matrices, one of the most fundamental linear algebraic ideas, in 1857. Despite these early developments, linear algebra was developed primarily in the twentieth century. Matrices were poorly defined before the development of ring theory within abstract algebra. With the coming of special relativity, many practitioners gained appreciation of the subtleties of linear algebra. Furthermore, the routine application of Cramer's rule to solve partial differential equations led to the inclusion of linear algebra in standard university coursework. E.T. Copson wrote, for instance,
When I went to Edinburgh as a young lecturer in 1922, I was surprised to find how different the curriculum was from that at Oxford. It included topics such as Lebesgue integration, matrix theory, numerical analysis, Riemannian geometry, of which I knew nothing...  
—E.T. Copson, Preface to Partial Differential Equations, 1973 
Elementary introduction
Linear algebra had its beginnings in the study of vectors in Cartesian 2-space and 3-space. A vector, here, is a directed line segment, characterized by both its magnitude (represented by length) and its direction. Vectors can be used to represent physical entities such as forces, and they can be added to each other and multiplied by scalars, thus forming the first example of a real vector space. Modern linear algebra has been extended to consider spaces of arbitrary or infinite dimension. A vector space of dimension n is called an n-space. Most of the useful results from 2- and 3-space can be extended to these higher-dimensional spaces. Although many people cannot easily visualize vectors in n-space, such vectors or n-tuples are useful in representing data. Since vectors, as n-tuples, are ordered lists of n components, it is possible to summarize and manipulate data efficiently in this framework. For example, in economics, one can use 8-dimensional vectors or 8-tuples to represent the Gross National Product of 8 countries. One can display the GNP of 8 countries for a particular year, where the countries' order is specified, for example, (United States, United Kingdom, France, Germany, Spain, India, Japan, Australia), by using a vector (v_{1}, v_{2}, v_{3}, v_{4}, v_{5}, v_{6}, v_{7}, v_{8}) where each country's GNP is in its respective position.
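The 8-tuple idea can be sketched in a few lines of code. The GNP figures below are purely illustrative placeholders, and the componentwise operations show why such tuples behave as vectors:

```python
# Hypothetical GNP figures, one entry per country in a fixed order
# (the numbers are placeholders, not real data).
countries = ("United States", "United Kingdom", "France", "Germany",
             "Spain", "India", "Japan", "Australia")
gnp = (13.8, 2.9, 2.7, 3.4, 1.4, 1.2, 4.9, 1.0)

# Vector addition and scalar multiplication act componentwise:
def vec_add(u, v):
    return tuple(a + b for a, b in zip(u, v))

def scalar_mul(c, v):
    return tuple(c * a for a in v)

# Scaling every entry by 3% and adding it back gives next year's
# figures under a uniform-growth assumption:
next_year = vec_add(gnp, scalar_mul(0.03, gnp))
```

These two operations are exactly the vector-space structure: ordered position carries the meaning, and arithmetic is done entry by entry.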
A vector space (or linear space), as a purely abstract concept about which theorems are proved, is part of abstract algebra, and is well integrated into this discipline. Some striking examples of this are the group of invertible linear maps or matrices, and the ring of linear maps of a vector space. Linear algebra also plays an important part in analysis, notably, in the description of higher order derivatives in vector analysis and the study of tensor products and alternating maps.
In this abstract setting, the scalars with which an element of a vector space can be multiplied need not be numbers. The only requirement is that the scalars form a mathematical structure called a field. In applications, this field is usually the field of real numbers or the field of complex numbers. Linear maps take elements from one linear space to another (or to itself), in a manner compatible with the addition and scalar multiplication given on the vector space(s). The set of all such transformations is itself a vector space. If a basis for a vector space is fixed, every linear transformation can be represented by a table of numbers called a matrix. The detailed study of the properties of, and algorithms acting on, matrices, including determinants and eigenvectors, is considered part of linear algebra.
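Fixing the standard basis of the plane, the matrix picture can be sketched with plain lists (no libraries assumed; `mat_vec` is a hypothetical helper name):

```python
def mat_vec(A, x):
    """Apply a linear map, given as a matrix A (list of rows), to a vector x."""
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

# Rotation of the plane by 90 degrees, written in the standard basis:
R = [[0, -1],
     [1,  0]]

image = mat_vec(R, [1, 0])  # the first basis vector rotates to [0, 1]

# Linearity: the map respects addition of vectors.
x, y = [1, 2], [3, 4]
lhs = mat_vec(R, [a + b for a, b in zip(x, y)])
rhs = [a + b for a, b in zip(mat_vec(R, x), mat_vec(R, y))]
assert lhs == rhs
```

The assertion at the end is the compatibility condition from the paragraph above, checked numerically for one pair of vectors.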
One can say quite simply that the linear problems of mathematics, those that exhibit linearity in their behavior, are those most likely to be solved. For example, differential calculus does a great deal with linear approximation to functions. The difference from nonlinear problems is very important in practice.
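As a small illustration of linear approximation (a sketch using only the standard `math` module; `linearize` is a hypothetical helper name):

```python
import math

def linearize(f, fprime, a):
    """Return the tangent-line approximation L(x) = f(a) + f'(a) * (x - a)."""
    return lambda x: f(a) + fprime(a) * (x - a)

# Approximate the square root near a = 100, where f'(x) = 1 / (2 * sqrt(x)):
approx_sqrt = linearize(math.sqrt, lambda x: 0.5 / math.sqrt(x), a=100.0)
estimate = approx_sqrt(101.0)  # 10.05, close to math.sqrt(101) ≈ 10.0499
```

Replacing a nonlinear function by its tangent line in this way turns a hard question into a linear one, which is the pattern the paragraph above describes.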
The general method of finding a linear way to look at a problem, expressing this in terms of linear algebra, and solving it, if need be by matrix calculations, is one of the most generally applicable in mathematics.
Some useful theorems
 Every vector space has a basis.^{[1]}
 Any two bases of the same vector space have the same cardinality; equivalently, the dimension of a vector space is well-defined.
 A matrix is invertible if and only if its determinant is nonzero.
 A matrix is invertible if and only if the linear map represented by the matrix is an isomorphism.
 If a square matrix has a left inverse or a right inverse then it is invertible (see invertible matrix for other equivalent statements).
 A symmetric (or Hermitian) matrix is positive semidefinite if and only if each of its eigenvalues is greater than or equal to zero.
 A symmetric (or Hermitian) matrix is positive definite if and only if each of its eigenvalues is greater than zero.
 The spectral theorem (regarding diagonalizable matrices).
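The 2×2 case of the determinant criterion can be checked directly with the standard cofactor formula; `det2` and `inv2` are hypothetical helper names:

```python
def det2(A):
    (a, b), (c, d) = A
    return a * d - b * c

def inv2(A):
    """Explicit inverse of a 2x2 matrix; fails exactly when det(A) == 0."""
    (a, b), (c, d) = A
    det = a * d - b * c
    if det == 0:
        raise ValueError("matrix is singular: determinant is zero")
    return [[d / det, -b / det],
            [-c / det, a / det]]

A = [[2, 1],
     [5, 3]]
assert det2(A) == 1                              # nonzero, so A is invertible
assert inv2(A) == [[3.0, -1.0], [-5.0, 2.0]]     # the explicit inverse
```

A singular example such as [[1, 2], [2, 4]] has determinant zero, and `inv2` raises an error for it, matching the theorem's "only if" direction in this small case.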
Generalisation and related topics
Since linear algebra is a successful theory, its methods have been developed in other parts of mathematics. In module theory, one replaces the field of scalars by a ring. In multilinear algebra, one deals with the 'several variables' problem of mappings linear in each of a number of different variables, inevitably leading to the tensor concept. In the spectral theory of operators, control of infinite-dimensional matrices is gained by applying mathematical analysis in a theory that is not purely algebraic. In all these cases the technical difficulties are much greater.
See also
 List of linear algebra topics
 Important publications in linear algebra
 Numerical linear algebra
Note
1. ^ The existence of a basis is straightforward for finitely generated vector spaces, but in full generality it is logically equivalent to the axiom of choice.
References
History
 Fearnley-Sander, Desmond, Hermann Grassmann and the Creation of Linear Algebra, American Mathematical Monthly 86 (1979), pp. 809–817.
 Grassmann, Hermann, Die lineale Ausdehnungslehre ein neuer Zweig der Mathematik: dargestellt und durch Anwendungen auf die übrigen Zweige der Mathematik, wie auch auf die Statik, Mechanik, die Lehre vom Magnetismus und die Krystallonomie erläutert, O. Wigand, Leipzig, 1844.
External links
 MIT Linear Algebra Lectures: free videos from MIT OpenCourseWare
 Streaming MIT Linear Algebra Lectures at Google Video
 Linear Algebra Toolkit.
 Linear Algebra Workbench: multiply and invert matrices, solve systems, eigenvalues etc.
 Linear Algebra on MathWorld.
 Linear Algebra overview and notation summary on PlanetMath.
 Matrix and Linear Algebra Terms on Earliest Known Uses of Some of the Words of Mathematics
 Linear Algebra by Elmer G. Wiens. Interactive web pages for vectors, matrices, linear equations, etc.
 Linear Algebra Solved Problems: Interactive forums for discussion of linear algebra problems, from the lowest up to the hardest level (Putnam).
 Linear Algebra for Informatics. José Figueroa-O'Farrill, University of Edinburgh
 Linear Algebra by Jim Hefferon: A free textbook with exercises and a solutions guide written by a professor at Saint Michael's College.
 Online Notes / Linear Algebra Paul Dawkins, Lamar University
 Elementary Linear Algebra textbook with solutions
 Linear Algebra book by Prof. Robert A. Beezer of the University of Puget Sound: http://linear.ups.edu/download/fclaelectric1.20.pdf
Topics related to linear algebra
Vectors • Vector spaces • Linear span • Linear transformation • Linear independence • Linear combination • Basis • Column space • Row space • Dual space • Orthogonality • Rank • Minor • Eigenvector • Eigenvalue • Least squares regression • Outer product • Cross product • Dot product • Transpose • Matrix decomposition